Identifying Cut-In Vehicles by Fusing Radar and Vision Data for Truck Platooning Safety 2020-01-0102
Truck platooning systems (Level 2) extend radar-, camera-, and vehicle-to-vehicle-communication-based cooperative adaptive cruise control to provide precise automated lateral and longitudinal vehicle control, maintaining a tight formation of vehicles with short following distances. A manually driven truck leads the platoon, while the steering, acceleration, and braking of the following truck(s) are automatically controlled, with the driver(s) engaged and monitoring system performance and the driving environment at all times. Level 1 truck platooning has already demonstrated the potential for significant fuel savings, enhanced mobility, and associated emissions reductions. Level 2 automation may increase these benefits while reducing driver workload and improving safety.
The objective of this work is to address fundamental problems that arise when vehicles in the lanes adjacent to the platoon maneuver to change lanes and cut into the gap between the platooning trucks. The sensing systems on the platooning trucks must identify and estimate the state of these "cut-in" vehicles in real time so that suitable control actions can be taken. The current truck platooning system developed at the Texas Transportation Institute uses a forward-looking scanning radar (Delphi), a forward-looking camera (Mobileye), and Dedicated Short-Range Communications (DSRC) information for situational awareness. In this work, we develop a sensor fusion algorithm that combines radar, camera, and inertial data to provide real-time state estimates of any cut-in vehicle present in the field of view of these sensors. These algorithms can significantly increase the situational awareness of the platooning system and improve the safety and performance of the overall system.
Mengke Liu, Sivakumar Rathinam, Mike Lukuc, Swaminathan Gopalswamy
Texas A&M University, College Station
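As an illustration of the kind of fusion the abstract describes, the sketch below runs a minimal one-dimensional constant-velocity Kalman filter on a cut-in vehicle's longitudinal gap, fusing a radar range/range-rate measurement with a camera range measurement. This is an assumption-laden sketch, not the paper's algorithm: the state model, the scalar update structure, and all noise parameters are invented for illustration, since the abstract gives no implementation detail.

```python
# Minimal sketch: 1-D constant-velocity Kalman filter fusing radar and
# camera range measurements of a cut-in vehicle. All models and noise
# values are illustrative assumptions, not the paper's actual method.

def predict(x, P, dt, q=1.0):
    """Propagate state x = [gap, gap_rate] and 2x2 covariance P
    through a constant-velocity model with white-noise acceleration q."""
    p, v = x
    x = [p + v * dt, v]
    p00, p01, p10, p11 = P[0][0], P[0][1], P[1][0], P[1][1]
    # P <- F P F^T + Q, with F = [[1, dt], [0, 1]]
    P = [[p00 + dt * (p01 + p10) + dt * dt * p11 + q * dt**3 / 3,
          p01 + dt * p11 + q * dt**2 / 2],
         [p10 + dt * p11 + q * dt**2 / 2,
          p11 + q * dt]]
    return x, P

def update_position(x, P, z, r):
    """Scalar update with a range measurement (radar or camera), H = [1, 0]."""
    y = z - x[0]                        # innovation
    s = P[0][0] + r                     # innovation variance
    k0, k1 = P[0][0] / s, P[1][0] / s   # Kalman gain
    x = [x[0] + k0 * y, x[1] + k1 * y]
    P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
         [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
    return x, P

def update_velocity(x, P, z, r):
    """Scalar update with a range-rate measurement (radar Doppler), H = [0, 1]."""
    y = z - x[1]
    s = P[1][1] + r
    k0, k1 = P[0][1] / s, P[1][1] / s
    x = [x[0] + k0 * y, x[1] + k1 * y]
    P = [[P[0][0] - k0 * P[1][0], P[0][1] - k0 * P[1][1]],
         [(1 - k1) * P[1][0], (1 - k1) * P[1][1]]]
    return x, P
```

A typical cycle would predict once per time step and then apply one scalar update per available sensor reading, e.g. a radar range (low noise), a radar range-rate, and a camera range (higher noise), each with its own measurement variance `r`. The scalar-update form avoids matrix inversion entirely, which is one common choice for embedded automotive targets.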