A HiL Test Bench for Monocular Vision Sensors and its Applications in Camera-Only AEBs 2019-01-0881
Advanced Driver Assistance Systems (ADAS) use a class of environmental sensors, e.g., LiDAR, radar, and cameras, to obtain information about surrounding traffic objects. By processing this information in real time, the control algorithm at the ADAS decision-making level makes decisions according to different warning, assistance, and/or motion-planning strategies. The execution layer then responds to the target commands to implement the various ADAS functions.
Theoretically, a vision sensor alone, usually mounted on the interior of the front windshield of a vehicle, can satisfy the sensing requirements of many ADAS applications. Among the various types of vision sensors, the monocular vision sensor has become the first choice for most ADAS products due to its lower cost, higher reliability, and easier installation and calibration.
This paper describes the design and establishment of a Hardware-in-the-Loop (HiL) test bench for monocular vision sensors. The virtual test scenarios are built in the CarSim software. Animated images matching the mounting position and orientation of the vision sensor are rendered on a host computer and projected at actual size onto a 120-degree circular screen via three high-definition projectors. By capturing the screen, the monocular vision sensor detects objects in the images and transmits the instance-level information over the Controller Area Network (CAN) bus to a real-time target, where the models of the vehicle dynamics and the ADAS controllers are executed under real-time constraints. A Graphical User Interface (GUI) running on the host computer is created to connect, control, and monitor the real-time target. The animated scene displayed by the host computer is updated according to the calculated vehicle states received from the real-time target, thus closing the loop.
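The closed-loop data flow described above (projected scene → camera detection → CAN → real-time controller and vehicle dynamics → updated scene) can be sketched as a single simulation cycle. All names below (`ObjectTrack`, `hil_step`) are hypothetical placeholders for illustration, not the actual CarSim or CAN APIs used on the bench; the time-to-collision trigger and deceleration values are likewise illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class ObjectTrack:
    """Instance-level detection reported by the vision sensor over CAN."""
    rel_distance_m: float   # longitudinal distance to the lead object
    rel_speed_mps: float    # relative speed (negative when closing in)

def hil_step(ego_speed_mps: float, lead_distance_m: float,
             lead_rel_speed_mps: float, dt: float = 0.01):
    """One 10 ms cycle: sensor -> controller -> vehicle dynamics -> scene."""
    # 1) The camera observes the projected scene and reports a track.
    track = ObjectTrack(lead_distance_m, lead_rel_speed_mps)

    # 2) The controller on the real-time target decides a deceleration
    #    from time-to-collision (placeholder threshold and authority).
    ttc = (track.rel_distance_m / -track.rel_speed_mps
           if track.rel_speed_mps < 0 else float("inf"))
    decel = 6.0 if ttc < 1.5 else 0.0

    # 3) The vehicle-dynamics model integrates the braking command.
    ego_speed_mps = max(0.0, ego_speed_mps - decel * dt)
    lead_distance_m += track.rel_speed_mps * dt

    # 4) The host would redraw the projected animation with these states.
    return ego_speed_mps, lead_distance_m, ttc
```

Iterating `hil_step` at the bench's frame rate reproduces the closed loop: each new ego state changes the projected scene, which changes what the physical camera detects on the next cycle.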
The Autonomous Emergency Braking system (AEBs) is an active-safety ADAS technology that can prevent or mitigate a crash. Based on the aforementioned HiL test bench, this paper studies the motion-sensing performance and characteristics of a monocular vision sensor in the vehicle longitudinal (X) direction while a camera-only AEBs performs emergency braking. Following the C-NCAP test procedure (2018 edition), AEB tests in the Car-to-Car Rear (CCR) scenario are conducted. Moreover, an "ideal" AEBs control algorithm composed of two execution phases, i.e., high-speed braking (at 30 km/h or above) and low-speed braking (below 30 km/h), is implemented on the real-time target, taking the object-detection information from the vision sensor as its input. By comparing and analyzing the differences between the measured data of the physical sensor and the ground truth from the CarSim software, the motion-sensing performance and characteristics of the monocular vision sensor are obtained, together with a matching and optimization solution for performance enhancement of the camera-only AEBs.
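The two-phase structure described above, which switches braking behavior at the 30 km/h boundary, can be sketched as a simple decision rule. This is a minimal illustration only: the trigger threshold and deceleration magnitudes are assumed values, not the calibrated parameters of the paper's "ideal" controller.

```python
def aeb_decel_command(ego_speed_kph: float, ttc_s: float,
                      ttc_trigger_s: float = 1.6) -> float:
    """Return a requested deceleration (m/s^2) from ego speed and
    time-to-collision, using the two-phase split at 30 km/h."""
    if ttc_s >= ttc_trigger_s:
        return 0.0      # no imminent collision: no braking requested
    if ego_speed_kph >= 30.0:
        return 9.0      # high-speed phase: near-maximum braking authority
    return 5.0          # low-speed phase: moderate braking
```

In a CCR test run, such a rule would be evaluated every controller cycle against the distance and relative speed reported by the vision sensor, so any sensing error in those quantities directly shifts the braking onset.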
Pan Song, Rui Fang, Bolin Gao, Dongchao Wei