Data-driven Object Detection Confidence Model for ADAS 2020-01-0695
The majority of road accidents are caused by human error. Advanced Driver Assistance Systems (ADAS) have the potential to reduce human error and thereby improve driving safety and comfort. Object detection is a critical task for the ADAS perception system. On one hand, false negatives can cause accidents; on the other hand, false positives can result in ghost braking and degrade the driving experience. Different sensors, such as radar and camera, are typically combined to achieve higher robustness and accuracy. However, using large amounts of information from different sources to build a confidence model that determines the validity of detected objects has not been widely studied in the literature.
In this paper, we propose a data-driven method that combines various kinds of information, such as radar reflection power and camera detection quality, to produce a unified confidence model. In addition, different regions around the ego vehicle typically have different detection-error requirements depending on the ADAS function; therefore, different confidence thresholds are applied based on the region of interest. The proposed method was validated with real-world driving data and showed higher performance in the regions of interest.
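The abstract's idea of fusing sensor evidence into one confidence score and gating it with region-dependent thresholds can be illustrated as follows. This is a minimal sketch, not the paper's actual model: the features, logistic form, weights, region names, and threshold values below are all hypothetical assumptions for illustration.

```python
import math

# Hypothetical learned weights for a logistic model that fuses radar
# and camera evidence for one detected object (illustration only).
WEIGHTS = {"radar_power": 0.8, "camera_quality": 1.2, "bias": -1.5}

# Region-dependent thresholds: regions in the ego vehicle's path demand
# low false-negative rates, so they accept detections at lower confidence;
# far-side regions tolerate misses but must avoid ghost braking.
REGION_THRESHOLDS = {"in_path": 0.3, "adjacent_lane": 0.5, "far_side": 0.7}

def confidence(radar_power, camera_quality):
    """Map fused sensor evidence to a confidence score in [0, 1]."""
    z = (WEIGHTS["radar_power"] * radar_power
         + WEIGHTS["camera_quality"] * camera_quality
         + WEIGHTS["bias"])
    return 1.0 / (1.0 + math.exp(-z))

def is_valid(radar_power, camera_quality, region):
    """Accept the object only if its confidence clears the region's threshold."""
    return confidence(radar_power, camera_quality) >= REGION_THRESHOLDS[region]
```

A strong detection in the ego path (high radar power and camera quality) passes the permissive in-path threshold, while the same weak detection that would pass in-path can be rejected in a far-side region, reflecting the asymmetric cost of false positives there.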
Hang Yang, Darui Zhang, Daihan Wang, Jianguang Zhou
Dong Feng Engineering & Technical Center