Search Results

Viewing 1 to 2 of 2
Technical Paper

Lane Detection under Low-illumination Condition by Enhanced Feature Learning

2022-12-22
2022-01-7102
In fusion-based vehicle positioning, lane detection provides the relative position of the ego-vehicle and the lanes, which is critical to subsequent tasks including trajectory planning and behaviour decision-making. However, the performance of current vision-based lane detectors drops significantly under adverse visual conditions, e.g., low-illumination conditions such as night scenarios. Images captured in these scenarios often suffer from low contrast, low brightness and noise, which makes it challenging for detectors to extract correct information. To facilitate lane detection in low-illumination conditions, this paper presents a novel framework that integrates image feature enhancement with lane detection. The framework consists of two modules: an image enhancement module that enhances and extracts information from low-visibility images, and a detection module that regresses the lane parameters. Both modules are optimized jointly through loss collaboration.
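
To make the two-module idea concrete, the following is a minimal sketch, assuming a PyTorch implementation, an illustrative parametric lane representation (a few polynomial coefficients per lane), and a simple weighted sum of an enhancement loss and a regression loss standing in for the paper's loss collaboration. The module architectures, loss terms and hyperparameters here are assumptions for illustration, not the authors' actual design.

```python
# Sketch: enhancement module feeding a lane-parameter regression head,
# trained end to end with one combined (collaborative) loss.
import torch
import torch.nn as nn

class EnhancementModule(nn.Module):
    """Lightweight convolutional module that brightens/denoises the input image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 3, 3, padding=1), nn.Sigmoid(),  # enhanced RGB in [0, 1]
        )

    def forward(self, x):
        return self.net(x)

class LaneDetectionModule(nn.Module):
    """Regresses lane parameters (here: 4 cubic coefficients for 2 lanes, assumed)."""
    def __init__(self, num_lanes=2, num_coeffs=4):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(64, num_lanes * num_coeffs)

    def forward(self, x):
        return self.head(self.backbone(x))

# Joint optimization: both modules share one optimizer, and the total loss
# couples enhancement quality with lane-regression accuracy.
enhancer, detector = EnhancementModule(), LaneDetectionModule()
optimizer = torch.optim.Adam(
    list(enhancer.parameters()) + list(detector.parameters()), lr=1e-3
)

low_light = torch.rand(4, 3, 128, 256)   # dummy low-illumination batch
reference = torch.rand(4, 3, 128, 256)   # dummy well-lit reference images
lane_gt = torch.randn(4, 8)              # dummy lane-parameter targets

optimizer.zero_grad()
enhanced = enhancer(low_light)
pred = detector(enhanced)
loss = nn.functional.l1_loss(enhanced, reference) + nn.functional.mse_loss(pred, lane_gt)
loss.backward()
optimizer.step()
```

Because the enhancement loss and the regression loss are backpropagated together, the enhancement module is encouraged to produce images that are not only visually cleaner but also easier for the detection head to interpret, which is the motivation for coupling the two modules rather than training them separately.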
Technical Paper

Radar and Smart Camera Based Data Fusion for Multiple Vehicle Tracking System in Autonomous Driving

2022-03-31
2022-01-7019
In advanced driver assistance systems (ADAS) and autonomous driving systems (ADS), robust and reliable perception of the environment, especially detecting and tracking surrounding vehicles, is a prerequisite for collision warning and collision avoidance. In this paper, a post-fusion tracking approach is presented that combines front-view radar observations with front smart camera information. The approach improves the accuracy of the tracking system to support ADAS or ADS functions such as adaptive cruise control (ACC) and autonomous emergency braking (AEB). The paper describes the state estimation algorithm and the data association within the fusion architecture. Furthermore, the fusion architecture is tested and validated in a real highway driving scenario.
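
The following is a minimal sketch of such a post-fusion pipeline, assuming a constant-velocity Kalman filter for state estimation and greedy nearest-neighbour gating for data association. The sensor noise values, gate threshold, update rate and detection format are illustrative assumptions and are not taken from the paper.

```python
# Sketch: radar and camera position detections are fused at the track level.
# Each track runs a constant-velocity Kalman filter; detections from each
# sensor are associated to tracks by nearest-neighbour gating.
import numpy as np

DT = 0.05                               # update period [s] (assumed 20 Hz)
F = np.array([[1, 0, DT, 0],            # constant-velocity state transition
              [0, 1, 0, DT],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)
H = np.array([[1, 0, 0, 0],             # both sensors measure (x, y) position
              [0, 1, 0, 0]], dtype=float)
Q = np.eye(4) * 0.1                     # process noise (assumed)
R_RADAR = np.diag([0.5, 2.0])           # radar: good range, poorer lateral accuracy
R_CAMERA = np.diag([2.0, 0.3])          # camera: good lateral, poorer range accuracy
GATE = 9.0                              # association gate (assumed)

class Track:
    def __init__(self, xy):
        self.x = np.array([xy[0], xy[1], 0.0, 0.0])   # state: [x, y, vx, vy]
        self.P = np.eye(4) * 10.0                     # state covariance

    def predict(self):
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    def update(self, z, R):
        S = H @ self.P @ H.T + R                      # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(4) - K @ H) @ self.P

    def gating_distance(self, z, R):
        S = H @ self.P @ H.T + R
        v = z - H @ self.x
        return float(v @ np.linalg.inv(S) @ v)        # squared Mahalanobis distance

def associate_and_update(tracks, detections, R):
    """Greedy nearest-neighbour association of one sensor's detections."""
    for z in detections:
        if not tracks:
            tracks.append(Track(z))
            continue
        d = [t.gating_distance(z, R) for t in tracks]
        i = int(np.argmin(d))
        if d[i] < GATE:
            tracks[i].update(z, R)
        else:
            tracks.append(Track(z))                   # start a new track

# One fusion cycle with dummy detections from both sensors.
tracks = []
radar_dets = [np.array([30.0, 1.2]), np.array([55.0, -3.4])]
camera_dets = [np.array([30.4, 1.0])]
for t in tracks:
    t.predict()
associate_and_update(tracks, radar_dets, R_RADAR)
associate_and_update(tracks, camera_dets, R_CAMERA)
print([t.x[:2] for t in tracks])
```

In this kind of post-fusion setup, each sensor contributes its own measurement noise model, so the filter naturally weights the radar more heavily in the longitudinal direction and the camera more heavily in the lateral direction.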