Lane Detection under Low-illumination Condition by Enhanced Feature Learning (SAE Technical Paper 2022-01-7102)
In fusion-based vehicle positioning, lane detection provides the relative position between the ego-vehicle and the lanes, which is critical to subsequent tasks such as trajectory planning and behavior decision-making. However, the performance of current vision-based lane detectors drops significantly under adverse visual conditions, e.g., low-illumination conditions such as night scenes. Images captured in these scenarios often suffer from low contrast, low brightness, and noise, which makes it challenging for detectors to extract correct information. To facilitate lane detection in low-illumination conditions, this paper presents a novel framework that integrates image feature enhancement with lane detection. The framework consists of two modules: an image enhancement module that enhances and extracts information from low-visibility images, and a detection module that regresses the lane parameters. Both modules are jointly optimized through loss collaboration. Experiments on a popular lane detection benchmark show that the proposed method achieves significant and consistent improvements. With our framework, the performance of two SOTA detectors increases by 1.32% and 0.67%, respectively, in low-illumination conditions.
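To make the two-module design and the loss collaboration concrete, the following is a minimal PyTorch-style sketch. The module architectures, the loss terms (L1 enhancement loss, smooth-L1 lane regression), and the 0.5 loss weight are illustrative assumptions, not the paper's actual implementation; the sketch only shows how both modules can receive gradients from a shared weighted loss.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins: the paper does not specify these architectures.
class EnhancementModule(nn.Module):
    """Illustrative module that enhances low-illumination image features."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 3, kernel_size=3, padding=1),
        )

    def forward(self, x):
        # Residual enhancement: predict a correction added to the dark input.
        return torch.clamp(x + self.net(x), 0.0, 1.0)


class LaneDetector(nn.Module):
    """Illustrative head that regresses a fixed number of lane parameters."""
    def __init__(self, num_params=8):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, num_params)

    def forward(self, x):
        return self.head(self.backbone(x))


# Joint ("loss collaboration") training step: both modules are updated
# from a weighted sum of an enhancement loss and a detection loss.
enhancer, detector = EnhancementModule(), LaneDetector()
optimizer = torch.optim.Adam(
    list(enhancer.parameters()) + list(detector.parameters()), lr=1e-4
)

low_light = torch.rand(2, 3, 256, 256)   # dark input images (dummy data)
well_lit = torch.rand(2, 3, 256, 256)    # reference images (dummy data)
lane_gt = torch.rand(2, 8)               # ground-truth lane parameters (dummy data)

enhanced = enhancer(low_light)
pred = detector(enhanced)
loss_enh = nn.functional.l1_loss(enhanced, well_lit)     # enhancement term
loss_det = nn.functional.smooth_l1_loss(pred, lane_gt)   # lane regression term
loss = loss_det + 0.5 * loss_enh                         # weight is illustrative
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Because the detection loss backpropagates through the enhancement module, the enhancement is steered toward features useful for lane detection rather than toward visual quality alone, which is the intuition behind optimizing both modules together.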
Citation: Zhao, W., Tian, W., Lu, C., and Yu, X., "Lane Detection under Low-illumination Condition by Enhanced Feature Learning," SAE Technical Paper 2022-01-7102, 2022, https://doi.org/10.4271/2022-01-7102.