Search Results

Viewing 1 to 4 of 4
Technical Paper

Robust Sensor Fused Object Detection Using Convolutional Neural Networks for Autonomous Vehicles

2020-04-14
2020-01-0100
Environmental perception is considered an essential module for autonomous driving and Advanced Driver Assistance Systems (ADAS). Recently, deep Convolutional Neural Networks (CNNs) with many different architectures have become the state of the art in a variety of object detection problems. However, the performance of existing CNNs drops when detecting small objects at large distances. To deploy an environmental perception system in real-world applications, the system must achieve high accuracy regardless of object size, distance, and weather conditions. In this paper, a robust sensor-fused object detection system is proposed that exploits the complementary advantages of vision and automotive radar sensors. The proposed system consists of three major components: 1) a Coordinate Conversion module, 2) a Multi-Level Sensor Fusion Detection (MSFD) system, and 3) a Temporal Correlation filtering module.
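A coordinate conversion module of this kind maps radar returns into the camera image so the two sensors can be fused per object. A minimal sketch of such a projection, assuming a hypothetical pinhole calibration (the `K`, `R`, and `t` values below are illustrative, not taken from the paper):

```python
import numpy as np

# Hypothetical calibration (illustrative values, not from the paper):
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])   # camera intrinsic matrix
R = np.eye(3)                           # radar-to-camera rotation
t = np.array([0.0, 1.2, 0.0])           # radar-to-camera translation (m)

def radar_to_pixel(point_radar):
    """Project a 3-D radar return into image pixel coordinates (u, v)."""
    p_cam = R @ np.asarray(point_radar, dtype=float) + t  # radar -> camera frame
    uvw = K @ p_cam                                       # camera -> image plane
    return uvw[:2] / uvw[2]                               # perspective divide

# A target 20 m ahead and 2 m to the right of the radar:
u, v = radar_to_pixel([2.0, 0.0, 20.0])
```

Once a radar return lands on a pixel, it can be associated with the CNN's detection boxes around that location.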
Technical Paper

A Robust Failure Proof Driver Drowsiness Detection System Estimating Blink and Yawn

2020-04-14
2020-01-1030
Many fatal automobile accidents can be attributed to fatigued and distracted driving. Driver Monitoring Systems alert distracted drivers by raising alarms. Most image-based driver drowsiness detection systems struggle to achieve failure-proof performance in real-time applications: failures in detecting the face and other key facial parts (eyes, nose, and mouth) cause the system to miss blink and yawn detections in some frames. In this paper, a real-time, robust, and failure-proof driver drowsiness detection system is proposed. The proposed system deploys a set of detectors that sequentially detect the face, blinking, and yawning. A robust Multi-Task Convolutional Neural Network (MTCNN) with face-alignment capability is used for face detection; it attained 97% recall on the real-time driving dataset collected for this work. The detected face is passed to an ensemble of regression trees to locate the 68 facial landmarks.
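Blink detection from 68-point landmarks is commonly done with the eye aspect ratio (EAR) over the six per-eye landmarks. The abstract does not give the paper's exact blink criterion, so the threshold below is a hypothetical illustration:

```python
import math

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks around one eye, ordered as in the
    68-point model: outer corner, two top points, inner corner, two bottom points."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])  # lid-to-lid distances
    horizontal = dist(eye[0], eye[3])                       # corner-to-corner width
    return vertical / (2.0 * horizontal)

EAR_BLINK_THRESHOLD = 0.2  # assumed value, not taken from the paper

def is_blinking(eye):
    """EAR collapses toward zero when the eye closes."""
    return eye_aspect_ratio(eye) < EAR_BLINK_THRESHOLD
```

Counting consecutive frames with a low EAR distinguishes blinks from prolonged eye closure, which is the stronger drowsiness signal.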
Journal Article

Lane Line Detection by LiDAR Intensity Value Interpolation

2019-10-22
2019-01-2607
Lane marks are an important aspect of autonomous driving: autonomous vehicles rely on lane mark information to determine a safe and legal path. In this paper, an approach is presented that estimates lane lines on straight or slightly curved roads using a LiDAR unit. By comparing the difference in elevation between LiDAR channels, a drivable region is defined. The approach differs from previous LiDAR lane line detection methods by reducing the drivable region from three dimensions to two, examining only the x-y trace. In addition, potential lane markings are extracted by filtering a range of intensity values, as opposed to the traditional approach of comparing neighboring intensity values. Further, by calculating the standard deviation of the potential lane markings along the y-axis, the data can be refined to specific points of interest.
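The intensity-band filtering and lateral-spread refinement described above can be sketched as follows; the band limits and spread threshold are assumed values for illustration, not the paper's:

```python
import numpy as np

def extract_lane_candidates(points, i_min=0.7, i_max=1.0):
    """Keep x-y points whose LiDAR intensity falls inside the lane-paint band.
    points: (N, 3) array of (x, y, intensity); band limits are assumed values."""
    mask = (points[:, 2] >= i_min) & (points[:, 2] <= i_max)
    return points[mask, :2]

def refine_by_lateral_spread(candidates, max_std=0.15):
    """Accept a candidate cluster only if its spread along the y-axis is
    narrow enough to be a painted lane line; otherwise discard it."""
    if candidates.shape[0] and np.std(candidates[:, 1]) <= max_std:
        return candidates
    return np.empty((0, 2))
```

Retroreflective paint returns noticeably higher intensity than asphalt, which is what makes a simple band filter viable inside the flattened drivable region.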
Technical Paper

Sensor-Fused Low Light Pedestrian Detection System with Transfer Learning

2024-04-09
2024-01-2043
Object detection using a camera sensor is essential for developing Advanced Driver Assistance Systems (ADAS) and Autonomous Driving (AD) vehicles. Thanks to recent advances in deep Convolutional Neural Networks (CNNs), CNN-based object detection achieves state-of-the-art performance during the daytime. However, relying on an RGB camera alone under poor lighting conditions, such as sun flare, snow, or foggy nights, causes the system's performance to drop and increases the likelihood of a crash. RGB-based detection also performs poorly at nighttime because camera sensors are susceptible to lighting conditions. This paper surveys pedestrian detection systems for low-light conditions and proposes a sensor-fused pedestrian detection system for low-light conditions, including nighttime. The proposed system fuses RGB and infrared (IR) thermal camera information.
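The abstract does not spell out the fusion architecture; one common late-fusion scheme, sketched here as an assumption, matches RGB and thermal detections by intersection-over-union and boosts confidence where both sensors agree:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def fuse_detections(rgb_dets, ir_dets, iou_thresh=0.5):
    """Hypothetical late fusion: each detection is (box, score). An IR detection
    that overlaps an RGB detection keeps the higher of the two scores; unmatched
    IR detections pass through, so pedestrians visible only in thermal are kept."""
    fused = []
    for ir_box, ir_score in ir_dets:
        best = max((s for b, s in rgb_dets if iou(ir_box, b) >= iou_thresh),
                   default=None)
        fused.append((ir_box, max(ir_score, best) if best is not None else ir_score))
    return fused
```

Keeping unmatched thermal detections is the key design choice for night scenes, where the RGB stream alone may miss pedestrians entirely.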