Search Results

Viewing 1 to 7 of 7
Technical Paper

Observer for Faulty Perception Correction in Autonomous Vehicles

2020-04-14
2020-01-0694
Operation of an autonomous vehicle (AV) carries risk if it acts on inaccurate information about itself or the environment. The perception system is responsible for interpreting the world and providing the results to the path planning and other decision systems. Perception system performance depends on the operating state of the sensors (e.g., whether a sensor is faulty or adversely affected by weather or environmental conditions) and on the approach to sensor measurement interpretation. We propose a trailing-horizon switched-system observer that minimizes the difference between reference tracking values, developed from sensor fusion performed at an upper level, and the values from a potentially faulty sensor, based upon a convex combination of different sensor observation model outputs; the sensor observation models are associated with different sensor operating errors.
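
As an illustrative sketch only (the horizon length, candidate error models, and projected-gradient solver below are assumptions, not the paper's formulation), the convex-combination fit can be posed as a simplex-constrained least-squares problem over a trailing window:

```python
import numpy as np

def project_to_simplex(v):
    """Euclidean projection of v onto the probability simplex."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u > css / (np.arange(len(v)) + 1))[0][-1]
    return np.maximum(v - css[rho] / (rho + 1.0), 0.0)

def fit_convex_weights(reference, model_outputs, steps=500, lr=0.01):
    """Convex weights over candidate sensor observation models that
    minimize the squared residual against the fused reference over a
    trailing horizon. reference: (T,), model_outputs: (K, T)."""
    alpha = np.full(model_outputs.shape[0], 1.0 / model_outputs.shape[0])
    for _ in range(steps):
        residual = alpha @ model_outputs - reference   # (T,)
        grad = model_outputs @ residual                # (K,)
        alpha = project_to_simplex(alpha - lr * grad)  # stay on simplex
    return alpha

# Illustrative horizon: model 0 = healthy, model 1 = biased, model 2 = scaled
t = np.linspace(0, 1, 50)
reference = np.sin(2 * np.pi * t)
models = np.stack([reference, reference + 0.5, 0.5 * reference])
print(fit_convex_weights(reference, models))  # weight concentrates on model 0
```

The recovered weights indicate which error model best explains the sensor's behavior over the window, which is the switching signal the observer acts on.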
Technical Paper

Vehicle Velocity Prediction Using Artificial Neural Network and Effect of Real World Signals on Prediction Window

2020-04-14
2020-01-0729
Prediction of vehicle velocity is important because it enables improvements in fuel economy/energy efficiency, drivability, and safety. Velocity prediction has been addressed in many publications, using both deterministic and stochastic approaches such as Markov chains, autoregressive models, and artificial neural networks. Numerous new sensor and signal technologies, such as vehicle-to-vehicle and vehicle-to-infrastructure communication, can be used to obtain richer datasets; feeding these datasets into deep neural networks enables high-accuracy velocity prediction. This research builds upon previous findings that Long Short-Term Memory (LSTM) deep neural networks provide low-error velocity prediction. We developed an LSTM deep neural network that uses different groups of datasets collected in Fort Collins, Colorado.
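
For readers unfamiliar with this class of model, a minimal LSTM velocity predictor might look like the following; the layer sizes, window length, feature count, and prediction horizon are illustrative assumptions, not the architecture developed in the paper.

```python
import torch
import torch.nn as nn

class VelocityLSTM(nn.Module):
    """Maps a history window of driving signals to a predicted
    future velocity sequence. All dimensions are illustrative."""
    def __init__(self, n_features=4, hidden=64, horizon=10):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, horizon)

    def forward(self, x):                 # x: (batch, window, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])      # (batch, horizon) future velocities

model = VelocityLSTM()
window = torch.randn(8, 30, 4)  # e.g. velocity plus V2V/V2I signals, 30 steps
pred = model(window)            # 10-step-ahead velocity prediction
print(pred.shape)               # torch.Size([8, 10])
```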
Technical Paper

Higher Accuracy and Lower Computational Perception Environment Based Upon a Real-time Dynamic Region of Interest

2022-03-29
2022-01-0078
Robust sensor fusion is a key technology for enabling the safe operation of automated vehicles. Sensor fusion typically takes inputs from cameras, radars, lidar, an inertial measurement unit, and global navigation satellite systems, processes them, and then outputs object detection or positioning data. This paper focuses on sensor fusion between the camera, radar, and vehicle wheel speed sensors, a critical need for near-term realization of sensor fusion benefits. The camera is an off-the-shelf computer vision product from MobilEye and the radar is a Delphi/Aptiv electronically scanning radar (ESR), both of which are connected to a drive-by-wire capable vehicle platform. We utilize the MobilEye and wheel speed sensors to create a dynamic region of interest (DROI) of the drivable region that changes as the vehicle moves through the environment.
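
One plausible reading of a speed-dependent drivable-region crop is sketched below; the flat-ground pinhole geometry and every parameter are assumptions for illustration, not the paper's DROI formulation. The idea is that the ROI's near edge follows the vehicle's stopping distance, so faster travel extends processing toward the horizon.

```python
import numpy as np

def dynamic_roi(speed_mps, img_h=720, img_w=1280,
                horizon_row=360, f_px=1000.0, cam_height=1.4,
                reaction_s=1.5, decel=4.0):
    """Crop bounds for a speed-dependent region of interest.
    Flat-ground pinhole model: a point at range d ahead projects to
    image row horizon_row + f_px * cam_height / d. All defaults
    are illustrative."""
    # look-ahead = reaction distance + braking distance
    d = speed_mps * reaction_s + speed_mps**2 / (2 * decel)
    d = max(d, 5.0)                              # minimum look-ahead
    top = int(horizon_row + f_px * cam_height / d)
    top = int(np.clip(top, horizon_row + 1, img_h - 1))
    return top, img_h, 0, img_w                  # rows [top, img_h), all cols

# Higher speeds need farther look-ahead, so the ROI extends toward the horizon
for v in (5.0, 15.0, 30.0):
    print(v, dynamic_roi(v))
```

Restricting detection and fusion to this crop is what yields the compute savings the title refers to.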
Technical Paper

Vehicle Lateral Offset Estimation Using Infrastructure Information for Reduced Compute Load

2023-04-11
2023-01-0800
Accurate perception of the driving environment and a highly accurate vehicle position are paramount to safe Autonomous Vehicle (AV) operation. AVs gather data about the environment using various sensors. For a robust perception and localization system, incoming data from multiple sensors are usually fused together using advanced computational algorithms, which historically requires a high compute load. To reduce AV compute load and its negative effects on vehicle energy efficiency, we propose a new infrastructure information source (IIS) to provide environmental data to the AV. The new energy-efficient IIS, chip-enabled raised pavement markers, is mounted along road lane lines and communicates a unique identifier and global navigation satellite system position to the AV. This new IIS is incorporated into an energy-efficient sensor fusion strategy that combines its information with that from traditional sensors.
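
As a simplified sketch of how marker positions could yield a lateral offset (the planar coordinates and sign convention are assumptions; the paper's fusion strategy is more involved), the signed distance from the vehicle to the line through two consecutive markers follows from a 2-D cross product:

```python
import numpy as np

def lateral_offset(vehicle_xy, marker_a, marker_b):
    """Signed perpendicular distance from the vehicle to the lane line
    through two chip-enabled marker positions (local planar coordinates
    assumed; positive = vehicle is left of the marker line)."""
    a, b, p = (np.asarray(v, dtype=float)
               for v in (marker_a, marker_b, vehicle_xy))
    line, d = b - a, p - a
    cross = line[0] * d[1] - line[1] * d[0]   # 2-D cross product (signed area)
    return float(cross / np.linalg.norm(line))

# Markers 10 m apart along x; vehicle 1.6 m left of the lane line
print(lateral_offset((5.0, 1.6), (0.0, 0.0), (10.0, 0.0)))  # -> 1.6
```

Because the markers report surveyed GNSS positions, this offset comes from a closed-form calculation rather than a heavy perception pipeline, which is where the compute reduction would come from.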
Technical Paper

Quantitative Resilience Assessment of GPS, IMU, and LiDAR Sensor Fusion for Vehicle Localization Using Resilience Engineering Theory

2023-04-11
2023-01-0576
Practical applications of recently developed sensor fusion algorithms perform poorly in the real world due to a lack of proper evaluation during development. Existing evaluation metrics do not adequately address the wide variety of testing scenarios. This issue can be addressed using proactive performance measures, such as the tools of resilience engineering theory, rather than reactive measures such as root mean square error. Resilience engineering is an established discipline for evaluating proactive performance in complex socio-technical systems that has been underutilized for automated vehicle development and evaluation. In this study, we use resilience engineering metrics to assess the performance of a sensor fusion algorithm for vehicle localization. A Kalman filter is used to fuse GPS, IMU, and LiDAR data for vehicle localization in the CARLA simulator.
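
For context, the predict/update cycle of a linear Kalman filter is sketched below, reduced to a single GPS-like position measurement on a constant-velocity state; the noise covariances and state layout are assumptions, not the study's tuned GPS/IMU/LiDAR filter.

```python
import numpy as np

# Minimal constant-velocity Kalman filter for 1-D position; the study fuses
# GPS, IMU, and LiDAR in CARLA, but the predict/update structure is the same.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition: [pos, vel]
H = np.array([[1.0, 0.0]])                # GPS-like position measurement
Q = np.diag([0.01, 0.1])                  # process noise (assumed)
R = np.array([[2.0]])                     # measurement noise (assumed)

x = np.zeros(2)                           # initial state estimate
P = np.eye(2)                             # initial state covariance

def kf_step(x, P, z):
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    # update with measurement z
    y = z - H @ x                         # innovation
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

for z in (0.2, 0.5, 1.1, 1.4):            # noisy position fixes
    x, P = kf_step(x, P, np.array([z]))
print(x)                                   # fused position/velocity estimate
```

A resilience-engineering evaluation would then probe how estimates like this degrade and recover under disturbances, rather than only scoring the final error.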
Journal Article

Analysis of LiDAR and Camera Data in Real-World Weather Conditions for Autonomous Vehicle Operations

2020-04-14
2020-01-0093
Autonomous vehicle technology has the potential to improve the safety, efficiency, and cost of our current transportation system by removing human error. Development of these vehicles is possible with the sensors available today; however, issues remain with autonomous vehicle operation in adverse weather conditions (e.g., snow-covered roads, heavy rain, fog) due to degraded sensor data quality and insufficiently robust software algorithms. Since autonomous vehicles rely entirely on sensor data to perceive their surrounding environment, this is a significant issue for the performance of the autonomous system. The purpose of this study is to collect sensor data under various weather conditions to understand the effects of weather on sensor data. The sensors used in this study were one camera and one LiDAR, connected to an NVIDIA Drive PX2 operating in a 2019 Kia Niro.
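
A hypothetical way to quantify such weather effects on LiDAR data (the metrics below are illustrative, not the study's published analysis) is to compare paired scans on return count, effective range, and intensity:

```python
import numpy as np

def lidar_degradation_stats(points_clear, points_weather, max_range=100.0):
    """Compare two LiDAR scans (N x 4 arrays: x, y, z, intensity) to
    quantify weather-induced degradation. Metrics are illustrative."""
    def stats(pts):
        rng = np.linalg.norm(pts[:, :3], axis=1)
        valid = rng < max_range
        return valid.sum(), rng[valid].mean(), pts[valid, 3].mean()
    n0, r0, i0 = stats(points_clear)
    n1, r1, i1 = stats(points_weather)
    return {"return_ratio": n1 / n0,          # fewer returns in snow/rain
            "mean_range_ratio": r1 / r0,      # shortened effective range
            "mean_intensity_ratio": i1 / i0}  # attenuated reflectivity

# Synthetic stand-ins for a clear-weather scan and a degraded scan
clear = np.random.rand(20000, 4) * [80, 80, 5, 1]
snowy = clear[::2] * [0.7, 0.7, 1.0, 0.5]
print(lidar_degradation_stats(clear, snowy))
```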
Technical Paper

Real World Use Case Evaluation of Radar Retro-reflectors for Autonomous Vehicle Lane Detection Applications

2024-04-09
2024-01-2042
Lane detection plays a critical role in safe and reliable autonomous vehicle navigation. Lane detection is traditionally accomplished using a camera sensor and computer vision processing. The downside of this traditional technique is that it can be computationally intensive when high-quality images at a fast frame rate are used, and it has reliability issues arising from occlusion and interference such as glare, shadows, and active road construction. This study addresses these issues by exploring alternative methods for lane detection in specific scenarios arising from construction-induced lane shifts and sun glare. Specifically, a camera-based lane detection method using a U-Net (a convolutional network for image segmentation) is compared with a radar-based approach using a new type of sensor previously unused in the autonomous vehicle space: radar retro-reflectors.
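
To make the camera-side approach concrete, a minimal U-Net-style segmentation network is sketched below; the depth, channel counts, and input size are illustrative assumptions rather than the network trained in the study.

```python
import torch
import torch.nn as nn

def block(c_in, c_out):
    """Two 3x3 convolutions with ReLU, the basic U-Net building block."""
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(inplace=True))

class TinyUNet(nn.Module):
    """One-level U-Net for binary lane-mask segmentation."""
    def __init__(self):
        super().__init__()
        self.enc = block(3, 16)
        self.down = nn.MaxPool2d(2)
        self.mid = block(16, 32)
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
        self.dec = block(32, 16)        # takes skip-connected features
        self.out = nn.Conv2d(16, 1, 1)  # per-pixel lane logit

    def forward(self, x):
        e = self.enc(x)
        m = self.mid(self.down(e))
        d = self.dec(torch.cat([self.up(m), e], dim=1))  # skip connection
        return self.out(d)

mask_logits = TinyUNet()(torch.randn(1, 3, 128, 256))
print(mask_logits.shape)  # torch.Size([1, 1, 128, 256])
```

The radar retro-reflector approach sidesteps this per-pixel inference entirely, which is the basis of the comparison the paper draws.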