Analysis of LiDAR and Camera Data in Real-World Weather Conditions for Autonomous Vehicle Operations
SAE Technical Paper 2020-01-0093
Autonomous vehicle technology has the potential to improve the safety and efficiency of the current transportation system, and to reduce its cost, by removing human error. The sensors available today make the development of these vehicles possible; however, autonomous vehicle operation in adverse weather conditions (e.g., snow-covered roads, heavy rain, fog) remains problematic due to the degradation of sensor data quality. Because autonomous vehicles rely entirely on sensor data to perceive their surrounding environment, this degradation significantly limits the performance of the autonomous system. The purpose of this study is to collect sensor data under various weather conditions to understand how weather affects data quality. The sensors used in this study were one camera and one LiDAR, connected to an NVIDIA Drive PX2 operating in a 2019 Kia Niro. Two custom scenarios (static and dynamic) were chosen for data collection in four real-world weather conditions: fair, cloudy, rainy, and snowy. These data were then analyzed with custom detection algorithms written in Python to quantify sensor performance for comparison across weather conditions. The results from these performance algorithms show that sensor data quality degrades by an average of 13.88% for static objects and 16.16% for dynamic objects in these conditions, with rain having the most significant effect on data degradation. From this study, it is hypothesized that advancements in data processing algorithms can improve the usability of this degraded data. In future work, we seek to explore fault-tolerant sensor fusion algorithms that can overcome the effects of adverse weather.
Nick Goberville, Mohammad El-Yabroudi, Mark Omwanas, Johan Rojas, Rick Meyer, Zachary Asher, Ikhlas Abdel-Qader
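As an illustrative sketch only (not reproduced from the paper), the kind of quantitative comparison described in the abstract, measuring how far detection performance in each weather condition falls below the fair-weather baseline, could be expressed in Python roughly as follows. All detection rates and names here are hypothetical placeholders, not the study's data or its actual detection algorithms.

```python
# Hypothetical sketch: percent degradation of detection performance
# relative to a fair-weather baseline. Values below are illustrative only.

def percent_degradation(baseline_rate: float, condition_rate: float) -> float:
    """Percent drop in detection rate relative to the fair-weather baseline."""
    return 100.0 * (baseline_rate - condition_rate) / baseline_rate

# Hypothetical per-condition detection rates (fraction of frames with a
# correct detection); the paper's measured values are not reproduced here.
detection_rates = {
    "fair": 0.97,    # baseline condition
    "cloudy": 0.93,
    "rainy": 0.78,
    "snowy": 0.85,
}

baseline = detection_rates["fair"]
for condition, rate in detection_rates.items():
    if condition == "fair":
        continue
    print(f"{condition}: {percent_degradation(baseline, rate):.2f}% degradation")
```

Averaging such per-condition degradation values over the static and dynamic scenarios would yield aggregate figures comparable in spirit to the 13.88% and 16.16% averages reported in the abstract.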