Understanding How Rain Affects Semantic Segmentation Algorithm Performance
Paper 2020-01-0092
Research interest in autonomous driving has increased significantly in recent years, and several methods have been proposed to optimize the performance of autonomous vehicles. However, adverse weather conditions such as rain, snow, and fog can hinder the performance of autonomy algorithms. It is therefore important to study how the performance of the underlying scene-understanding algorithms varies under such adverse conditions. Semantic segmentation is one of the most widely used scene-understanding techniques in autonomous driving. In this work, we study the performance degradation that rain causes in several semantic segmentation algorithms for off-road driving scenes. Given the limited availability of real-world off-road driving datasets that include rain, we use two types of synthetic data. The first is a purely synthetic rainy dataset, generated with the MAVS simulator, that models rain droplets on the camera lens and is therefore suitable for autonomous vehicles with externally mounted cameras. For the second dataset, we take good-weather imagery and artificially add rain streaks. By varying the simulated rain rate, we quantify algorithm performance and observe severe degradation as rain density increases. We also propose and analyze two methods for obtaining robust segmentation performance in both clear and rainy weather.
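The abstract does not specify how rain streaks are synthesized, but the general idea of adding simulated streaks to clear-weather imagery can be sketched as follows. This is a minimal illustrative example, not the paper's actual pipeline: it seeds random droplet locations, smears each one along a diagonal to form streaks, and additively blends the result into the image. The function name, parameters, and streak model are all assumptions made for illustration.

```python
import numpy as np

def add_rain_streaks(img, density=0.002, length=12, intensity=200.0, seed=0):
    """Overlay simple diagonal rain streaks on an (H, W, 3) uint8 image.

    This is a toy model: real rain rendering would account for streak
    orientation, depth, motion blur, and atmospheric attenuation.
    """
    rng = np.random.default_rng(seed)
    h, w = img.shape[:2]

    # Seed random droplet positions; `density` is the fraction of pixels
    # that start a streak (acts as a crude proxy for rain rate).
    drops = (rng.random((h, w)) < density).astype(np.float32)

    # Smear each droplet along a downward diagonal, fading with distance,
    # to produce streak-like traces.
    streaks = np.zeros((h, w), dtype=np.float32)
    for d in range(length):
        shifted = np.roll(drops, shift=(d, d // 2), axis=(0, 1))
        streaks = np.maximum(streaks, shifted * (1.0 - d / length))

    # Additively brighten the image where streaks fall, then clip to uint8.
    out = img.astype(np.float32) + intensity * streaks[..., None]
    return np.clip(out, 0, 255).astype(np.uint8)
```

Increasing `density` mimics heavier rain, which is the knob a study like this would sweep to measure segmentation degradation as a function of rain rate.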