2020-04-14

Autonomous Vehicle Multi-Sensors Localization in Unstructured Environment (2020-01-1029)

Autonomous driving in unstructured environments is a significant challenge because important localization cues, such as lane markings, are inconsistent. To reduce the uncertainty of vehicle localization in such environments, sensor fusion of LiDAR, Radar, Camera, GPS/IMU, and Odometry sensors is utilized. This paper discusses a hybrid localization technique developed using LiDAR-based Simultaneous Localization and Mapping (SLAM), GPS/IMU and Odometry data, and object lists from the Radar and Camera sensors. An Extended Kalman Filter (EKF) fuses the data from all sensors in two phases. In the preliminary stage, the SLAM-based vehicle coordinates are fused with the GPS-based positioning. The output of this stage is then fused with the object-based localization. This approach was successfully tested on FEV’s Smart Vehicle Demonstrator at FEV’s headquarters, which represents a complicated test environment with dynamic and static objects. The test results show that multi-sensor fusion improves the vehicle’s localization compared to GPS or LiDAR alone.
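
To make the two-stage fusion concrete, the following is a minimal illustrative sketch, not the authors' implementation: a planar constant-velocity Kalman filter in Python whose predicted position is corrected first by a fused SLAM/GPS fix and then by an object-list-based estimate. The state definition, noise covariances, and measurement values are all assumptions for illustration; the paper's actual EKF motion and measurement models are not given in the abstract.

```python
import numpy as np

# Illustrative two-stage fusion sketch (assumed models, not from the paper).
# State x = [px, py, vx, vy]; both measurement stages observe position only.

dt = 0.1
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]])           # constant-velocity motion model (assumed)
Q = np.diag([0.05, 0.05, 0.1, 0.1])     # process noise covariance (assumed)
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]])            # position-only measurement model

def predict(x, P):
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z, R):
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

# One fusion cycle with example (assumed) measurements
x = np.zeros(4)
P = np.eye(4)

x, P = predict(x, P)
# Stage 1: fuse SLAM-based coordinates with GPS/IMU and odometry positioning
z_slam_gps = np.array([10.2, 4.9])
x, P = update(x, P, z_slam_gps, R=np.diag([0.5, 0.5]))
# Stage 2: refine with object-based localization from Radar/Camera object lists
z_objects = np.array([10.0, 5.1])
x, P = update(x, P, z_objects, R=np.diag([0.3, 0.3]))

print("fused position:", x[:2])
```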
