Creating 3D Virtual Driving Environments for Simulation-Aided Development of Autonomous Driving and Active Safety (SAE Technical Paper 2017-01-0107)
Recreating traffic scenarios for testing autonomous driving in the real world requires significant time, resources, and expense, and can pose a safety risk when hazardous scenarios are tested. Using a 3D virtual environment to test many of these traffic scenarios on the desktop or on a cluster significantly reduces the number of required road tests. To facilitate the development of perception and control algorithms for Level 4 autonomy, a shared memory interface between MATLAB, Simulink, and Unreal Engine 4 can send information (such as vehicle control signals) back to the virtual environment. The shared memory interface conveys arbitrary numerical data, RGB image data, and point cloud data for the simulation of LiDAR sensors. The interface consists of a plugin for Unreal Engine, which contains the necessary read/write functions, and a beta toolbox for MATLAB that reads from and writes to the same shared memory locations from MATLAB and Simulink. The LiDAR sensor model was tested by generating point clouds with beam patterns that mimic the Velodyne HDL-32E (32-beam) sensor, and it is demonstrated to run at frame rates sufficient for real-time computation by leveraging the Graphics Processing Unit (GPU).