Vision Based Object Distance Estimation
2017-01-0109
This work describes a single-camera object distance estimation system. As vehicle technology advances toward autonomy, knowing the locations of objects in 3D space is critical for safe vehicle behavior. While significant progress has been made on object detection in the 2D sensor space of a single camera, this work additionally estimates the distance to detected objects without requiring stereo vision or absolute knowledge of vehicle motion. Specifically, the proposed system comprises three modules: vision-based ego-motion estimation, object detection, and distance estimation. In particular, we compensate for vehicle ego-motion using the pinhole camera model to increase the accuracy of the distance estimates. In the ego-motion estimation stage, the system uses the state-of-the-art Oriented FAST and Rotated BRIEF (ORB) feature detector and descriptor to robustly establish feature correspondences between consecutive image frames. Six-degrees-of-freedom ego-motion is then estimated by decomposing the essential matrix computed from these correspondences, and the estimates are further refined by bundle adjustment within a local temporal window. Finally, a deep neural network (DNN) performs object detection, and the distance to each detected object is estimated with the pinhole camera model. The proposed mono-camera system yields reliable distance estimates at low cost and with small overall data throughput. Accuracy can be further improved by fusing additional sensors (e.g., radar, lidar, ultrasonic).
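As a concrete illustration of the pinhole-model distance step, the sketch below recovers longitudinal distance from the image row of an object's ground-contact point under a flat-ground assumption. This is a minimal sketch, not the paper's actual formulation: the function name and all calibration values (focal length, principal point, camera mounting height) are hypothetical, and the ego-motion compensation described in the abstract is omitted.

```python
# Flat-ground pinhole distance sketch (assumed model, hypothetical values).
# With a forward-looking camera mounted at height h_cam above a flat road,
# a point on the ground imaged at pixel row y satisfies
#     Z = f * h_cam / (y - cy)
# where f is the focal length in pixels and cy the principal-point row.

def pinhole_ground_distance(y_px, f_px, cy_px, cam_height_m):
    """Longitudinal distance (m) to a ground-plane point from its image row."""
    dy = y_px - cy_px
    if dy <= 0:
        # Rows at or above the principal point map to the horizon or sky.
        raise ValueError("ground-contact row must lie below the principal point")
    return f_px * cam_height_m / dy

# Hypothetical calibration: f = 1000 px, cy = 360 px, camera 1.5 m above
# the road; a detected object's bounding-box bottom edge at row 460.
z = pinhole_ground_distance(460.0, 1000.0, 360.0, 1.5)
# z = 1000 * 1.5 / (460 - 360) = 15.0 m
```

In practice the bounding-box bottom edge from the detector serves as the ground-contact row, which is one reason the abstract's ego-motion compensation matters: camera pitch changes between frames shift the effective principal-point row and directly bias this estimate.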