Joint Calibration of Dual LiDARs and Camera using a Circular Chessboard 2020-01-0098
Environment perception is a crucial subsystem in autonomous vehicles. To enable safe and efficient road transportation, numerous studies have proposed accurate, robust, and real-time perception systems. Cameras and LiDARs are widely mounted on self-driving cars, and many algorithms have been developed for them in recent years. Fusion of camera and LiDAR data provides state-of-the-art environmental perception by compensating for the limitations of any single vehicular sensor. Extrinsic parameter calibration aligns the coordinate systems of the sensors and has therefore drawn considerable attention. However, unlike the spatial alignment of two sensors' data, joint calibration of multiple sensors (three or more devices) must balance the degree of alignment among all of them. In this paper, we assemble a test platform composed of dual LiDARs and a monocular camera, matching the sensing hardware architecture of the intelligent sweeper designed by our laboratory, and we propose a corresponding joint calibration method using a circular chessboard. The center of the circular chessboard is detected in the camera image to obtain pixel coordinates and in each LiDAR point cloud to obtain 3D coordinates. The calibration problem is thereby converted into a 3D-2D PnP matching problem: the chessboard centers serve as corresponding points that construct the geometric constraints yielding initial calibration values. A global loss function is then carefully designed for Levenberg-Marquardt nonlinear optimization to obtain the final calibration parameters, and the extrinsic parameters between any two sensors are acquired simultaneously. Experimental results show that the proposed method is suitable for the joint calibration of a fusion system composed of LiDARs and a camera, and that the calibration results have high accuracy and stability.
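As a minimal illustration of the refinement step the abstract describes, the sketch below minimizes the summed reprojection error of 3D-2D center correspondences with a damped Gauss-Newton (Levenberg-Marquardt-style) update. It is a simplified stand-in for the paper's method: only the translation of the extrinsics is refined, the rotation is held fixed, the damping factor is constant rather than adaptive, and all variable names, the pinhole model, and the synthetic correspondences are assumptions for illustration (in practice the initial value would come from a PnP solver such as OpenCV's `solvePnP`).

```python
import numpy as np

def project(K, R, t, pts3d):
    """Project 3D points (N,3) into pixels with a pinhole camera model."""
    pc = pts3d @ R.T + t          # transform into the camera frame
    uv = pc @ K.T                 # apply intrinsics
    return uv[:, :2] / uv[:, 2:3] # perspective division

def refine_translation_lm(K, R, t0, pts3d, pts2d, iters=50, lam=1e-3):
    """Refine the translation part of the extrinsics by minimizing the
    summed reprojection error with damped Gauss-Newton steps.
    Rotation is held fixed and the damping lam is constant for brevity;
    a full LM solver would also parametrize R and adapt lam."""
    t = t0.copy()
    for _ in range(iters):
        r = (project(K, R, t, pts3d) - pts2d).ravel()  # residual vector
        # numerical Jacobian of the residuals w.r.t. the 3 translation params
        J = np.zeros((r.size, 3))
        eps = 1e-6
        for j in range(3):
            dt = np.zeros(3)
            dt[j] = eps
            J[:, j] = ((project(K, R, t + dt, pts3d) - pts2d).ravel() - r) / eps
        H = J.T @ J + lam * np.eye(3)          # damped normal equations
        t = t + np.linalg.solve(H, -J.T @ r)   # LM-style update
    return t

# Synthetic check: recover a known translation from noiseless correspondences.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
R = np.eye(3)
t_true = np.array([0.1, -0.05, 2.0])
rng = np.random.default_rng(0)
pts3d = rng.uniform(-1.0, 1.0, (12, 3))
pts3d[:, 2] += 5.0                              # keep points in front of camera
pts2d = project(K, R, t_true, pts3d)
t_est = refine_translation_lm(K, R, t_true + np.array([0.3, -0.2, 0.4]),
                              pts3d, pts2d)
print(np.round(t_est, 4))
```

In the paper's setting, one such residual term per sensor pair would be summed into the global loss, so that the optimizer balances alignment quality across all three devices simultaneously.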
Zhenwen Deng, Lu Xiong, Dong Yin, Fengwu Shan