Joint Calibration of Dual LiDARs and Camera Using a Circular Chessboard (2020-01-0098)
Environmental perception is a crucial subsystem of autonomous vehicles. To enable safe and efficient transportation, many approaches have been proposed to build accurate, robust, and real-time perception systems. Cameras and LiDARs are widely deployed on autonomous vehicles, and many algorithms for them have been developed in recent years. Fusing camera and LiDAR data yields state-of-the-art environmental perception by compensating for the shortcomings of any single vehicular sensor. Extrinsic calibration aligns the coordinate systems of the sensors and has therefore attracted considerable attention. However, unlike the spatial alignment of two sensors' data, joint calibration of more than two sensors must balance the degree of alignment between every pair of sensors. In this paper, we assemble a test platform consisting of dual LiDARs and one monocular camera, with the same sensing hardware architecture as the intelligent sweeper designed by our laboratory, and we propose a corresponding joint calibration method based on a circular chessboard. The center of the circular chessboard is detected in the camera image to obtain its pixel coordinates and in each LiDAR point cloud to obtain its 3D coordinates. Calibration is thereby converted into a 3D-2D PnP matching problem: the chessboard centers serve as corresponding points that provide the geometric constraints from which initial calibration values are computed. A global loss function is then carefully designed for Levenberg-Marquardt nonlinear optimization to obtain the final calibration parameters, so that the extrinsic parameters between every pair of sensors are estimated simultaneously. Experimental results show that the proposed method is well suited to the joint calibration of a fusion system composed of LiDARs and a camera, and that the calibration results are highly accurate and stable.
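The PnP-plus-Levenberg-Marquardt refinement described in the abstract can be sketched as follows. This is an illustrative toy, not the authors' implementation: it uses synthetic chessboard-center correspondences, an assumed pinhole intrinsic matrix `K`, a simple reprojection-error loss for a single LiDAR-camera pair (the paper's global loss couples all sensor pairs), and a numerical-Jacobian LM loop.

```python
import numpy as np

def rodrigues(rvec):
    """Axis-angle vector -> 3x3 rotation matrix (Rodrigues' formula)."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def project(params, pts3d, K):
    """Project LiDAR-frame 3D points into pixels via extrinsics (rvec|t) and K."""
    pc = pts3d @ rodrigues(params[:3]).T + params[3:]
    uvw = pc @ K.T
    return uvw[:, :2] / uvw[:, 2:3]

def residuals(params, pts3d, uv, K):
    """Stacked 2D reprojection errors for all correspondences."""
    return (project(params, pts3d, K) - uv).ravel()

def calibrate_lm(pts3d, uv, K, x0, iters=50):
    """Levenberg-Marquardt over the reprojection loss, numerical Jacobian."""
    x, lam, eps = x0.astype(float).copy(), 1e-3, 1e-7
    r = residuals(x, pts3d, uv, K)
    for _ in range(iters):
        J = np.empty((r.size, x.size))
        for j in range(x.size):          # forward-difference Jacobian column
            dx = np.zeros_like(x)
            dx[j] = eps
            J[:, j] = (residuals(x + dx, pts3d, uv, K) - r) / eps
        step = np.linalg.solve(J.T @ J + lam * np.eye(x.size), -J.T @ r)
        r_new = residuals(x + step, pts3d, uv, K)
        if r_new @ r_new < r @ r:        # accept step, soften damping
            x, r, lam = x + step, r_new, lam * 0.5
        else:                            # reject step, harden damping
            lam *= 10.0
    return x, np.sqrt(np.mean(r ** 2))

# Synthetic demo: 20 chessboard-center observations, noiseless for clarity.
rng = np.random.default_rng(0)
K = np.array([[800.0, 0.0, 320.0],       # assumed pinhole intrinsics
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
true_params = np.array([0.03, -0.02, 0.01, 0.15, -0.05, 0.20])  # rvec | t
pts3d = np.column_stack([rng.uniform(-1, 1, 20),
                         rng.uniform(-1, 1, 20),
                         rng.uniform(3, 8, 20)])                # 3-8 m ahead
uv = project(true_params, pts3d, K)      # "detected" pixel coordinates

est, rms = calibrate_lm(pts3d, uv, K, x0=np.zeros(6))
```

With a reasonable initial value (here the zero pose; in the paper this comes from the PnP solution) the LM iterations drive the reprojection RMS to near machine precision on noiseless data, recovering the simulated extrinsics.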