A joint calibration system of lidar and binocular cameras based on an improved calibration board and DON algorithm

Bibliographic Details
Main Authors: Huang, Zhibin; Chen, Guochu; Wang, Xichao; Li, Baojiang
Format: Conference Proceeding
Language: English
Published: SPIE, 03.02.2023
ISBN: 9781510661363; 1510661360
ISSN: 0277-786X
DOI: 10.1117/12.2660421

More Information
Summary: For the calibration of lidar and vision, point cloud data is irregular and noisy, and outliers caused by occlusion from a simple calibration plate must be removed. A joint calibration method for binocular cameras and lidar based on an improved calibration plate and the DON algorithm is proposed. First, circular bulges of different sizes are added to the rectangular calibration plate, and the point cloud data is filtered and segmented by the DON (Difference of Normals) algorithm. In addition, the random sample consensus (RANSAC) algorithm is used to estimate the plane and edge parameters of the triangular plate from the retained point clouds, yielding the three-dimensional positions of its vertices. Finally, the projection matrix between the camera and the lidar is estimated from 2D-3D corresponding points at different positions, and the projection errors and root mean square errors for different frames and numbers of corresponding points are calculated. The results show that the average error over 100 frames is 5.3% lower than that of a single frame, and the root mean square error (RMSE) of the method is 1.415 cm. Comparison with other advanced methods verifies the reliability and superiority of the proposed approach.
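As a rough sketch of the pipeline the summary describes, the Python fragment below chains DON-style filtering, RANSAC plane fitting, and 2D-3D projection estimation using Open3D and OpenCV. All function names, radii, thresholds, and input arrays are hypothetical placeholders rather than the authors' implementation, and the RMSE computed here is a pixel reprojection error, whereas the paper's 1.415 cm figure is a metric error.

```python
# Minimal sketch of the described pipeline; hypothetical parameters,
# built on Open3D and OpenCV rather than the authors' code.
import numpy as np
import open3d as o3d
import cv2

def don_filter(pcd, small_r=0.05, large_r=0.20, threshold=0.25):
    """Difference of Normals: estimate normals at two support radii and
    keep points where they disagree (edges and bulges on the plate)."""
    pcd.estimate_normals(o3d.geometry.KDTreeSearchParamRadius(small_r))
    pcd.orient_normals_towards_camera_location()  # reduce normal sign flips
    n_small = np.asarray(pcd.normals).copy()
    pcd.estimate_normals(o3d.geometry.KDTreeSearchParamRadius(large_r))
    pcd.orient_normals_towards_camera_location()
    n_large = np.asarray(pcd.normals)
    don = 0.5 * np.linalg.norm(n_small - n_large, axis=1)  # DoN magnitude
    return pcd.select_by_index(np.where(don > threshold)[0])

def fit_plate_plane(pcd, dist_thresh=0.01):
    """RANSAC plane fit on the retained cloud; returns the plane
    coefficients (a, b, c, d) and the inlier sub-cloud, from which
    edge lines and vertices would then be estimated."""
    plane, inliers = pcd.segment_plane(distance_threshold=dist_thresh,
                                       ransac_n=3, num_iterations=1000)
    return plane, pcd.select_by_index(inliers)

def estimate_projection(pts3d, pts2d, K, dist):
    """Solve the camera-lidar pose from 2D-3D vertex correspondences
    and report the pixel reprojection RMSE."""
    ok, rvec, tvec = cv2.solvePnP(pts3d.astype(np.float32),
                                  pts2d.astype(np.float32), K, dist)
    proj, _ = cv2.projectPoints(pts3d.astype(np.float32),
                                rvec, tvec, K, dist)
    err = proj.reshape(-1, 2) - pts2d
    return rvec, tvec, np.sqrt(np.mean(np.sum(err ** 2, axis=1)))
```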
Bibliography:Conference Location: Hulun Buir, China
Conference Date: 2022-08-19 to 2022-08-21