Motion Guided LiDAR-Camera Self-calibration and Accelerated Depth Upsampling for Autonomous Vehicles

Bibliographic Details
Published in: Journal of Intelligent & Robotic Systems, Vol. 100, No. 3-4, pp. 1129-1138
Main Authors: Castorena, Juan; Puskorius, Gintaras V.; Pandey, Gaurav
Format: Journal Article
Language: English
Published: Dordrecht: Springer Netherlands, 01.12.2020
ISSN: 0921-0296; 1573-0409
DOI: 10.1007/s10846-020-01233-w

More Information
Summary: This work proposes a novel motion-guided method for targetless self-calibration of a LiDAR and a camera and uses the re-projection of LiDAR points onto the image reference frame for real-time depth upsampling. The calibration parameters are estimated by optimizing an objective function that penalizes distances between 2D and re-projected 3D motion vectors obtained from time-synchronized image and point-cloud sequences. For upsampling, a simple yet effective and time-efficient formulation is proposed that minimizes depth gradients subject to an equality constraint involving the LiDAR measurements. Validation is performed on real data recorded in urban environments, demonstrating that the two methods are effective and suitable for mobile robotics and autonomous vehicle applications with real-time requirements.
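
A minimal sketch of the two formulations described in the summary, with all symbols introduced here for illustration (the paper's own notation and exact cost terms are not reproduced in this record): the calibration can be read as estimating the LiDAR-to-camera rigid-body transform T = (R, t) by penalizing the distance between each 2D image motion vector and the projection of its matched 3D point-cloud motion vector, and the upsampling as a gradient-minimizing depth reconstruction constrained to agree with the re-projected LiDAR returns.

    \hat{T} = \arg\min_{R,\,t} \sum_i \Big\| v_i - \big( \pi(R\,p_i^{t+1} + t) - \pi(R\,p_i^{t} + t) \big) \Big\|_2^2

    \min_{d} \ \| \nabla d \| \quad \text{s.t.} \quad d(q_k) = z_k \ \ \text{for every pixel } q_k \text{ hit by a re-projected LiDAR point}

Here v_i is a 2D motion vector from the image sequence, p_i^{t} and p_i^{t+1} are the endpoints of the corresponding 3D motion vector from the point-cloud sequence, \pi(\cdot) is the camera projection, d is the dense depth map with spatial gradient \nabla d, and z_k are the measured LiDAR depths. The choice of norm on \nabla d and the squared penalty on the motion-vector distances are assumptions of this sketch, not details taken from the abstract.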