Improving RGB Image Consistency for Depth-Camera based Reconstruction through Image Warping

Bibliographic Details
Published in: Journal of WSCG, Vol. 28, No. 1-2, pp. 105-113
Main Authors: Reyes-Aviles, Fernando; Fleck, Philipp; Schmalstieg, Dieter; Arth, Clemens
Format: Journal Article
Language: English
Published: Plzen: Union Agency, 2020
ISSN: 1213-6972, 1213-6980, 1213-6964
DOI: 10.24132/JWSCG.2020.28.13

Summary: Indoor reconstruction using depth camera algorithms (e.g., InfiniTAMv3) is becoming increasingly popular. Simple reconstruction methods use only the frames of the depth camera, leaving any imagery from the adjunct RGB camera untouched. Recent approaches also incorporate color camera information to improve consistency. However, the results depend heavily on the accuracy of the rig calibration, which can vary strongly in quality. Unfortunately, any errors in the rig calibration lead to visible discrepancies when the 3D reconstruction is colorized. We propose a simple approach to fix this issue for the purpose of image-based rendering. We show that a relatively simple warping function can be calculated from a 3D checkerboard pattern for a rig with poor calibration between cameras. The warping is applied to the RGB images online during reconstruction, leading to a significantly improved visual result.
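The abstract describes the method only at a high level: a warping function is estimated from a checkerboard target and then applied to every incoming RGB frame during reconstruction. The minimal sketch below (Python with OpenCV) illustrates one way such a per-frame warp could be estimated and applied; the planar checkerboard, the single-homography warp model, and all function and parameter names are assumptions made for illustration, not the authors' implementation (the paper derives its warp from a 3D checkerboard pattern and may fit a different warping function).

    # Illustrative sketch only. Assumptions: a planar checkerboard visible to both
    # cameras, a single homography as the warp model, and OpenCV as the toolkit.
    import cv2

    PATTERN = (9, 6)  # assumed inner-corner count of the checkerboard

    def estimate_warp(depth_ir_img, rgb_img):
        """Fit a homography mapping RGB pixels onto the depth camera's image plane."""
        ok_d, corners_d = cv2.findChessboardCorners(depth_ir_img, PATTERN)
        ok_c, corners_c = cv2.findChessboardCorners(rgb_img, PATTERN)
        if not (ok_d and ok_c):
            raise RuntimeError("checkerboard not found in both views")
        H, _ = cv2.findHomography(corners_c, corners_d, cv2.RANSAC)
        return H

    def warp_rgb_frame(rgb_frame, H, depth_size):
        """Re-warp an incoming RGB frame before it is used to colorize the model."""
        width, height = depth_size
        return cv2.warpPerspective(rgb_frame, H, (width, height))

    # The warp is estimated once from a calibration capture of the checkerboard and
    # then applied to every RGB frame online during reconstruction.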