Vox-Fusion: Dense Tracking and Mapping with Voxel-based Neural Implicit Representation

Bibliographic Details
Published in: 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 499-507
Main Authors: Yang, Xingrui; Li, Hai; Zhai, Hongjia; Ming, Yuhang; Liu, Yuqian; Zhang, Guofeng
Format: Conference Proceeding
Language: English
Published: IEEE, 01.10.2022
DOI: 10.1109/ISMAR55827.2022.00066

Summary: In this work, we present a dense tracking and mapping system named Vox-Fusion, which seamlessly fuses neural implicit representations with traditional volumetric fusion methods. Our approach is inspired by the recently developed implicit mapping and positioning system and extends the idea so that it can be freely applied to practical scenarios. Specifically, we leverage a voxel-based neural implicit surface representation to encode and optimize the scene inside each voxel. Furthermore, we adopt an octree-based structure to divide the scene and support dynamic expansion, enabling our system to track and map arbitrary scenes without prior knowledge of the environment, which previous works require. Moreover, we propose a high-performance multi-process framework that speeds up the method and supports applications demanding real-time performance. Evaluation results show that our method achieves better accuracy and completeness than previous methods. We also show that Vox-Fusion can be used in augmented reality and virtual reality applications. Our source code is publicly available at https://github.com/zju3dv/Vox-Fusion.
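
The core mechanism the summary describes, allocating voxel embeddings on demand so the map can expand into unseen space, can be illustrated with a minimal Python sketch. This is a hypothetical simplification, not the authors' implementation: the real system uses an octree and a shared decoder network (see the linked repository), and all names below (SparseVoxelMap, allocate, voxel_size, embed_dim) are invented for illustration.

import numpy as np

class SparseVoxelMap:
    """Minimal sketch of dynamic voxel allocation (hypothetical, not
    the Vox-Fusion implementation). World space is divided into
    fixed-size voxels that are created on demand as new observations
    arrive, so no scene bounds need to be known in advance."""

    def __init__(self, voxel_size=0.2, embed_dim=8):
        self.voxel_size = voxel_size
        self.embed_dim = embed_dim
        # Integer voxel coordinates -> per-voxel latent embedding that
        # a shared decoder network would map to SDF/color values.
        self.voxels = {}

    def allocate(self, points):
        """Allocate voxels covering an (N, 3) array of world points,
        e.g. back-projected from a new depth frame."""
        keys = np.floor(points / self.voxel_size).astype(np.int64)
        for key in map(tuple, keys):
            if key not in self.voxels:
                # Unseen region: expand the map dynamically.
                self.voxels[key] = np.zeros(self.embed_dim, dtype=np.float32)
        return len(self.voxels)

# Usage: allocate voxels for synthetic sample points in a 2 m cube.
vmap = SparseVoxelMap()
pts = np.random.uniform(-1.0, 1.0, size=(1000, 3))
print(vmap.allocate(pts), "voxels allocated")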