Wide-area scene mapping for mobile visual tracking
Published in | 2012 IEEE International Symposium on Mixed and Augmented Reality, pp. 3-12
---|---
Main Authors | Ventura, Jonathan; Höllerer, Tobias
Format | Conference Proceeding |
Language | English |
Published | IEEE, 01.11.2012
ISBN | 9781467346603; 1467346608
DOI | 10.1109/ISMAR.2012.6402531 |
Summary | We propose a system for easily preparing arbitrary wide-area environments for subsequent real-time tracking with a handheld device. Our system evaluation shows that minimal user effort is required to initialize a camera tracking session in an unprepared environment. We combine panoramas captured using a handheld omnidirectional camera from several viewpoints to create a point cloud model. After the offline modeling step, live camera pose tracking is initialized by feature point matching and continuously updated by aligning the point cloud model to the camera image. Given a reconstruction made from less than five minutes of video, we achieve below 25 cm translational error and 0.5 degrees rotational error for over 80% of images tested. In contrast to camera-based simultaneous localization and mapping (SLAM) systems, our methods are suitable for handheld use in large outdoor spaces.
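The pose-initialization step described in the summary, matching image features against a point cloud model and solving for the camera pose, can be sketched with off-the-shelf tools. The following is a minimal illustration, not the authors' implementation: it assumes OpenCV and NumPy, substitutes a synthetic point cloud and synthetic (partly corrupted) matches for the paper's panorama-based reconstruction, and recovers the pose with RANSAC-based PnP, reporting errors in the same units as the paper's evaluation.

```python
# Minimal sketch of pose initialization from 2D-3D correspondences
# (not the paper's implementation). Requires OpenCV and NumPy.
import cv2
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the offline point cloud model (world coords, meters).
model_points = rng.uniform(-5.0, 5.0, size=(200, 3))
model_points[:, 2] += 10.0  # keep the points in front of the camera

# Ground-truth camera pose, used only to synthesize "matched" image features.
rvec_true = np.array([[0.05], [0.10], [0.02]])
tvec_true = np.array([[0.20], [-0.10], [0.50]])
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume no lens distortion

image_points, _ = cv2.projectPoints(model_points, rvec_true, tvec_true, K, dist)
image_points = image_points.reshape(-1, 2)
image_points += rng.normal(scale=0.5, size=image_points.shape)  # pixel noise
image_points[:10] = rng.uniform(0.0, 640.0, size=(10, 2))  # simulated bad matches

# Pose initialization: robustly fit the camera pose to the 2D-3D matches.
# RANSAC rejects the outlier matches a real feature matcher would produce.
ok, rvec, tvec, inliers = cv2.solvePnPRansac(
    model_points, image_points, K, dist, reprojectionError=3.0)

# Report errors in the same terms as the paper's evaluation
# (translational error in cm, rotational error in degrees).
t_err_cm = 100.0 * np.linalg.norm(tvec - tvec_true)
R_est, _ = cv2.Rodrigues(rvec)
R_true, _ = cv2.Rodrigues(rvec_true)
cos_angle = np.clip((np.trace(R_est.T @ R_true) - 1.0) / 2.0, -1.0, 1.0)
r_err_deg = np.degrees(np.arccos(cos_angle))
print(f"inliers: {len(inliers)}, translation error: {t_err_cm:.1f} cm, "
      f"rotation error: {r_err_deg:.2f} deg")
```

With clean synthetic data the recovered pose lands well inside the 25 cm / 0.5 degree bounds quoted in the summary; on real imagery the accuracy is dominated by the quality of the reconstruction and of the descriptor matches, which is what the paper's offline modeling step addresses.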