An Efficient Dense Stereo Matching Method for Planetary Rover

Bibliographic Details
Published in: IEEE Access, Vol. 7, pp. 48551-48564
Main Authors: Li, Haichao; Chen, Liang; Li, Feng
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2019
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2910194


More Information
Summary: Stereo matching is one of the most important and challenging problems for planetary rovers equipped with stereo vision systems. The generated disparity map enables a rover to avoid obstacles and explore the planetary surface autonomously. In this paper, we propose an efficient dense stereo matching method to generate disparity maps for a planetary rover, which relies on 3-D plane fitting, adaptive penalties, and a coarse-to-fine disparity constraint. To achieve efficient stereo matching at the coarsest level of the pyramid, we present a 3-D plane fitting step that reduces the disparity search range and propose adaptive penalties in the more-global matching method to obtain an accurate disparity map. At the finer levels, our method estimates the disparity search range from the coarse-to-fine disparity constraint and again applies adaptive penalties to obtain an accurate disparity map. Extensive experiments with stereo images from the Chang'e-3 rover demonstrate that our approach generates disparity maps efficiently and accurately compared with state-of-the-art semi-global matching methods, especially in low-texture regions, at depth discontinuities, and in occluded regions.
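
The summary describes a pyramid scheme in which a coarse disparity map constrains the per-pixel search range at the next finer level. The sketch below illustrates that coarse-to-fine disparity constraint only, under stated assumptions: it substitutes plain SAD block matching for the authors' 3-D plane fitting and adaptive-penalty more-global matching, and all function names, parameters (levels, margin, window size), and the synthetic test images are hypothetical, not taken from the paper.

# Minimal sketch of a coarse-to-fine disparity constraint (assumed, not the paper's method):
# the coarsest level is searched over the full range, and each finer level searches only
# a small margin around the upsampled coarse disparity.
import numpy as np

def downsample(img, levels):
    """Build an image pyramid by 2x2 averaging (finest level first, coarsest last)."""
    pyr = [img.astype(np.float32)]
    for _ in range(levels - 1):
        h, w = pyr[-1].shape
        cropped = pyr[-1][: h // 2 * 2, : w // 2 * 2]
        pyr.append(cropped.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3)))
    return pyr

def match_level(left, right, d_min, d_max, win=3):
    """Winner-take-all SAD block matching over a per-pixel disparity range."""
    h, w = left.shape
    disp = np.zeros((h, w), dtype=np.int32)
    pad = win // 2
    for y in range(pad, h - pad):
        for x in range(pad, w - pad):
            lo, hi = int(d_min[y, x]), int(d_max[y, x])
            best_cost, best_d = np.inf, lo
            patch_l = left[y - pad:y + pad + 1, x - pad:x + pad + 1]
            for d in range(lo, hi + 1):
                if x - d - pad < 0:          # candidate falls off the right image
                    break
                patch_r = right[y - pad:y + pad + 1, x - d - pad:x - d + pad + 1]
                cost = np.abs(patch_l - patch_r).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

def coarse_to_fine(left, right, levels=3, d_max_coarse=16, margin=2):
    """Estimate disparity on a pyramid; the coarse result bounds the finer search."""
    pyr_l, pyr_r = downsample(left, levels), downsample(right, levels)
    # Full disparity search range only at the coarsest level.
    h, w = pyr_l[-1].shape
    d_lo = np.zeros((h, w), dtype=np.int32)
    d_hi = np.full((h, w), d_max_coarse, dtype=np.int32)
    disp = match_level(pyr_l[-1], pyr_r[-1], d_lo, d_hi)
    for lvl in range(levels - 2, -1, -1):
        h, w = pyr_l[lvl].shape
        # Upsample the coarse disparity and double it (disparity scales with image width).
        up = np.repeat(np.repeat(disp, 2, axis=0), 2, axis=1) * 2
        up = np.pad(up, ((0, max(0, h - up.shape[0])),
                         (0, max(0, w - up.shape[1]))), mode="edge")[:h, :w]
        d_lo = np.clip(up - margin, 0, None)   # constrained search range at this level
        d_hi = up + margin
        disp = match_level(pyr_l[lvl], pyr_r[lvl], d_lo, d_hi)
    return disp

# Example usage with synthetic stand-ins for rectified rover imagery:
# a 4-pixel horizontal shift should be recovered as disparity ~4.
rng = np.random.default_rng(0)
left = rng.random((64, 96))
right = np.roll(left, -4, axis=1)
disparity = coarse_to_fine(left, right, levels=3, d_max_coarse=8)

The constrained range is what makes the scheme efficient: instead of evaluating every disparity at full resolution, each finer level tests only 2*margin + 1 candidates per pixel around the propagated coarse estimate.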
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2910194