Multi‐view stereo for weakly textured indoor 3D reconstruction


Bibliographic Details
Published in: Computer-Aided Civil and Infrastructure Engineering, Vol. 39, no. 10, pp. 1469-1489
Main Authors: Wang, Tao; Gan, Vincent J. L.
Format: Journal Article
Language: English
Published: Hoboken: Wiley Subscription Services, Inc., 01.05.2024
ISSN: 1093-9687, 1467-8667
DOI: 10.1111/mice.13149


More Information
Summary: A 3D reconstruction enables an effective geometric representation to support various applications. Recently, learning-based multi-view stereo (MVS) algorithms have emerged, replacing conventional hand-crafted features with convolutional neural network-encoded deep representations to reduce feature-matching ambiguity, leading to a more complete scene recovery from imagery data. However, state-of-the-art architectures are not designed for indoor environments with abundant weakly textured or textureless objects. This paper proposes AttentionSPP-PatchmatchNet, a deep learning-based MVS algorithm designed for indoor 3D reconstruction. The algorithm integrates multi-scale feature sampling to produce global-context-aware feature maps and recalibrates the weights of essential features to tackle the challenges posed by indoor environments. A new dataset designed exclusively for indoor environments is presented to verify the performance of the proposed network. Experimental results show that AttentionSPP-PatchmatchNet outperforms state-of-the-art algorithms, with relative improvements of 132.87% and 163.55% at the 10 and 2 mm thresholds, respectively, making it suitable for accurate and complete indoor 3D reconstruction.