Adoption of Convolutional Neural Network Algorithm Combined with Augmented Reality in Building Data Visualization and Intelligent Detection

Bibliographic Details
Published in: Complexity (New York, N.Y.), Vol. 2021, No. 1
Main Authors: Wei, Minghui; Tang, Jingjing; Tang, Haotian; Zhao, Rui; Gai, Xiaohui; Lin, Renying
Format: Journal Article
Language: English
Published: Hoboken: Hindawi; John Wiley & Sons, Inc.; Wiley, 2021
ISSN: 1076-2787, 1099-0526
DOI: 10.1155/2021/5161111

Summary: This study aims to improve the visualization of building data, strengthen intelligent detection capability, and address problems encountered in building data processing. Convolutional neural network (CNN) and augmented reality (AR) technologies are adopted, and a building visualization model based on them is proposed. The model's performance is confirmed through verification on public datasets: the CNN- and AR-based building target detection model shows clear advantages in algorithmic complexity and recognition accuracy, achieving 25% higher accuracy than the most recent comparable model. The model makes full use of mobile computing resources, avoids network latency and dependence, and meets the real-time requirements of data processing. It also achieves effective augmented reality navigation and interaction for buildings in outdoor scenes. In sum, this study offers a research approach for the identification, data processing, and intelligent detection of urban buildings.
Bibliography: ObjectType-Article-1; SourceType-Scholarly Journals-1; ObjectType-Feature-2
ISSN: 1076-2787, 1099-0526
DOI: 10.1155/2021/5161111