Design of computer deep image processing method integrating dual branch multi-scale features and completing network
| Published in | Journal of computational methods in sciences and engineering |
|---|---|
| Main Authors | , , , |
| Format | Journal Article |
| Language | English |
| Published | 19.07.2025 |
| ISSN | 1472-7978; 1875-8983 |
| DOI | 10.1177/14727978251361861 |
Summary: To address the difficulty traditional 2D image processing methods have in segmenting crops from backgrounds in complex natural environments, and their inability to accurately obtain 3D phenotype information, this study proposes a 3D image processing technique that integrates depth information. First, a dual-branch multi-scale feature encoder was designed to process RGB images and depth images separately, with its feature extraction and fusion capabilities enhanced by an improved residual module. A deep image completion network, a deep image super-resolution network, and an image edge detection algorithm were then built on this encoder. Finally, the effectiveness of the proposed method was validated on multiple datasets, including BSDS500 and KITTI. In the ablation experiment, the dual-branch multi-scale feature encoder performed best in accuracy, recall, F1 score, and mean squared error, with values of 0.98, 0.82, 0.83, and 0.12, respectively. When completing 200 incomplete image samples, the signal-to-noise ratio of the deep image completion network was mainly distributed between 6 dB and 8 dB. These results show that the proposed method performs well in computer deep image processing, providing strong support for fields such as precision agriculture.
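The record describes the encoder only at a high level: two branches process RGB and depth images separately, and features are extracted at multiple scales and fused with the help of a residual module. As a rough illustration only, the sketch below shows one way such a dual-branch multi-scale encoder could be organized in PyTorch. The class names, channel widths, plain residual block, and 1x1 fusion convolutions are all assumptions for the sake of the example; the paper's improved residual module and exact fusion strategy are not detailed in this record.

```python
# Minimal sketch (not the paper's implementation) of a dual-branch multi-scale
# encoder for paired RGB and depth inputs. A plain residual block stands in
# for the improved residual module mentioned in the abstract.
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Standard residual block: two 3x3 convolutions with a skip connection."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.BatchNorm2d(channels),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.body(x) + x)


class Branch(nn.Module):
    """One encoder branch producing features at three scales (1/1, 1/2, 1/4)."""
    def __init__(self, in_ch, base_ch=32):
        super().__init__()
        self.stem = nn.Conv2d(in_ch, base_ch, 3, padding=1)
        self.stage1 = ResidualBlock(base_ch)
        self.down1 = nn.Conv2d(base_ch, base_ch * 2, 3, stride=2, padding=1)
        self.stage2 = ResidualBlock(base_ch * 2)
        self.down2 = nn.Conv2d(base_ch * 2, base_ch * 4, 3, stride=2, padding=1)
        self.stage3 = ResidualBlock(base_ch * 4)

    def forward(self, x):
        f1 = self.stage1(self.stem(x))
        f2 = self.stage2(self.down1(f1))
        f3 = self.stage3(self.down2(f2))
        return f1, f2, f3


class DualBranchEncoder(nn.Module):
    """RGB and depth are encoded by separate branches, then fused scale by scale."""
    def __init__(self, base_ch=32):
        super().__init__()
        self.rgb_branch = Branch(3, base_ch)
        self.depth_branch = Branch(1, base_ch)
        # 1x1 convolutions fuse concatenated RGB/depth features at each scale.
        self.fuse = nn.ModuleList(
            [nn.Conv2d(2 * base_ch * s, base_ch * s, 1) for s in (1, 2, 4)]
        )

    def forward(self, rgb, depth):
        rgb_feats = self.rgb_branch(rgb)
        depth_feats = self.depth_branch(depth)
        return [
            fuse(torch.cat([r, d], dim=1))
            for fuse, r, d in zip(self.fuse, rgb_feats, depth_feats)
        ]


if __name__ == "__main__":
    enc = DualBranchEncoder()
    rgb = torch.randn(1, 3, 128, 128)    # RGB image
    depth = torch.randn(1, 1, 128, 128)  # aligned depth map
    for feat in enc(rgb, depth):
        print(feat.shape)  # fused features at 128x128, 64x64, 32x32
```

Keeping separate weights per branch lets color/texture cues and depth geometry be encoded independently before scale-wise fusion, which is the general idea the abstract conveys; downstream networks (completion, super-resolution, edge detection) would consume the fused multi-scale features.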