Structured Light-Based 3D Reconstruction System for Plants

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 15, No. 8, pp. 18587-18612
Main Authors: Nguyen, Thuy; Slaughter, David; Max, Nelson; Maloof, Julin; Sinha, Neelima
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 29.07.2015
ISSN: 1424-8220
DOI: 10.3390/s150818587

Summary: Camera-based 3D reconstruction of physical objects is one of the most popular computer vision trends of recent years. Many systems have been built to model different real-world subjects, but a completely robust system for plants is still lacking. This paper presents a full 3D reconstruction system that incorporates both hardware structures (including the proposed structured light system to enhance textures on object surfaces) and software algorithms (including the proposed 3D point cloud registration and plant feature measurement). The paper demonstrates the ability to produce 3D models of whole plants from multiple pairs of stereo images taken at different viewing angles, without the need to destructively cut away any parts of a plant. The ability to accurately predict phenotyping features, such as the number of leaves, plant height, leaf size and internode distances, is also demonstrated. Experimental results show that, for plants having a range of leaf sizes and a distance between leaves appropriate for the hardware design, the algorithms successfully predict phenotyping features in the target crops, with a recall of 0.97 and a precision of 0.89 for leaf detection, and less than a 13-mm error for plant size, leaf size and internode distance.
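
The registration step summarized above aligns point clouds reconstructed from stereo pairs captured at different viewing angles. As an illustration only (the authors' actual pipeline is not reproduced here), the sketch below shows the standard Kabsch/SVD solution for the rigid transform between two clouds with known point correspondences, a core building block of such multi-view registration. The function name, test data, and NumPy dependency are all assumptions for this sketch.

```python
# Illustrative sketch: least-squares rigid alignment (Kabsch/SVD) of two
# corresponding 3D point sets, the basic operation underlying pairwise
# point cloud registration. Not the paper's implementation.
import numpy as np

def rigid_transform(src, dst):
    """Return (R, t) minimizing ||(R @ src.T).T + t - dst|| for Nx3 arrays."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)   # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

# Usage: recover a known 30-degree rotation about the vertical axis.
rng = np.random.default_rng(0)
cloud = rng.standard_normal((100, 3))
theta = np.radians(30)
R_true = np.array([[np.cos(theta), 0, np.sin(theta)],
                   [0,             1, 0            ],
                   [-np.sin(theta), 0, np.cos(theta)]])
moved = cloud @ R_true.T + np.array([0.0, 0.1, 0.0])
R_est, t_est = rigid_transform(cloud, moved)
assert np.allclose(R_est, R_true, atol=1e-6)
```

In practice, correspondences between views are not known in advance, so methods such as ICP alternate between matching nearest points and re-solving this same rigid transform until the alignment converges.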