Consistent and Accurate Ranging Estimation of Tiny Objects for Mobile Uncrewed Vehicles
| Published in | IEEE Robotics and Automation Letters, Vol. 10, no. 8, pp. 8356-8363 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | Piscataway: The Institute of Electrical and Electronics Engineers (IEEE), 01.08.2025 |
| ISSN | 2377-3766 |
| DOI | 10.1109/LRA.2025.3585350 |
| Summary: | In many industries, intelligent mobile uncrewed vehicles (MUVs) are transforming automation and enhancing business competitiveness. A key technology enabling precise task execution is monocular camera-based ranging, which estimates distances to relevant targets. However, existing methods often evaluate targets in isolation, lacking spatial and temporal consistency across frames. To address this, we propose the Consistent and Accurate Ranging Estimation (CARE) framework, which delivers reliable measurements by capturing consistent spatial relationships between targets. CARE integrates two key components: 2D-sense features and a hierarchical Consistent Ranging Network (CRN). The 2D-sense features fuse object detection with inferred depth to provide rich semantic features that enhance input quality, even for tiny objects. The CRN further ensures accurate and consistent ranging through customized network architecture and loss function design. Real-world experiments using data on tomato and rose flowers to simulate tiny-flower pollination demonstrate the effectiveness of our proposed method. The CARE framework outperforms existing approaches in both accuracy and consistency. |
|---|---|
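The summary describes fusing 2D object detections with an inferred monocular depth map to obtain per-object range estimates ("2D-sense features"). The paper's actual implementation is not reproduced in this record; the Python sketch below only illustrates that general idea under stated assumptions. The box format, the median-depth heuristic, and the `estimate_ranges` helper are hypothetical choices for illustration, not the authors' CARE method.

```python
# Illustrative sketch only: one plausible way to combine 2D detections with an
# inferred depth map into per-object range estimates. All names and heuristics
# here are assumptions, not the authors' implementation.
import numpy as np

def estimate_ranges(depth_map: np.ndarray, boxes: np.ndarray) -> np.ndarray:
    """Return one distance estimate per detected object.

    depth_map : (H, W) array of per-pixel metric depth from a monocular
                depth network (assumed input).
    boxes     : (N, 4) array of detections as [x1, y1, x2, y2] pixel
                coordinates (assumed format).
    """
    h, w = depth_map.shape
    ranges = np.empty(len(boxes))
    for i, (x1, y1, x2, y2) in enumerate(boxes.astype(int)):
        # Clamp the box to the image and crop the depth values inside it.
        x1, x2 = np.clip([x1, x2], 0, w - 1)
        y1, y2 = np.clip([y1, y2], 0, h - 1)
        patch = depth_map[y1:y2 + 1, x1:x2 + 1]
        # The median is a simple robust statistic against background pixels
        # that leak into the bounding box of a tiny object.
        ranges[i] = np.median(patch)
    return ranges

if __name__ == "__main__":
    # Toy example: a 100x100 depth map with two synthetic "objects".
    depth = np.full((100, 100), 5.0)      # background at 5 m
    depth[20:30, 20:30] = 1.2             # near object
    depth[60:80, 50:70] = 3.4             # farther object
    boxes = np.array([[20, 20, 29, 29], [50, 60, 69, 79]], dtype=float)
    print(estimate_ranges(depth, boxes))  # -> [1.2, 3.4]
```

In this sketch each detection is reduced to a single robust depth statistic; the CARE framework described in the summary goes further by enforcing spatial and temporal consistency across targets and frames through its Consistent Ranging Network and loss design.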