Reconfigurable Sensors With Shape-Shifting Robots
| Published in | IEEE Sensors Journal, Vol. 25, No. 20, pp. 38754-38768 |
|---|---|
| Main Authors | , , , , |
| Format | Journal Article |
| Language | English |
| Published | New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 15 October 2025 |
| ISSN | 1530-437X; 1558-1748 |
| DOI | 10.1109/JSEN.2025.3605064 |
| Summary: | The integration of multiple sensors plays a crucial role in applications such as autonomous navigation, object detection, and perception. Achieving accurate and meaningful results requires a robust calibration approach, particularly for the intrinsic and extrinsic calibration of the sensors, which typically assumes that the sensors are fixed relative to each other. However, the emergence of reconfigurable robots capable of altering their morphology presents a unique challenge for sensor calibration. Unlike conventional setups, where sensors remain stationary after calibration, the dynamic nature of reconfigurable robots means that sensor positions change continually. This necessitates continuous adjustment of the calibration to accommodate these positional variations and to preserve the system's operational integrity and accuracy. Failing to do so can lead to issues such as the hallucination effect, impairing the robot's operational efficacy. This article proposes an adaptive transformation approach that addresses this challenge by tracking changes in the relative pose of the sensors in real time. The efficacy of the proposed scheme is validated through simulation and real-world experiments, which show a reduction in sensor hallucination while preserving the robustness of the robot's perception capabilities. Results demonstrating the effectiveness of the adaptive transformation approach are presented and discussed for two distinct cases. Importantly, the proposed framework is generic, making it applicable to a wide range of multisensor calibration scenarios in which each sensor is subject to changes in relative pose. The proposed algorithm outperforms the state-of-the-art method by a factor of 31, achieving a root mean square error (RMSE) of just 2 ms. |
|---|---|
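The abstract describes keeping sensor extrinsics up to date as a reconfigurable robot changes shape. The sketch below illustrates that general idea only; it is not the paper's adaptive transformation algorithm. It recomputes a camera-to-LiDAR extrinsic from the current angle of a single revolute reconfiguration joint, and all frame names, mounting offsets, and the one-joint kinematic model are illustrative assumptions.

```python
import numpy as np

def rot_z(theta):
    """Homogeneous 4x4 rotation about the z-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

def translation(x, y, z):
    """Homogeneous 4x4 pure translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Static mounting offsets from a one-time extrinsic calibration
# (hypothetical values, for illustration only).
T_base_to_joint = translation(0.30, 0.0, 0.10)    # robot base -> reconfiguration joint
T_joint_to_lidar = translation(0.05, 0.0, 0.15)   # moving link -> LiDAR mount
T_base_to_camera = translation(0.20, 0.0, 0.40)   # robot base -> camera on a fixed link

def camera_to_lidar(joint_angle_rad):
    """Recompute the camera->LiDAR extrinsic for the current joint angle.

    When the robot changes morphology, the joint angle changes and so does
    the relative pose between the two sensors; refreshing this transform
    keeps downstream sensor fusion consistent.
    """
    T_base_to_lidar = T_base_to_joint @ rot_z(joint_angle_rad) @ T_joint_to_lidar
    return np.linalg.inv(T_base_to_camera) @ T_base_to_lidar

if __name__ == "__main__":
    for angle_deg in (0.0, 45.0, 90.0):  # e.g., joint states streamed from encoders
        T = camera_to_lidar(np.deg2rad(angle_deg))
        print(f"joint = {angle_deg:5.1f} deg, camera->LiDAR translation = {T[:3, 3].round(3)}")
```

In a full system, the joint state would be streamed continuously and the refreshed transform republished to the perception stack (e.g., as a transform frame in ROS) at every reconfiguration step, which is what keeps fused point clouds and images aligned and avoids the hallucination effect the abstract refers to.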