Robot-assisted ultrasound probe calibration for image-guided interventions

Bibliographic Details
Published in: International journal for computer assisted radiology and surgery, Vol. 20, No. 5, pp. 859–868
Main Authors: Paralikar, Atharva; Mantripragada, Pavan; Nguyen, Trong; Arjoune, Youness; Shekhar, Raj; Monfaredi, Reza
Format: Journal Article
Language: English
Published: Cham, Springer International Publishing (Springer Nature B.V.), 01.05.2025
ISSN: 1861-6410, 1861-6429
DOI: 10.1007/s11548-025-03347-8


Summary:
Background: Trackable ultrasound probes facilitate ultrasound-guided procedures, allowing real-time fusion of augmented ultrasound images and live video streams. This integration helps surgeons accurately locate lesions within organs, and it can be achieved only through precise registration between the ultrasound probe and the ultrasound image. Currently, calibration and registration are often manual, labor-intensive, time-consuming, and suboptimal: a technologist manually moves a stylus through various poses within the ultrasound probe's imaging plane so that its tip can be detected in the ultrasound image. This paper addresses that challenge by proposing a novel automated calibration approach for trackable ultrasound probes.
Methods: We used a robotic manipulator (KUKA LBR iiwa 7) to execute the stylus movements, eliminating cumbersome manual positioning. We incorporated a 6-degree-of-freedom electromagnetic tracker into the ultrasound probe to enable real-time pose and orientation tracking. We also developed a feature detection algorithm that identifies in-plane stylus tip coordinates in recorded ultrasound feeds, enabling automatic selection of calibration correspondences.
Results: The proposed system performed comparably to manual ultrasound feature segmentation, yielding a mean re-projection error of 0.38 mm versus 0.34 mm for manual landmark selection. The image-plane reconstruction error was 0.80 deg with manual segmentation and 0.20 deg with automatic segmentation.
Conclusion: The proposed system allows fully automated calibration while matching the accuracy of state-of-the-art methods. It streamlines the use of a trackable US probe by simplifying recalibration after sterilization (when an externally attached electromagnetic tracker must be disassembled for cleaning and sterilization) and by supporting out-of-factory calibration of mass-produced US probes with embedded trackers.
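The calibration described in the summary amounts to solving for the rigid transform between stylus-tip positions detected in the ultrasound image and the same points reported by the tracker, then scoring the fit by mean re-projection error. The abstract does not specify the solver used; the sketch below assumes paired 3-D point correspondences and uses the standard least-squares Kabsch/Procrustes solution as an illustration, with the error metric quoted in the Results:

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Least-squares rigid transform (Kabsch algorithm) mapping src -> dst.
    src, dst: (N, 3) arrays of corresponding 3-D points."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the SVD solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

def mean_reprojection_error(src, dst, R, t):
    """Mean Euclidean distance between transformed src points and dst."""
    return np.linalg.norm((R @ src.T).T + t - dst, axis=1).mean()
```

With noise-free synthetic correspondences the recovered transform reproduces the ground truth and the error is numerically zero; with real tracker and image noise, the residual plays the role of the 0.34–0.38 mm figures reported above.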