Mixed Reality User Interface for a Hybrid Operation Room
| Published in | IEEE International Symposium on Mixed and Augmented Reality Workshops (Online), pp. 833 - 838 |
|---|---|
| Main Authors | , , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 01.10.2022 |
| ISSN | 2771-1110 |
| DOI | 10.1109/ISMAR-Adjunct57072.2022.00180 |
| Summary: | Simultaneous use of multiple complex devices for imaging and intervention in a hybrid operating room creates new challenges for the medical staff, such as avoiding collisions between the different systems. Augmented reality offers a potential solution to this challenge: by visually overlaying the real devices with their digital twins, users can see the anticipated movements of the medical devices in a head-mounted display. With this information, they can identify potential collisions before moving a device. This paper focuses on the design of a novel mixed reality user interface for a hybrid operating room that enables users to control multiple devices virtually. Two versions of the mixed reality user interface with different user input approaches are compared in a within-subject study with respect to their usability and their suitability for steering medical devices virtually. Users completed the study tasks significantly faster with the press-and-hold button input module than with a virtual joystick. Additionally, the user experience was rated higher for the press-and-hold button. |
|---|---|