WearMoCap: multimodal pose tracking for ubiquitous robot control using a smartwatch

Bibliographic Details
Published in: Frontiers in Robotics and AI, Vol. 11, p. 1478016
Main Authors: Weigend, Fabian C.; Kumar, Neelesh; Aran, Oya; Ben Amor, Heni
Format: Journal Article
Language: English
Published: Frontiers Media S.A., Switzerland, 2024
ISSN: 2296-9144
DOI: 10.3389/frobt.2024.1478016


More Information
Summary: We present WearMoCap, an open-source library for tracking the human pose from smartwatch sensor data and leveraging pose predictions for ubiquitous robot control. WearMoCap operates in three modes: 1) a Watch Only mode, which uses a smartwatch alone; 2) a novel Upper Arm mode, which utilizes a smartphone strapped onto the upper arm; and 3) a Pocket mode, which determines body orientation from a smartphone in any pocket. We evaluate all modes on large-scale datasets consisting of recordings from up to 8 human subjects using a range of consumer-grade devices. Further, we discuss real-robot applications of underlying works and evaluate WearMoCap in handover and teleoperation tasks, resulting in performances that are within 2 cm of the accuracy of the gold-standard motion capture system. Our Upper Arm mode provides the most accurate wrist position estimates, with a Root Mean Squared prediction error of 6.79 cm. To enable evaluation of WearMoCap in more scenarios and investigation of strategies to mitigate sensor drift, we publish the WearMoCap system with thorough documentation as open source. The system is designed to foster future research in smartwatch-based motion capture for robotics applications where ubiquity matters. www.github.com/wearable-motion-capture
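The wrist-position accuracy above is reported as a Root Mean Squared prediction error. A minimal sketch of how such a metric is typically computed over 3D position estimates (the function name and the numeric values are illustrative, not taken from the paper's dataset or the WearMoCap API):

```python
import numpy as np

def rmse(pred, truth):
    """Root Mean Squared error over per-sample Euclidean position errors.

    pred, truth: (N, 3) arrays of 3D positions (e.g., wrist positions in cm).
    """
    err = np.linalg.norm(pred - truth, axis=1)  # per-sample Euclidean distance
    return float(np.sqrt(np.mean(err ** 2)))

# Illustrative toy data: two predicted vs. ground-truth wrist positions.
pred = np.array([[10.0, 0.0, 5.0], [12.0, 1.0, 4.0]])
truth = np.array([[11.0, 0.0, 5.0], [12.0, 0.0, 4.0]])
print(rmse(pred, truth))  # 1.0 (each sample is off by exactly 1 unit)
```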