WearMoCap: multimodal pose tracking for ubiquitous robot control using a smartwatch
| Published in | Frontiers in Robotics and AI, Vol. 11, p. 1478016 |
|---|---|
| Main Authors | Weigend, Fabian C.; Kumar, Neelesh; Aran, Oya; Ben Amor, Heni |
| Format | Journal Article |
| Language | English |
| Published | Switzerland: Frontiers Media S.A., 2024 |
| Subjects | drone control; human-robot interaction; IMU; motion capture; smartwatch; teleoperation; wearables |
| Online Access | https://doi.org/10.3389/frobt.2024.1478016 |
| ISSN | 2296-9144 |
| DOI | 10.3389/frobt.2024.1478016 |
| Abstract | We present WearMoCap, an open-source library for tracking the human pose from smartwatch sensor data and leveraging pose predictions for ubiquitous robot control. WearMoCap operates in three modes: 1) a Watch Only mode, which uses a smartwatch only; 2) a novel Upper Arm mode, which utilizes a smartphone strapped onto the upper arm; and 3) a Pocket mode, which determines body orientation from a smartphone in any pocket. We evaluate all modes on large-scale datasets consisting of recordings from up to 8 human subjects using a range of consumer-grade devices. Further, we discuss real-robot applications of the underlying works and evaluate WearMoCap in handover and teleoperation tasks, achieving accuracy within 2 cm of the gold-standard motion capture system. Our Upper Arm mode provides the most accurate wrist position estimates, with a root mean squared prediction error of 6.79 cm. To support evaluating WearMoCap in more scenarios and investigating strategies to mitigate sensor drift, we publish the WearMoCap system with thorough documentation as open source. The system is designed to foster future research in smartwatch-based motion capture for robotics applications where ubiquity matters. Code: www.github.com/wearable-motion-capture |
|---|---|
| Copyright | Copyright © 2025 Weigend, Kumar, Aran and Ben Amor. |
| Discipline | Engineering |
| EISSN | 2296-9144 |
| Keywords | smartwatch; human-robot interaction; IMU; motion capture; teleoperation; drone control; wearables |
| License | Copyright © 2025 Weigend, Kumar, Aran and Ben Amor (CC BY) |
| PMID | 39831285 |
| PublicationTitleAlternate | Front Robot AI |
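
The abstract reports a root mean squared (RMSE) wrist-position error of 6.79 cm for the Upper Arm mode. As context, below is a minimal sketch of how such a metric is commonly computed over 3-D position tracks; the arrays are synthetic placeholders, not the authors' data or code, and the per-frame Euclidean convention is an assumption (the paper may aggregate differently).

```python
import numpy as np

# Sketch of an RMSE metric over 3-D wrist positions, as cited in the
# abstract (6.79 cm for the Upper Arm mode). Synthetic placeholder data,
# not from the paper.
rng = np.random.default_rng(0)
truth = rng.normal(size=(1000, 3))                      # ground-truth wrist xyz, metres
pred = truth + rng.normal(scale=0.05, size=(1000, 3))   # noisy predictions

# Per-frame Euclidean distance between prediction and ground truth,
# then the root of the mean squared distance, reported in centimetres.
per_frame_err = np.linalg.norm(pred - truth, axis=1)
rmse_cm = float(np.sqrt(np.mean(per_frame_err**2))) * 100.0
print(f"wrist-position RMSE: {rmse_cm:.2f} cm")
```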