PowerGest: Self-Powered Gesture Recognition for Command Input and Robotic Manipulation

Bibliographic Details
Published in: Proceedings - International Conference on Parallel and Distributed Systems, pp. 528-535
Main Authors: Li, Jiarong; Xu, Qinghao; Xu, Zhancong; Ge, Changshuo; Ruan, Liguang; Liang, Xiaojun; Ding, Wenbo; Gui, Weihua; Zhang, Xiao-Ping
Format: Conference Proceeding
Language: English
Published: IEEE, 10.10.2024
ISSN: 2690-5965
DOI: 10.1109/ICPADS63350.2024.00075

Summary: As human-computer interaction (HCI) advances, gesture recognition has emerged as a transformative input technology. Traditional methods, often camera- or glove-based, are restricted by environmental conditions and user-specific demands, highlighting the need for more universal, non-intrusive, and sustainable solutions. Addressing this, we present PowerGest, a self-powered gesture recognition system based on a solar cell array. The system leverages the dual functionality of solar cells, energy harvesting and gesture sensing, to provide an alternative to conventional methods. It integrates a custom low-power data acquisition chip with a wireless transmission module and a user-friendly interface. PowerGest employs a series of signal processing methods and several machine learning algorithms, achieving over 97% accuracy on both numeric-input and activity-control gesture recognition tasks. With broad applications in robotic control, text input, and more, PowerGest contributes to a more sustainable and intuitive HCI experience. Project demo: https://drive.google.com/drive/folders/10KEul8PAfvTUomi0JvZ8u411JyScCXQI?usp=sharing.
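The abstract does not disclose the exact signal processing or classifier used, only that solar-cell voltage signals are processed and fed to machine learning models. As a minimal illustrative sketch of that idea, the toy pipeline below extracts simple time-domain features from synthetic single-channel solar-cell voltage windows (each gesture shades the cell with a different phase pattern) and trains a random forest classifier; the feature set, the classifier choice, and the synthetic signal model are all assumptions, not the authors' method.

```python
# Illustrative sketch only: gesture classification from solar-cell voltage
# windows. Signal model, features, and classifier are assumptions; the paper
# does not specify its pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def synth_window(gesture: int, n_samples: int = 128) -> np.ndarray:
    """Synthetic voltage trace: each gesture shades the cell at a
    different phase offset, plus measurement noise."""
    t = np.linspace(0.0, 1.0, n_samples)
    base = np.sin(2.0 * np.pi * (t - 0.1 * gesture))
    return base + 0.05 * rng.standard_normal(n_samples)

def features(window: np.ndarray) -> np.ndarray:
    # Simple time-domain features: mean, std, min, max, and the
    # normalized position of the voltage dip.
    return np.array([window.mean(), window.std(),
                     window.min(), window.max(),
                     np.argmin(window) / window.size])

# Build a small labeled dataset for three hypothetical gestures.
X = np.array([features(synth_window(g)) for g in range(3) for _ in range(60)])
y = np.repeat(np.arange(3), 60)
Xtr, Xte, ytr, yte = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(Xtr, ytr)
acc = clf.score(Xte, yte)
print(f"held-out accuracy: {acc:.2f}")
```

A real deployment would use multi-channel windows from the cell array and more discriminative features (e.g., inter-channel timing), but the window-features-classifier structure is the standard shape of such a pipeline.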