Two-hand on-skin gesture recognition: a dataset and classification network for enhanced human–computer interaction

Bibliographic Details
Published in: The Visual Computer, Vol. 41, No. 13, pp. 11641-11656
Main Authors: Keskin, Ege; Özcan, Oğuzhan; Yemez, Yücel
Format: Journal Article
Language: English
Published: Heidelberg: Springer Nature B.V., 01.10.2025
ISSN: 0178-2789
eISSN: 1432-2315
DOI: 10.1007/s00371-025-04125-y

Summary: Gestural interaction is an increasingly utilized method for controlling devices and environments. Despite the growing research on gesture recognition, datasets tailored specifically for two-hand on-skin interaction remain scarce. This paper presents the two-hand on-skin (THOS) dataset, comprising 3096 labeled samples and 92,880 frames from three subjects across nine gesture classes. The dataset is based on hand-specific on-skin (HSoS) gestures, which involve direct contact between both hands. We also introduce THOSnet, a hybrid model leveraging transformer decoders and bi-directional long short-term memory (BiLSTM) for gesture classification. Evaluations show that THOSnet outperforms standalone transformer encoders and BiLSTMs, achieving an average test accuracy of 79.31% on the THOS dataset. Our contributions aim to bridge the gap between dynamic gesture recognition and on-skin interaction research, offering valuable resources for developing and testing advanced gesture recognition models. By open-sourcing the dataset and code through https://github.com/ege621/thos-dataset, we facilitate further research and reproducibility in this area.
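
The summary describes THOSnet only at a high level: transformer layers combined with a bi-directional LSTM over per-frame hand features. The sketch below shows what such a hybrid sequence classifier can look like in PyTorch; the feature dimension, layer sizes, mean-pooling over time, and the use of a standard transformer encoder in place of the paper's decoder blocks are illustrative assumptions, not the authors' implementation (see the linked repository for that).

import torch
import torch.nn as nn

class HybridGestureClassifier(nn.Module):
    """Toy hybrid transformer + BiLSTM classifier for gesture sequences."""
    def __init__(self, feat_dim=63, d_model=128, n_heads=4, n_layers=2,
                 lstm_hidden=128, n_classes=9):
        super().__init__()
        self.input_proj = nn.Linear(feat_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.bilstm = nn.LSTM(d_model, lstm_hidden, batch_first=True,
                              bidirectional=True)
        self.classifier = nn.Linear(2 * lstm_hidden, n_classes)

    def forward(self, x):                          # x: (batch, frames, features)
        h = self.transformer(self.input_proj(x))   # temporal self-attention
        h, _ = self.bilstm(h)                       # bidirectional recurrent pass
        return self.classifier(h.mean(dim=1))       # pool over time, then classify

# Example: a batch of 4 clips, 30 frames each, 63 hand features per frame
logits = HybridGestureClassifier()(torch.randn(4, 30, 63))
print(logits.shape)   # torch.Size([4, 9]) -> one score per gesture class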