Enhanced dynamic hand gesture recognition for finger disabilities using deep learning and an optimized Otsu threshold method

Bibliographic Details
Published in: Engineering Research Express, Vol. 7, No. 1, pp. 15228-15247
Main Authors: Kadhim, Malik Kareem; Der, Chen Soong; Chen, Chai Phing
Format: Journal Article
Language: English
Published: IOP Publishing, 31.03.2025
ISSN: 2631-8695
DOI: 10.1088/2631-8695/ada72d


Summary: Hand gestures serve as a powerful means of communication, capable of conveying extensive information across various public health domains, including medicine and education. Hand gesture recognition uses mathematical algorithms to identify human gestures and finds applications in areas such as communication for the deaf, human-computer interaction, intelligent driving, and virtual reality. This study introduces a robust method for recognizing dynamic hand gestures, particularly for individuals with finger disabilities. The approach begins by segmenting hand gestures from complex backgrounds using an advanced Otsu segmentation algorithm, while also integrating motion data from RGB video sequences. Hand gestures are transformed into texture and contour features, which serve as input to a hybrid model that combines a convolutional neural network (CNN) with a recurrent neural network (RNN): Inception-v3 performs feature extraction, complemented by an LSTM layer for classification. The study focuses on recognizing six dynamic gestures, with particular emphasis on 'scroll right' and 'scroll down' due to their high recognition accuracy. The model demonstrated an average precision of 84.34% across all gestures, achieving 87.57% for gestures involving finger impairments. These results highlight the model's effectiveness in practical applications for dynamic hand gesture recognition.
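The Otsu thresholding step described in the summary can be sketched in plain NumPy. This is a minimal illustration of the classic algorithm (maximize between-class variance over all candidate thresholds), not the authors' optimized variant; the image here is synthetic, standing in for a grayscale video frame with a bright hand region on a darker background.

```python
import numpy as np

def otsu_threshold(gray):
    """Return the intensity threshold maximizing between-class variance (classic Otsu)."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    prob = hist / gray.size                      # per-intensity probabilities
    cum_prob = np.cumsum(prob)                   # w0(t): background weight
    cum_mean = np.cumsum(prob * np.arange(256))  # running first moment
    global_mean = cum_mean[-1]
    best_t, best_var = 0, 0.0
    for t in range(256):
        w0, w1 = cum_prob[t], 1.0 - cum_prob[t]
        if w0 == 0 or w1 == 0:
            continue  # all pixels fall on one side; no valid split
        mu0 = cum_mean[t] / w0                   # background mean
        mu1 = (global_mean - cum_mean[t]) / w1   # foreground mean
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Synthetic frame: dim background (30), bright "hand" patch (200)
img = np.full((64, 64), 30, dtype=np.uint8)
img[16:48, 16:48] = 200
t = otsu_threshold(img)
mask = img > t  # binary mask isolating the hand region
```

In the paper's pipeline, a mask like this would feed the texture/contour feature stage before the Inception-v3 + LSTM classifier; libraries such as OpenCV provide the same operation built in (`cv2.threshold` with the `THRESH_OTSU` flag).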
Bibliography: ERX-106519.R2