A novel neuroevolution model for EMG-based hand gesture classification

Bibliographic Details
Published in: Neural Computing & Applications, Vol. 35, No. 14, pp. 10621-10635
Main Authors: Dweiri, Yazan; Hajjar, Yumna; Hatahet, Ola
Format: Journal Article
Language: English
Published: London: Springer London, 01.05.2023; Springer Nature B.V.
ISSN: 0941-0643; 1433-3058
DOI: 10.1007/s00521-023-08253-1


More Information
Summary: Classification of hand gestures from multichannel surface electromyography (sEMG) has been widely explored for the control of robotic prostheses. Several deep-learning algorithms have been utilized for this task with diverse levels of performance. A special type of genetic algorithm, Neuroevolution of Augmenting Topologies (NEAT), has favorable properties to be exploited for this task, especially its minimalistic initial structure and its joint optimization of the topology and weights of the evolved network. In this paper, we proposed a novel NEAT-based model that coherently evolves neural networks with Gated Recurrent Units and employed it for sEMG-based hand gesture classification. The algorithm was assessed in classifying 9 gestures from eight subjects (NinaPro Database 2) with eight independently trained networks operating on 150 ms non-overlapping decision windows. The trained networks yielded a mean classification accuracy of 88.76% (3.85%). Separate classification of gesture transitions yielded an overall accuracy of 84% and a transition-class recall of 93.3%. The proposed algorithm was shown to utilize a small data set to evolve a classifier capable of expanding the number of independent control signals for real-time myoelectric control of a powered upper-limb prosthesis, translating the user's intent into intuitive control of a prosthesis with a high number of degrees of freedom.
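As a concrete illustration of the decision-window scheme mentioned in the summary, the sketch below (not taken from the paper) segments a multichannel sEMG recording into 150 ms non-overlapping windows and assigns each window a gesture label. The 2 kHz sampling rate matches NinaPro Database 2; the array names and the majority-vote labeling rule are assumptions for illustration only.

    import numpy as np

    def segment_emg(emg, labels, fs=2000, window_ms=150):
        # emg:    sEMG samples, shape (n_samples, n_channels)
        # labels: per-sample gesture labels (non-negative ints), shape (n_samples,)
        # fs:     sampling rate in Hz (2 kHz for NinaPro DB2)
        win = int(fs * window_ms / 1000)        # samples per 150 ms window (300 at 2 kHz)
        n_windows = emg.shape[0] // win         # drop any trailing partial window
        windows, window_labels = [], []
        for i in range(n_windows):
            seg = emg[i * win:(i + 1) * win]    # one (win, n_channels) decision window
            lab = labels[i * win:(i + 1) * win]
            windows.append(seg)
            # Assumed labeling rule: majority gesture within the window.
            window_labels.append(np.bincount(lab).argmax())
        return np.stack(windows), np.array(window_labels)

Each returned window would then be presented to a trained classifier (in the paper, one of the evolved GRU networks) to produce a per-window gesture decision.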