Neural Decoding of Chinese Sign Language With Machine Learning for Brain-Computer Interfaces

Bibliographic Details
Published in: IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol. 29, pp. 2721–2732
Main Authors: Wang, Pengpai; Zhou, Yueying; Li, Zhongnian; Huang, Shuo; Zhang, Daoqiang
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2021
ISSN: 1534-4320, 1558-0210
DOI: 10.1109/TNSRE.2021.3137340


More Information
Summary: Limb motion decoding is an important part of brain-computer interface (BCI) research. Among limb motions, sign language not only carries rich semantic information and abundant maneuverable actions but also provides many distinct executable commands. However, most researchers focus on decoding gross motor skills, such as ordinary motor imagery or simple upper limb movements. Here we explore the neural features and decoding of Chinese sign language from electroencephalography (EEG) signals during motor imagery and motor execution. Twenty subjects were instructed to perform movement execution and movement imagery based on Chinese sign language. Seven classifiers were employed to classify selected features of the sign language EEG. L1 regularization was used to learn and select the most informative features from the mean, power spectral density, sample entropy, and brain network connectivity. The best average classification accuracy was 89.90% for executed sign language (83.40% for imagined sign language). These results demonstrate the feasibility of decoding between different sign language movements. Source localization reveals that the neural circuits involved in sign language are related to the visual contact area and the pre-movement area. The experimental evaluation shows that the proposed sign-language-based decoding strategy obtains outstanding classification results, providing a reference for subsequent research on limb decoding based on sign language.
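The L1-regularization step the summary describes can be sketched in a few lines. The following is a minimal, self-contained illustration (not the authors' code): synthetic data stands in for the per-trial EEG features (in the paper these would be mean, power spectral density, sample entropy, and connectivity values), an L1-penalized logistic regression is fit with ISTA (a gradient step on the logistic loss followed by soft-thresholding, which is the proximal operator of the L1 penalty), and the nonzero weights define the selected feature subset passed to a downstream classifier. All names, dimensions, and hyperparameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for per-trial EEG features; in the real pipeline each
# feature family (mean, PSD, sample entropy, connectivity) contributes columns.
n_trials, n_features = 200, 40
X = rng.normal(size=(n_trials, n_features))
w_true = np.zeros(n_features)
w_true[:5] = [2.0, -1.5, 1.0, -2.0, 1.5]  # only a few features are informative
y = (X @ w_true + 0.5 * rng.normal(size=n_trials) > 0).astype(float)

def l1_logistic(X, y, lam=0.05, lr=0.1, iters=2000):
    """L1-regularized logistic regression trained with ISTA:
    a gradient step on the logistic loss, then soft-thresholding."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))        # predicted probabilities
        grad = X.T @ (p - y) / n                   # logistic-loss gradient
        w = w - lr * grad
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)  # L1 prox
    return w

w = l1_logistic(X, y)
selected = np.flatnonzero(np.abs(w) > 1e-6)        # surviving features
print("selected features:", selected)

# Downstream classification on the sparse subset (training accuracy only,
# just to illustrate the flow; the paper compares seven classifiers).
pred = (X[:, selected] @ w[selected] > 0).astype(float)
print("training accuracy:", (pred == y).mean())
```

Because the soft-thresholding step drives uninformative weights to exactly zero, feature selection falls out of the optimization itself rather than requiring a separate ranking pass.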