A transfer learning framework based on motor imagery rehabilitation for stroke


Bibliographic Details
Published in: Scientific Reports, Vol. 11, No. 1, pp. 19783-9
Main Authors: Xu, Fangzhou; Miao, Yunjing; Sun, Yanan; Guo, Dongju; Xu, Jiali; Wang, Yuandong; Li, Jincheng; Li, Han; Dong, Gege; Rong, Fenqi; Leng, Jiancai; Zhang, Yang
Format: Journal Article
Language: English
Published: London, Nature Publishing Group UK, 05.10.2021
ISSN: 2045-2322
DOI: 10.1038/s41598-021-99114-1


More Information
Summary: Deep learning networks have been successfully applied to transfer learning, so that models can be adapted from a source domain to different target domains. This study uses multiple convolutional neural networks to decode the electroencephalogram (EEG) of stroke patients and design an effective motor imagery (MI) brain-computer interface (BCI) system. The study introduces 'fine-tuning' to transfer model parameters and reduce training time. The performance of the proposed framework is evaluated by the models' ability to perform two-class MI recognition. The results show that the best framework is the combination of EEGNet and the 'fine-tuned' transferred model. The average classification accuracy of the proposed model across 11 subjects is 66.36%, and its algorithmic complexity is much lower than that of the other models. This good performance indicates that the EEGNet model has great potential for BCI-based MI stroke rehabilitation. It also demonstrates the efficiency of transfer learning for improving the performance of EEG-based stroke rehabilitation in a BCI system.
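The 'fine-tuning' transfer strategy described in the summary can be sketched as follows: a feature extractor trained on source-domain data is frozen, and only a small classifier head is re-trained on target-subject data. This is a minimal illustrative sketch in NumPy; the extractor, data shapes, and logistic head are assumptions for demonstration, not the paper's actual EEGNet architecture or dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Pretrained" feature extractor: a fixed projection standing in for
# EEGNet's convolutional layers trained on source-domain subjects.
# All shapes and data here are illustrative assumptions, not the paper's.
W_frozen = 0.3 * rng.standard_normal((8, 8))

def extract(X):
    # Frozen forward pass: these weights are never updated during fine-tuning.
    return np.tanh(X @ W_frozen)

# Toy target-subject data for a two-class MI problem (labels 0/1).
X = rng.standard_normal((200, 8))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

# Fine-tuning step: retrain only the small classifier head (w, b) by
# gradient descent on the logistic loss; W_frozen stays fixed throughout.
Z = extract(X)
w, b, lr = np.zeros(8), 0.0, 0.5
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(Z @ w + b)))  # sigmoid predictions
    w -= lr * (Z.T @ (p - y)) / len(y)
    b -= lr * np.mean(p - y)

acc = float(np.mean(((Z @ w + b) > 0) == (y > 0.5)))
print(f"head-only fine-tune training accuracy: {acc:.2f}")
```

Because only the head's few parameters are updated, this kind of fine-tuning converges in far fewer steps than training the full network from scratch, which is the training-time reduction the summary refers to.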