A Progressive Multi-Domain Adaptation Network With Reinforced Self-Constructed Graphs for Cross-Subject EEG-Based Emotion and Consciousness Recognition

Bibliographic Details
Published in: IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol. 33, pp. 3498-3510
Main Authors: Chen, Rongtao; Xie, Chuwen; Zhang, Jiahui; You, Qi; Pan, Jiahui
Format: Journal Article
Language: English
Published: United States, IEEE, 01.01.2025
ISSN: 1534-4320, 1558-0210
DOI: 10.1109/TNSRE.2025.3603190

Summary: Electroencephalogram (EEG)-based emotion recognition is a vital component in brain-computer interface applications. However, it faces two significant challenges: 1) extracting domain-invariant features while effectively preserving emotion-related information, and 2) aligning the joint probability distributions of data across different individuals. To address these challenges, we propose a progressive multi-domain adaptation network with reinforced self-constructed graphs. Specifically, we introduce EEG-CutMix to construct unlabeled mixed-domain data, facilitating the transition between source and target domains. Additionally, a reinforced self-constructed graphs module is employed to extract domain-invariant features. Finally, a progressive multi-domain adaptation framework is constructed to smoothly align the data distributions across individuals. Experiments on cross-subject datasets demonstrate that our model achieves state-of-the-art performance on the SEED and SEED-IV datasets, with accuracies of 97.03% ± 1.65% and 88.18% ± 4.55%, respectively. Furthermore, tests on a self-recorded dataset, comprising ten healthy subjects and twelve patients with disorders of consciousness (DOC), show that our model achieves a mean accuracy of 86.65% ± 2.28% in healthy subjects. Notably, it successfully applies to DOC patients, with four subjects achieving emotion recognition accuracy exceeding 70%. These results validate the effectiveness of our model in EEG emotion recognition and highlight its potential for assessing consciousness levels in DOC patients. The source code for the proposed model is available on GitHub (seizeall/mycode).
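The summary describes EEG-CutMix as a way to splice source- and target-domain trials into unlabeled mixed-domain samples. The record does not give the exact formulation, so the following is only a minimal sketch of a CutMix-style mixing step under assumed conventions (channels × timesteps arrays, a Beta-distributed mixing ratio as in the original CutMix); the function name `eeg_cutmix` and all shapes are illustrative, not the authors' implementation.

```python
# Hypothetical sketch of an EEG-CutMix-style augmentation: splice a random
# temporal segment from a target-subject trial into a source-subject trial,
# producing a mixed-domain sample that sits between the two distributions.
import numpy as np

def eeg_cutmix(source, target, alpha=1.0, rng=None):
    """source, target: arrays of shape (channels, timesteps)."""
    rng = np.random.default_rng(rng)
    lam = rng.beta(alpha, alpha)            # mixing ratio, as in CutMix
    n_channels, n_steps = source.shape
    cut_len = int(n_steps * (1.0 - lam))    # length of the spliced segment
    start = rng.integers(0, n_steps - cut_len + 1)
    mixed = source.copy()
    mixed[:, start:start + cut_len] = target[:, start:start + cut_len]
    return mixed

# Toy usage with SEED-like dimensions (62 channels are an assumption here).
src = np.random.randn(62, 200)
tgt = np.random.randn(62, 200)
mixed = eeg_cutmix(src, tgt, rng=0)
print(mixed.shape)
```

In this sketch the mixed sample keeps the source trial outside the spliced window and the target trial inside it, which is one plausible way to obtain intermediate-domain data for the progressive alignment the summary mentions.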