EEG Emotion Recognition Using AttGraph: A Multi-Dimensional Attention-Based Dynamic Graph Convolutional Network

Bibliographic Details
Published in: Brain Sciences, Vol. 15, No. 6, p. 615
Main Authors: Zhang, Shuai; Chu, Chengxi; Zhang, Xin; Zhang, Xiu
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 07.06.2025
ISSN: 2076-3425
DOI: 10.3390/brainsci15060615

Summary: Background/Objectives: Electroencephalogram (EEG) signals, which reflect brain activity, are widely used in emotion recognition. However, the variety of EEG features presents significant challenges in identifying key features, reducing redundancy, and simplifying the computational process. Methods: To address these challenges, this paper proposes a multi-dimensional attention-based dynamic graph convolutional neural network (AttGraph) model. The model examines the impact of different EEG features on emotion recognition by evaluating their sensitivity to emotional changes, providing richer and more accurate feature information. Results: Through the dynamic weighting of EEG features via a multi-dimensional attention convolution layer, the AttGraph method can precisely detect emotional changes and automatically select the most discriminative features for emotion recognition tasks. This approach significantly improves the model's recognition accuracy and robustness. Finally, subject-independent and subject-dependent experiments were conducted on two public datasets. Conclusions: Through comparisons and analyses with existing methods, the proposed AttGraph method demonstrated outstanding performance in emotion recognition tasks, with stronger generalization ability and adaptability.
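
The abstract describes two components: attention-based dynamic weighting of EEG features, followed by graph convolution over a learnable (dynamic) channel graph. The sketch below illustrates that general idea in PyTorch; the class name AttentionDynamicGCN, the tensor shapes, and the learnable-adjacency formulation are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumed, not the AttGraph authors' code) of an
# attention-weighted dynamic graph convolution for EEG features.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionDynamicGCN(nn.Module):
    """Re-weights EEG feature dimensions with a learned attention vector,
    then applies a graph convolution over a learnable channel adjacency."""

    def __init__(self, num_channels: int, in_features: int, out_features: int):
        super().__init__()
        # Feature-wise attention: scores each EEG feature dimension.
        self.feature_attn = nn.Linear(in_features, in_features)
        # Learnable adjacency over EEG channels (the "dynamic" graph).
        self.adj = nn.Parameter(
            torch.eye(num_channels) + 0.01 * torch.randn(num_channels, num_channels)
        )
        # Graph convolution projection on the feature dimension.
        self.gc_weight = nn.Linear(in_features, out_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_channels, in_features)
        # 1) Attention weights per feature, derived from channel-averaged features.
        attn = torch.sigmoid(self.feature_attn(x.mean(dim=1)))  # (batch, in_features)
        x = x * attn.unsqueeze(1)                               # re-weight features
        # 2) Symmetric, degree-normalized dynamic adjacency.
        a = F.relu(self.adj + self.adj.t())                     # (channels, channels)
        d_inv_sqrt = torch.diag(a.sum(dim=1).clamp(min=1e-6).pow(-0.5))
        a_norm = d_inv_sqrt @ a @ d_inv_sqrt
        # 3) Propagate over channels, then project features.
        return F.relu(self.gc_weight(a_norm @ x))               # (batch, channels, out_features)


if __name__ == "__main__":
    # Toy usage with illustrative sizes: 62 channels, 5 band features each.
    model = AttentionDynamicGCN(num_channels=62, in_features=5, out_features=16)
    out = model(torch.randn(8, 62, 5))
    print(out.shape)  # torch.Size([8, 62, 16])
```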