EEG emotion recognition based on data-driven signal auto-segmentation and feature fusion

Bibliographic Details
Published in: Journal of Affective Disorders, Vol. 361, pp. 356-366
Main Authors: Gao, Yunyuan; Zhu, Zehao; Fang, Feng; Zhang, Yingchun; Meng, Ming
Format: Journal Article
Language: English
Published: Netherlands: Elsevier B.V., 15.09.2024
ISSN: 0165-0327, 1573-2517
DOI: 10.1016/j.jad.2024.06.042

More Information
Summary: Pattern recognition based on network connections has recently been applied to brain-computer interface (BCI) research, offering new ideas for emotion recognition using electroencephalogram (EEG) signals. However, unified standards are currently lacking for selecting emotional signals in emotion recognition research, and potential associations between activation differences in brain regions and network connectivity patterns are often overlooked. To bridge this technical gap, a data-driven signal auto-segmentation and feature fusion (DASF) algorithm is proposed in this paper. First, the phase locking value (PLV) method was used to construct the brain functional adjacency matrix of each subject, and a dynamic brain functional network across subjects was then constructed. Next, Tucker decomposition was performed and the Grassmann distance between connectivity submatrices was calculated. Subsequently, different brain network states were distinguished and signal segments under emotional states were automatically extracted using data-driven methods. Then, tensor sparse representation was applied to the extracted EEG segments to effectively capture functional connections under different emotional states. Finally, power-distribution-related features (differential entropy and energy) and brain functional connection features were combined for classification using a support vector machine (SVM) classifier (code sketches of the connectivity and feature-fusion steps are given below). The proposed method was validated on the ERN and DEAP datasets. Single-feature emotion classification accuracies of 86.57 % and 87.74 % were achieved on the valence and arousal dimensions, respectively. The proposed feature fusion method achieved accuracies of 89.14 % and 89.65 %, respectively, demonstrating an improvement in emotion recognition accuracy. The results demonstrate the superior classification performance of the proposed data-driven signal auto-segmentation and feature fusion algorithm in emotion recognition compared to state-of-the-art classification methods.

Highlights:
• A data-driven approach to capturing signals when emotions are triggered. Network parameters for different subjects were obtained through PLV calculations, forming a tensor. Tucker decomposition was then used to detect the brain network states of the EEG signals, automatically identifying EEG segments recorded during emotional activation. This method effectively screens signals in the state of emotional arousal for improved emotion recognition.
• Tensors are used to extract brain functional connectivity patterns under emotional states. Through tensor sparse representation, a functional brain connectivity network representing the emotional excitation state is obtained, from which network connection features are extracted.
• Fusion features offer richer, complementary information for emotion recognition by combining activation-difference features with connectivity features.
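The abstract's pipeline starts from PLV-based functional connectivity. The following is a minimal sketch of that step only, assuming band-pass-filtered EEG windows of shape (n_channels, n_samples); the function name plv_adjacency, the channel count, and the synthetic data are illustrative assumptions, not details taken from the paper.

```python
# Minimal PLV connectivity sketch (illustrative, not the authors' code).
import numpy as np
from scipy.signal import hilbert

def plv_adjacency(eeg):
    """Return an (n_channels, n_channels) phase locking value matrix."""
    # Instantaneous phase of each channel from the analytic (Hilbert) signal.
    phase = np.angle(hilbert(eeg, axis=1))
    n_ch = eeg.shape[0]
    plv = np.zeros((n_ch, n_ch))
    for i in range(n_ch):
        for j in range(n_ch):
            # PLV = |<exp(j * (phi_i - phi_j))>| averaged over time samples.
            plv[i, j] = np.abs(np.mean(np.exp(1j * (phase[i] - phase[j]))))
    return plv

# Illustrative use on random data standing in for one filtered EEG window.
rng = np.random.default_rng(0)
window = rng.standard_normal((32, 512))   # 32 channels, 512 samples (assumed)
adjacency = plv_adjacency(window)         # symmetric, entries in [0, 1]
```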
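The final step combines power-distribution features with connectivity features for SVM classification. Below is a hedged sketch of that fusion under stated assumptions: differential entropy is computed with the common Gaussian approximation, energy is the per-channel sum of squares, the connectivity features are the upper triangle of the PLV matrix from the sketch above (it reuses plv_adjacency), and the helper name fused_features plus the synthetic labels are hypothetical.

```python
# Feature-fusion and SVM classification sketch (assumptions as stated above).
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def fused_features(window, plv):
    # Differential entropy under a Gaussian assumption: 0.5 * ln(2*pi*e*var).
    de = 0.5 * np.log(2 * np.pi * np.e * np.var(window, axis=1))
    energy = np.sum(window ** 2, axis=1)            # per-channel signal energy
    conn = plv[np.triu_indices_from(plv, k=1)]      # connectivity features
    return np.concatenate([de, energy, conn])       # fused feature vector

# Illustrative training on synthetic windows standing in for labelled segments.
rng = np.random.default_rng(1)
X, y = [], []
for k in range(20):
    w = rng.standard_normal((32, 512))
    X.append(fused_features(w, plv_adjacency(w)))
    y.append(k % 2)                                 # placeholder valence labels
X, y = np.array(X), np.array(y)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)
```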