Emotion recognition based on fusion of long short-term memory networks and SVMs

Bibliographic Details
Published in: Digital Signal Processing, Vol. 117, p. 103153
Main Authors: Chen, Tian; Yin, Hongfang; Yuan, Xiaohui; Gu, Yu; Ren, Fuji; Sun, Xiao
Format: Journal Article
Language: English
Published: Elsevier Inc., 01.10.2021
ISSN: 1051-2004, 1095-4333
DOI: 10.1016/j.dsp.2021.103153


More Information
Summary: This paper proposes a multimodal emotion recognition method that fuses electroencephalogram (EEG) and electrocardiogram (ECG) signals using Dempster-Shafer evidence theory. For EEG, an SVM classifier is used to classify the extracted features; for ECG, a Bi-directional Long Short-Term Memory (Bi-LSTM) network is built for emotion recognition, and its output is fused with the EEG classification results through evidence theory. Twenty-five video clips covering five emotions (happy, relaxed, angry, sad, and disgusted) were selected, and 20 subjects participated in the emotion experiment. The experimental results show that the proposed multimodal fusion model outperforms the single-modal emotion recognition models: in the Arousal and Valence dimensions, the average accuracy improves by 2.64% and 2.75% over the EEG-based model, and by 7.37% and 8.73% over the ECG-based model.
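The decision-level fusion described in the summary lends itself to a short illustration. Below is a minimal sketch, not the authors' implementation, of Dempster's rule of combination applied to the probability outputs of the two classifiers. It assumes each classifier's output vector over the five emotion classes is treated as a mass function on singleton hypotheses only; the function name `dempster_combine` and the example probability vectors are hypothetical.

```python
# Illustrative sketch of Dempster-Shafer fusion of two classifiers' outputs.
# Assumption: each classifier's probability vector over the five emotions is
# used as a mass function on singleton hypotheses only, which reduces
# Dempster's rule to a normalized elementwise product.

import numpy as np

EMOTIONS = ["happy", "relaxed", "angry", "sad", "disgusted"]

def dempster_combine(m1: np.ndarray, m2: np.ndarray) -> np.ndarray:
    """Combine two mass functions defined on singleton hypotheses.

    m1, m2: non-negative vectors summing to 1 (one mass per emotion class).
    Returns the fused masses m12(A) = m1(A) * m2(A) / (1 - K), where K is
    the conflict: the total mass the two sources assign to differing classes.
    """
    joint = m1 * m2                     # agreement mass for each class
    conflict = 1.0 - joint.sum()        # K: mass on conflicting class pairs
    if np.isclose(conflict, 1.0):
        raise ValueError("Total conflict: sources cannot be combined.")
    return joint / (1.0 - conflict)     # renormalize over non-conflicting mass

# Hypothetical outputs: an SVM posterior for EEG and a Bi-LSTM softmax for ECG.
m_eeg = np.array([0.55, 0.20, 0.10, 0.10, 0.05])   # EEG-based SVM
m_ecg = np.array([0.40, 0.30, 0.15, 0.10, 0.05])   # ECG-based Bi-LSTM

m_fused = dempster_combine(m_eeg, m_ecg)
print(dict(zip(EMOTIONS, np.round(m_fused, 3))))
print("Fused decision:", EMOTIONS[int(np.argmax(m_fused))])
```

Because both sources place most of their mass on the same class here, the fused mass concentrates on "happy" more sharply than either input; the paper's actual mass assignments and fusion details may differ.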