Multi-Modal Emotion Recognition Using EEG and Eye Tracking Features



Bibliographic Details
Published in: 2024 46th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Vol. 2024, pp. 1-5
Main Authors: Iacono, Paolo; Khan, Naimul
Format: Conference Proceeding; Journal Article
Language: English
Published: United States: IEEE, 01.07.2024
ISSN: 2694-0604
DOI: 10.1109/EMBC53108.2024.10781843


Summary: Multi-modal emotion recognition from human physiological indicators, including EEG, ECG, GSR, and eye-tracking features, has emerged as a major topic of interest. This work introduces a simple CNN-based multi-modal EEG and eye-tracking emotion recognition model for the SEED V dataset. In contrast to other works on SEED V, different Differential Entropy time windows were tested for EEG feature extraction. EEG signals were arranged in a 2D image format to preserve the spatial relationships between electrode placements on participants during the trials. The proposed model with a 1-second processing window for EEG features achieved state-of-the-art results under Leave-One-Subject-Out validation, with a mean accuracy of 0.935 ± 0.038 on the SEED V dataset. A noticeable improvement was observed over the same multi-modal model using a 4-second processing window for EEG features, highlighting the importance of smaller time windows for EEG feature processing in emotion recognition problems.
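The windowed Differential Entropy (DE) extraction described in the summary can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes the common Gaussian approximation under which DE reduces to 0.5·ln(2πeσ²), and a hypothetical 200 Hz sampling rate with a (channels, samples) EEG array; all function names and parameters are illustrative.

```python
import numpy as np

def differential_entropy(segment):
    """DE of a 1-D signal under a Gaussian assumption: 0.5 * ln(2*pi*e*var)."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(segment))

def extract_de_features(eeg, fs=200, window_s=1):
    """Slice a (channels, samples) EEG array into non-overlapping windows
    and compute one DE value per channel per window.

    Returns an array of shape (channels, n_windows).
    """
    win = fs * window_s                   # samples per window
    n_windows = eeg.shape[1] // win       # drop any trailing partial window
    feats = np.empty((eeg.shape[0], n_windows))
    for w in range(n_windows):
        seg = eeg[:, w * win:(w + 1) * win]
        feats[:, w] = 0.5 * np.log(2 * np.pi * np.e * np.var(seg, axis=1))
    return feats
```

A 1-second window yields four times as many feature frames as a 4-second window over the same recording, which is consistent with the paper's finding that smaller windows help. The per-window DE values would then be placed into a 2D grid mirroring the electrode layout before being fed to the CNN; that mapping is montage-specific and is omitted here.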