Deep Learning Network and Renyi-entropy Based Fusion Model for Emotion Recognition Using Multimodal Signals
| Published in | International Journal of Modern Education and Computer Science, Vol. 14, No. 4, pp. 67-84 |
|---|---|
| Main Authors | |
| Format | Journal Article |
| Language | English |
| Published | Hong Kong: Modern Education and Computer Science Press, 01.08.2022 |
| ISSN | 2075-0161, 2075-017X |
| DOI | 10.5815/ijmecs.2022.04.06 |
| Summary: | Emotion recognition is a significant research topic for interactive intelligent systems, with a wide range of applications in tasks such as education, social media analysis, and customer service. It is the process of automatically perceiving a user's emotional response to multimedia information through implicit interpretation. With the advent of speech recognition and computer vision, research on emotion recognition from the speech and facial-expression modalities has gained popularity in recent decades. Owing to the non-linear nature of the signals, emotion recognition remains a challenging task. To achieve facial emotion recognition using multimodal signals, this research proposes an effective deep learning method based on a Bat Rider Optimization Algorithm (BROA), derived by integrating the Bat Algorithm (BA) with the Rider Optimization Algorithm (ROA). The multimodal signals comprise face images, EEG signals, and physiological signals, and the features extracted from these modalities are employed for emotion recognition. The proposed method outperforms existing methods, achieving a maximum accuracy of 0.8794 and minimum FAR and FRR of 0.1757 and 0.1806, respectively. |
|---|---|
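This record does not reproduce the paper's fusion rule; the title only names a Renyi-entropy-based fusion model. The sketch below is therefore a minimal, hypothetical illustration: the `renyi_entropy` formula is the standard definition, while `entropy_weighted_fusion`, the histogram-based weighting, and the choice `alpha=2.0` are illustrative assumptions, not the authors' method.

```python
import numpy as np

def renyi_entropy(p, alpha=2.0, eps=1e-12):
    """Renyi entropy H_alpha(p) = log(sum(p_i**alpha)) / (1 - alpha).

    `p` is a discrete distribution (or unnormalised counts); alpha != 1.
    As alpha -> 1 this converges to the Shannon entropy.
    """
    p = np.asarray(p, dtype=float)
    p = p / (p.sum() + eps)                       # normalise defensively
    return np.log((p ** alpha).sum() + eps) / (1.0 - alpha)

def entropy_weighted_fusion(features, alpha=2.0):
    """Fuse per-modality feature vectors (face, EEG, physiological).

    Hypothetical rule: a modality whose feature histogram has lower
    Renyi entropy is treated as more informative and gets more weight.
    """
    weights = []
    for f in features:
        hist, _ = np.histogram(f, bins=32)        # coarse feature histogram
        h = renyi_entropy(hist, alpha=alpha)
        weights.append(1.0 / (1.0 + max(h, 0.0))) # low entropy -> high weight
    weights = np.array(weights) / np.sum(weights)
    return sum(w * np.asarray(f) for w, f in zip(weights, features))

# Example: three modalities, each reduced to a 128-D feature vector.
rng = np.random.default_rng(0)
face, eeg, physio = (rng.standard_normal(128) for _ in range(3))
fused = entropy_weighted_fusion([face, eeg, physio], alpha=2.0)
print(fused.shape)  # (128,)
```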
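Likewise, the abstract says only that BROA integrates BA with ROA. The sketch below shows the standard Bat Algorithm position update (the BA half of the hybrid, per Yang's 2010 formulation); how ROA's rider-group updates are woven into it is specific to the paper and not recoverable from this record, so `bat_step` and its parameters are illustrative.

```python
import numpy as np

def bat_step(x, v, best, fmin=0.0, fmax=2.0, rng=None):
    """One standard Bat Algorithm update.

    x: (n, d) bat positions, v: (n, d) velocities, best: (d,) global best.
    f_i = fmin + (fmax - fmin) * beta;  v_i += (x_i - best) * f_i;  x_i += v_i
    """
    rng = rng or np.random.default_rng()
    beta = rng.random((x.shape[0], 1))        # per-bat frequency draw in [0, 1)
    f = fmin + (fmax - fmin) * beta           # pulse frequency f_i
    v = v + (x - best) * f                    # velocity update toward the best bat
    return x + v, v                           # new positions and velocities

# Example: 10 bats searching a 5-D space under a sphere objective.
rng = np.random.default_rng(1)
x = rng.standard_normal((10, 5))
v = np.zeros((10, 5))
best = x[np.argmin((x ** 2).sum(axis=1))]     # current global best position
x, v = bat_step(x, v, best, rng=rng)
```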