Using Big Data for Emotionally Intelligent Mobile Services through Multi-Modal Emotion Recognition
Published in | Inclusive Smart Cities and e-Health, pp. 127–138 |
---|---|
Main Authors | , , , |
Format | Book Chapter |
Language | English |
Published | Cham: Springer International Publishing, 2015 |
Series | Lecture Notes in Computer Science |
Subjects | |
ISBN | 9783319193113 3319193112 |
ISSN | 0302-9743 1611-3349 |
DOI | 10.1007/978-3-319-19312-0_11 |
Summary: | Humans express and perceive emotional states in a multi-modal fashion, through facial and acoustic expressions, gesture, and posture. Our task as AI researchers is to give computers the ability to communicate with users in consideration of their emotions. When recognizing a subject's emotions, it is critically important to be aware of the emotional context. Thanks to advances in mobile technology, it is now feasible to collect such contextual data. In this paper, the authors describe a first step toward extracting insightful emotional information using a cloud-based Big Data infrastructure. Relevant aspects of emotion recognition and the challenges that come with multi-modal emotion recognition are also discussed. |
---|---|
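The summary above mentions combining facial, acoustic, gesture, and posture cues for emotion recognition, but the record does not include the chapter's actual method. Purely as an illustrative sketch, the snippet below shows one common way modality-level emotion predictions can be combined via decision-level (late) fusion; the label set, modality names, and weights are assumptions for the example and are not taken from the chapter.

```python
# Minimal illustrative sketch of decision-level (late) fusion for
# multi-modal emotion recognition. The emotion labels, modality names,
# weights, and scores are hypothetical and not drawn from the chapter.

EMOTIONS = ["anger", "happiness", "neutral", "sadness"]


def fuse_scores(modality_scores: dict[str, list[float]],
                weights: dict[str, float]) -> str:
    """Combine per-modality emotion probabilities with a weighted average
    and return the label with the highest fused score."""
    fused = [0.0] * len(EMOTIONS)
    total_weight = sum(weights[m] for m in modality_scores)
    for modality, scores in modality_scores.items():
        w = weights[modality] / total_weight
        for i, p in enumerate(scores):
            fused[i] += w * p
    return EMOTIONS[fused.index(max(fused))]


if __name__ == "__main__":
    # Hypothetical per-modality probability distributions over EMOTIONS.
    scores = {
        "facial":   [0.10, 0.60, 0.20, 0.10],
        "acoustic": [0.05, 0.55, 0.30, 0.10],
        "posture":  [0.20, 0.30, 0.40, 0.10],
    }
    weights = {"facial": 0.5, "acoustic": 0.3, "posture": 0.2}
    print(fuse_scores(scores, weights))  # -> "happiness"
```

Late fusion is only one possible design; feature-level (early) fusion or context-aware models fed by mobile sensor data would be equally plausible readings of the summary.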