Using Big Data for Emotionally Intelligent Mobile Services through Multi-Modal Emotion Recognition

Bibliographic Details
Published in: Inclusive Smart Cities and e-Health, pp. 127-138
Main Authors: Baimbetov, Yerzhan; Khalil, Ismail; Steinbauer, Matthias; Anderst-Kotsis, Gabriele
Format: Book Chapter
Language: English
Published: Cham: Springer International Publishing, 2015
Series: Lecture Notes in Computer Science
ISBN: 9783319193113, 3319193112
ISSN: 0302-9743, 1611-3349
DOI: 10.1007/978-3-319-19312-0_11

More Information
Summary: Humans express and perceive emotional states through multiple modalities, such as facial and acoustic expressions, gestures, and posture. Our task as AI researchers is to give computers the ability to communicate with users in consideration of their emotions. To recognize a subject's emotions, it is important to be aware of the emotional context. Thanks to advances in mobile technology, it is now feasible to collect such contextual data. In this paper, the authors describe a first step toward extracting insightful emotional information using a cloud-based Big Data infrastructure. Relevant aspects of emotion recognition and the challenges of multi-modal emotion recognition are also discussed.