A Multimodal Emotional Human-Robot Interaction Architecture for Social Robots Engaged in Bidirectional Communication
Published in | IEEE Transactions on Cybernetics, Vol. 51, No. 12, pp. 5954-5968 |
---|---|
Main Authors | |
Format | Journal Article |
Language | English |
Published | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.12.2021 |
Subjects | |
ISSN | 2168-2267; 2168-2275 |
DOI | 10.1109/TCYB.2020.2974688 |
Summary: For social robots to effectively engage in human-robot interaction (HRI), they need to be able to interpret human affective cues and to respond appropriately by displaying their own emotional behavior. In this article, we present a novel multimodal emotional HRI architecture to promote natural and engaging bidirectional emotional communication between a social robot and a human user. User affect is detected using a unique combination of body language and vocal intonation, and multimodal classification is performed using a Bayesian network. The Emotionally Expressive Robot uses the user's affect to determine its own emotional behavior via an innovative two-layer emotional model consisting of a deliberative (hidden Markov model) layer and a reactive (rule-based) layer. The proposed architecture has been implemented on a small humanoid robot that performs diet and fitness counseling during HRI. To evaluate the Emotionally Expressive Robot's effectiveness, a Neutral Robot, which can detect user affect but lacks an emotional display, was also developed. A between-subjects HRI experiment was conducted with both types of robots. Extensive results have shown that both robots can effectively detect user affect during real-time HRI. However, the Emotionally Expressive Robot can appropriately determine its own emotional response based on the situation at hand and, therefore, induce more positive valence and less negative arousal in users than the Neutral Robot.
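The summary names two technical components: Bayesian-network fusion of body-language and vocal-intonation cues into a user affect estimate, and a two-layer emotional model in which a deliberative hidden Markov model layer and a reactive rule-based layer together select the robot's displayed emotion. The Python sketch below illustrates only the second component under simple assumptions; the state names, transition/emission probabilities, and override rule are invented placeholders for illustration, not the model or parameters reported in the paper.

```python
# Minimal sketch of a two-layer emotional behaviour model: a deliberative
# layer (HMM filtering over robot emotional states, with the detected user
# affect as the observation) and a reactive rule-based layer that can
# preempt it. All values below are illustrative placeholders.

ROBOT_STATES = ["happy", "neutral", "concerned"]

# Illustrative HMM parameters (each row sums to 1).
TRANSITION = {
    "happy":     {"happy": 0.6, "neutral": 0.3, "concerned": 0.1},
    "neutral":   {"happy": 0.3, "neutral": 0.4, "concerned": 0.3},
    "concerned": {"happy": 0.1, "neutral": 0.3, "concerned": 0.6},
}
EMISSION = {  # P(observed user affect | robot emotional state)
    "happy":     {"positive": 0.7, "neutral": 0.2, "negative": 0.1},
    "neutral":   {"positive": 0.3, "neutral": 0.4, "negative": 0.3},
    "concerned": {"positive": 0.1, "neutral": 0.3, "negative": 0.6},
}

def deliberative_step(belief, user_affect):
    """One HMM filtering step: predict with TRANSITION, update with EMISSION."""
    predicted = {
        s: sum(belief[prev] * TRANSITION[prev][s] for prev in ROBOT_STATES)
        for s in ROBOT_STATES
    }
    unnormalised = {s: predicted[s] * EMISSION[s][user_affect] for s in ROBOT_STATES}
    total = sum(unnormalised.values())
    return {s: p / total for s, p in unnormalised.items()}

def reactive_override(user_affect, user_arousal):
    """Rule-based layer: immediate response to salient events (placeholder rule)."""
    if user_affect == "negative" and user_arousal > 0.8:
        return "concerned"  # strong negative arousal -> show concern right away
    return None             # no override; defer to the deliberative layer

def select_emotion(belief, user_affect, user_arousal):
    """Reactive rules take priority; otherwise the HMM belief picks the emotion."""
    override = reactive_override(user_affect, user_arousal)
    if override is not None:
        return override, belief
    belief = deliberative_step(belief, user_affect)
    return max(belief, key=belief.get), belief

if __name__ == "__main__":
    belief = {s: 1.0 / len(ROBOT_STATES) for s in ROBOT_STATES}  # uniform prior
    for affect, arousal in [("positive", 0.3), ("negative", 0.9), ("neutral", 0.4)]:
        emotion, belief = select_emotion(belief, affect, arousal)
        print(f"user affect={affect:8s} arousal={arousal:.1f} -> robot displays {emotion}")
```

Filtering the HMM belief with each detected user affect gives slowly varying deliberative behaviour, while the rule layer handles events that should preempt it, mirroring the deliberative/reactive split described in the summary.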