Effects of Robot Sound on Auditory Localization in Human-Robot Collaboration

Bibliographic Details
Published in: 2018 13th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 434-442
Main Authors: Cha, Elizabeth; Fitter, Naomi T.; Kim, Yunkyung; Fong, Terrence; Matarić, Maja J.
Format: Conference Proceeding
Language: English
Published: New York, NY, USA: ACM, 26.02.2018
Series: ACM Conferences
ISBN: 9781450349536 / 1450349536
ISSN: 2167-2148
DOI: 10.1145/3171221.3171285

Summary: Auditory cues facilitate situational awareness by enabling humans to infer what is happening in the nearby environment. Unlike humans, many robots do not continuously produce perceivable state-expressive sounds. In this work, we propose the use of iconic auditory signals that mimic the sounds produced by a robot's operations. In contrast to artificial sounds (e.g., beeps and whistles), these signals are primarily functional, providing information about the robot's actions and state. We analyze the effects of two variations of robot sound, tonal and broadband, on auditory localization during a human-robot collaboration task. Results from 24 participants show that both signals significantly improve auditory localization, but the broadband variation is preferred by participants. We then present a computational formulation for auditory signaling and apply it to the problem of auditory localization using a human-subjects data collection with 18 participants to learn optimal signaling policies.