Piggybacking Robots: Human-Robot Overtrust in University Dormitory Security

Bibliographic Details
Published in: 2017 12th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 426–434
Main Authors: Booth, Serena; Tompkin, James; Pfister, Hanspeter; Waldo, Jim; Gajos, Krzysztof; Nagpal, Radhika
Format: Conference Proceeding
Language: English
Published: New York, NY, USA: ACM, 06.03.2017
Series: ACM Conferences
ISBN: 9781450343367, 1450343368
ISSN: 2167-2148
DOI: 10.1145/2909824.3020211


More Information
Summary: Can overtrust in robots compromise physical security? We conducted a series of experiments in which a robot positioned outside a secure-access student dormitory asked passersby to assist it to gain access. We found individual participants were as likely to assist the robot in exiting the dormitory (40% assistance rate, 4/10 individuals) as in entering (19%, 3/16 individuals). Groups of people were more likely than individuals to assist the robot in entering (71%, 10/14 groups). When the robot was disguised as a food delivery agent for the fictional start-up Robot Grub, individuals were more likely to assist the robot in entering (76%, 16/21 individuals). Lastly, we found participants who identified the robot as a bomb threat demonstrated a trend toward assisting the robot (87%, 7/8 individuals, 6/7 groups). Thus, we demonstrate that overtrust, the unfounded belief that the robot does not intend to deceive or carry risk, can represent a significant threat to physical security at a university dormitory.