Robots as Malevolent Moral Agents: Harmful Behavior Results in Dehumanization, Not Anthropomorphism

Bibliographic Details
Published in: Cognitive Science, Vol. 44, No. 7, e12872
Main Authors: Swiderska, Aleksandra; Küster, Dennis
Format: Journal Article
Language: English
Published: Hoboken: Wiley Subscription Services, Inc., 01.07.2020
ISSN: 0364-0213
eISSN: 1551-6709
DOI: 10.1111/cogs.12872

More Information
Summary: A robot's decision to harm a person is sometimes considered to be the ultimate proof of it gaining a human-like mind. Here, we contrasted predictions about the attribution of mental capacities from moral typecasting theory with the denial of agency from the dehumanization literature. Experiments 1 and 2 investigated mind perception for intentionally and accidentally harmful robotic agents based on text and image vignettes. Experiment 3 disambiguated agent intention (malevolent and benevolent) and additionally varied the type of agent (robotic and human) using short computer-generated animations. Harmful robotic agents were consistently imbued with mental states to a lower degree than benevolent agents, supporting the dehumanization account. Further results revealed that a human moral patient appeared to suffer less when depicted with a robotic agent than with another human. The findings suggest that future robots may become subject to human-like dehumanization mechanisms, which challenges established beliefs about anthropomorphism in the domain of moral interactions.