A Reputation Game Simulation: Emergent Social Phenomena from Information Theory

Bibliographic Details
Published in: Annalen der Physik, Vol. 534, no. 5
Main Authors: Enßlin, Torsten; Kainz, Viktoria; Bœhm, Céline
Format: Journal Article
Language: English
Published: Weinheim: Wiley Subscription Services, Inc., 01.05.2022
ISSN: 0003-3804, 1521-3889
DOI: 10.1002/andp.202100277


Summary: Reputation is a central element of social communications, be it with human or artificial intelligence (AI), and as such can be the primary target of malicious communication strategies. There is already a vast amount of literature on trust networks and their dynamics using Bayesian principles and involving Theory of Mind models. An issue for these simulations can be the amount of information that can be stored and managed using discretizing variables and hard thresholds. Here a novel approach to the way information is updated that accounts for knowledge uncertainty and is closer to reality is proposed. Agents use information compression techniques to capture their complex environment and store it in their finite memories. The loss of information that results from this leads to emergent phenomena, such as echo chambers, self-deception, deception symbiosis, and freezing of group opinions. Various malicious strategies of agents are studied for their impact on group sociology, like sycophancy, egocentricity, pathological lying, and aggressiveness. Our set-up already provides insights into social interactions and can be used to investigate the effects of various communication strategies and find ways to counteract malicious ones. Eventually this work should help to safeguard the design of non-abusive AI systems.

Sociophysical simulations of a reputation game are presented. The participating agents exchange honest or dishonest statements, which are about third agents or themselves, while trying to be perceived as being honest. Different social phases can be identified, in particular when malicious communication strategies are used.
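The core mechanism the summary describes — agents compressing observations of others' honesty into finite memories, with the resulting information loss shaping reputations — can be caricatured by a toy model. The sketch below is purely illustrative: the agent names, the capped pseudo-count "memory", and the update rule are assumptions for demonstration, not the paper's actual information-theoretic scheme.

```python
import random

# Toy sketch (NOT the paper's model): each agent summarizes its evidence
# about another agent's honesty as two pseudo-counts (honest, dishonest).
# Capping their sum at a finite memory size forces lossy compression:
# old evidence is rescaled away, so reputations track recent behavior.

MEMORY_CAP = 20.0  # hypothetical finite-memory limit


class Agent:
    def __init__(self, name, honesty):
        self.name = name
        self.honesty = honesty   # probability this agent tells the truth
        self.beliefs = {}        # other name -> [honest_count, dishonest_count]

    def observe(self, other, was_honest):
        h, d = self.beliefs.setdefault(other, [1.0, 1.0])
        if was_honest:
            h += 1.0
        else:
            d += 1.0
        total = h + d
        if total > MEMORY_CAP:   # compress: rescale counts to fit memory
            h *= MEMORY_CAP / total
            d *= MEMORY_CAP / total
        self.beliefs[other] = [h, d]

    def reputation_of(self, other):
        h, d = self.beliefs.get(other, (1.0, 1.0))
        return h / (h + d)       # perceived probability of honesty


random.seed(0)
alice = Agent("Alice", honesty=0.9)   # mostly honest agent
bob = Agent("Bob", honesty=0.3)       # frequently lying agent
for _ in range(200):
    alice.observe("Bob", random.random() < bob.honesty)
    bob.observe("Alice", random.random() < alice.honesty)

# Bob's reputation with Alice ends up low; Alice's with Bob ends up high.
print(round(alice.reputation_of("Bob"), 2))
print(round(bob.reputation_of("Alice"), 2))
```

Because the pseudo-counts are capped, the estimate behaves like an exponentially forgetting average: this is one simple way finite memory can distort reputations, loosely analogous to the compression-induced effects (frozen opinions, self-deception) discussed in the article.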