How Algorithms Promote Self-Radicalization: Audit of TikTok’s Algorithm Using a Reverse Engineering Method

Bibliographic Details
Published in: Social Science Computer Review, Vol. 42, no. 4, pp. 1020–1040
Main Authors: Shin, Donghee; Jitkajornwanich, Kulsawasd
Format: Journal Article
Language: English
Published: Los Angeles, CA: SAGE Publications, 01.08.2024
ISSN: 0894-4393, 1552-8286
DOI: 10.1177/08944393231225547


More Information
Summary: Algorithmic radicalization is the idea that the algorithms used by social media platforms push people down digital “rabbit holes” by framing their personal online activity. Algorithms control what people see and when they see it, learning from users’ past activity; as a result, people gradually and subconsciously adopt the ideas presented to them in the rabbit hole down which they have been pushed. This study examines TikTok’s role in fostering radicalized ideology to offer a critical analysis of the state of radicalism and extremism on platforms. It conducts an algorithm audit of the role of radicalizing information on social media, examining how TikTok’s algorithms are used to radicalize, polarize, and spread extremism and societal instability. The results reveal that the pathways through which users access far-right content are manifold and that a large portion of that content can be ascribed to platform recommendations operating through radicalization pipelines. Algorithms are not simple tools that offer personalized services but contributors to radicalism, societal violence, and polarization. Such personalization processes have been instrumental in how artificial intelligence (AI) is designed, deployed, and used, and in the detrimental outcomes it has generated. Thus, the generation and adoption of extreme content on TikTok are, by and large, a reflection not only of user inputs and interactions with the platform but also of the platform’s ability to slot users into specific categories and reinforce their ideas.