Can Algorithm Knowledge Stop Women from Being Targeted by Algorithm Bias? The New Digital Divide on Weibo

Bibliographic Details
Published in: Journal of Broadcasting & Electronic Media, Vol. 67, No. 3, pp. 397-422
Main Authors: Zhang, Yang; Chen, Huashan
Format: Journal Article
Language: English
Published: Philadelphia: Routledge, Taylor & Francis Group, 27.05.2023
ISSN: 0883-8151, 1550-6878
DOI: 10.1080/08838151.2023.2218955

More Information
Summary: Users' algorithm knowledge plays a crucial role in protecting them from algorithm bias in recommendation systems. Users' gender has been found to correlate with algorithm bias, but this leaves open the question of whether that relationship can be explained by algorithm knowledge. Using Weibo as an example system, we address this question from a digital divide theory perspective. We combine a traditional method (a questionnaire) with a deep learning computational method to explain algorithm bias in two sequential studies. Our findings suggest that algorithm knowledge works only for men and fails to protect women. Who users follow helps determine what information they are exposed to on Weibo, and this renders female users' algorithm knowledge ineffective. This work offers a valuable perspective on algorithm bias: we view algorithm bias as a new digital divide and contribute to the understanding of gender differences by applying the digital divide perspective. Methodologically, we contribute by integrating traditional and computational methods to explain algorithm bias from a folk theory perspective.