Covering the Sensitive Subjects to Protect Personal Privacy in Personalized Recommendation

Bibliographic Details
Published in: IEEE Transactions on Services Computing, Vol. 11, no. 3, pp. 493-506
Main Authors: Wu, Zongda; Li, Guiling; Liu, Qi; Xu, Guandong; Chen, Enhong
Format: Journal Article
Language: English
Published: IEEE, 01.05.2018
ISSN: 1939-1374, 2372-0204
DOI: 10.1109/TSC.2016.2575825

Summary: Personalized recommendation has proven effective in alleviating the problem of information overload on the Internet. However, evidence shows that, owing to privacy concerns, users' reluctance to disclose personal information has become a major barrier to the development of personalized recommendation. In this paper, we propose to generate a group of fake preference profiles to cover up a user's sensitive subjects and thereby protect personal privacy in personalized recommendation. First, we present a client-based framework for user privacy protection that requires neither changes to existing recommendation algorithms nor any compromise in recommendation accuracy. Second, based on this framework, we introduce a privacy protection model that formulates the two requirements ideal fake preference profiles should satisfy: (1) similarity of feature distribution, which measures how effectively the fake preference profiles hide the genuine user preference profile; and (2) exposure degree of sensitive subjects, which measures how effectively the fake preference profiles cover up the sensitive subjects. Finally, based on a subject repository of product classification, we present an implementation algorithm that satisfies the privacy protection model. Both theoretical analysis and experimental evaluation demonstrate the effectiveness of the proposed approach.
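The two requirements named in the summary can be made concrete with a small sketch. The profile representation, the choice of cosine similarity as the feature-distribution measure, and the definition of exposure degree below are illustrative assumptions for this sketch, not the paper's actual formulations:

```python
# Illustrative sketch (assumed definitions, not the paper's algorithm).
# A preference profile is modeled as a mapping from subjects to interest weights.
from math import sqrt

def feature_distribution(profile):
    """Normalize a subject->weight profile into a probability distribution."""
    total = sum(profile.values())
    return {s: w / total for s, w in profile.items()}

def distribution_similarity(p, q):
    """Cosine similarity between two feature distributions (assumed measure).
    Higher means a fake profile better mimics the genuine one."""
    subjects = set(p) | set(q)
    dot = sum(p.get(s, 0.0) * q.get(s, 0.0) for s in subjects)
    norm_p = sqrt(sum(v * v for v in p.values()))
    norm_q = sqrt(sum(v * v for v in q.values()))
    return dot / (norm_p * norm_q) if norm_p and norm_q else 0.0

def exposure_degree(genuine, fakes, sensitive):
    """Share of total interest weight that falls on sensitive subjects across
    the genuine profile plus the fake cover profiles (assumed definition).
    Lower means the sensitive subjects are better covered up."""
    profiles = [genuine] + fakes
    total = sum(sum(p.values()) for p in profiles)
    exposed = sum(w for p in profiles for s, w in p.items() if s in sensitive)
    return exposed / total if total else 0.0

# Hypothetical data: "medicine" is the sensitive subject to be covered up.
genuine = {"medicine": 3.0, "sports": 1.0}
fakes = [{"travel": 3.0, "sports": 1.0},
         {"cooking": 3.0, "music": 1.0}]

sim = distribution_similarity(feature_distribution(genuine),
                              feature_distribution(fakes[0]))
exp = exposure_degree(genuine, fakes, sensitive={"medicine"})
```

Under these assumed definitions, a client would search for fake profiles that maximize the similarity score while driving the exposure degree down toward the sensitive subjects' share of the whole subject repository.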