Local Privacy and Statistical Minimax Rates

Bibliographic Details
Published in: Annual Symposium on Foundations of Computer Science, pp. 429 - 438
Main Authors: Duchi, John C., Jordan, Michael I., Wainwright, Martin J.
Format: Conference Proceeding
Language: English
Published: IEEE, 01.10.2013
ISSN: 0272-5428
DOI: 10.1109/FOCS.2013.53

Summary: Working under local differential privacy, a model of privacy in which data remains private even from the statistician or learner, we study the tradeoff between privacy guarantees and the utility of the resulting statistical estimators. We prove bounds on information-theoretic quantities, including mutual information and Kullback-Leibler divergence, that influence estimation rates as a function of the amount of privacy preserved. When combined with minimax techniques such as Le Cam's and Fano's methods, these inequalities allow for a precise characterization of statistical rates under local privacy constraints. In this paper, we provide a treatment of two canonical problem families: mean estimation in location family models and convex risk minimization. For these families, we provide lower and upper bounds for estimation of population quantities that match up to constant factors, giving privacy-preserving mechanisms and computationally efficient estimators that achieve the bounds.
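To make the local-privacy model concrete, the following is a minimal sketch of mean estimation under a per-sample perturbation mechanism: each data point is noised *before* the analyst ever sees it, so privacy holds even against the statistician, as the summary describes. The Laplace perturbation shown here is a standard epsilon-local-DP mechanism for bounded data and is illustrative only; it is not claimed to be the paper's exact (minimax-optimal) construction, and the function names and parameters are hypothetical.

```python
import math
import random

def privatize(x, epsilon, rng):
    """Release a value x in [0, 1] under epsilon-local differential privacy
    by adding Laplace(0, 1/epsilon) noise (the identity map on [0, 1] has
    sensitivity 1). Noise is drawn via inverse-CDF sampling."""
    u = rng.random() - 0.5  # uniform on [-0.5, 0.5)
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return x + noise

def private_mean(data, epsilon, rng):
    """The analyst only ever sees privatized values, and averages them.
    The Laplace noise is mean-zero, so the estimator stays unbiased, but
    its variance grows as epsilon shrinks: the privacy-utility tradeoff."""
    return sum(privatize(x, epsilon, rng) for x in data) / len(data)

if __name__ == "__main__":
    rng = random.Random(0)
    data = [rng.random() for _ in range(100_000)]  # true mean is about 0.5
    estimate = private_mean(data, epsilon=1.0, rng=rng)
    print(estimate)
```

With n = 100,000 samples and epsilon = 1, the added noise has variance 2, so the estimator's standard error is roughly sqrt((1/12 + 2)/n), about 0.005 here; tightening epsilon inflates this error, which is the tradeoff the paper's bounds characterize precisely.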