Accuracy First: Selecting a Differential Privacy Level for Accuracy-Constrained ERM


Bibliographic Details
Published in: The Journal of Privacy and Confidentiality, Vol. 9, No. 2
Main Authors: Wu, Steven; Roth, Aaron; Ligett, Katrina; Waggoner, Bo; Neel, Seth
Format: Journal Article
Language: English
Published: Labor Dynamics Institute, 23.10.2019
ISSN: 2575-8527
DOI: 10.29012/jpc.682

More Information
Summary: Traditional approaches to differential privacy assume a fixed privacy requirement ε for a computation, and attempt to maximize the accuracy of the computation subject to the privacy constraint. As differential privacy is increasingly deployed in practical settings, it may often be that there is instead a fixed accuracy requirement for a given computation and the data analyst would like to maximize the privacy of the computation subject to the accuracy constraint. This raises the question of how to find and run a maximally private empirical risk minimizer subject to a given accuracy requirement. We propose a general “noise reduction” framework that can apply to a variety of private empirical risk minimization (ERM) algorithms, using them to “search” the space of privacy levels to find the empirically strongest one that meets the accuracy constraint, and incurring only logarithmic overhead in the number of privacy levels searched. The privacy analysis of our algorithm leads naturally to a version of differential privacy where the privacy parameters are dependent on the data, which we term ex-post privacy, and which is related to the recently introduced notion of privacy odometers. We also give an ex-post privacy analysis of the classical AboveThreshold privacy tool, modifying it to allow for queries chosen depending on the database. Finally, we apply our approach to two common objective functions, regularized linear and logistic regression, and empirically compare our noise reduction methods to (i) inverting the theoretical utility guarantees of standard private ERM algorithms and (ii) a stronger, empirical baseline based on binary search.
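
As a rough illustration of the search idea described in the summary, here is a minimal Python sketch. It assumes a hypothetical ε-differentially-private estimator (Laplace output perturbation of a mean, standing in for a private ERM solver) and walks a geometric grid of privacy levels from strongest to weakest, stopping at the first one that meets the accuracy target. In the paper, the accuracy check is itself privatized via an AboveThreshold variant, and the candidate models share correlated noise (the "noise reduction" framework), which is what yields only logarithmic privacy overhead; this naive sketch checks accuracy against the raw data and is not the authors' algorithm.

    import numpy as np

    def private_mean(data, eps, rng):
        # Hypothetical eps-DP stand-in for a private ERM solver:
        # Laplace output perturbation, assuming records lie in [0, 1]
        # so the mean has sensitivity 1 / len(data).
        return float(np.mean(data)) + rng.laplace(scale=1.0 / (len(data) * eps))

    def search_privacy_levels(data, alpha, eps_grid, rng):
        # Try privacy levels from smallest (most private) to largest,
        # returning the first estimate within alpha of the non-private
        # answer. NOTE: unlike the paper, this accuracy test is not
        # itself privatized (the paper uses an AboveThreshold variant).
        target = float(np.mean(data))
        for eps in sorted(eps_grid):
            est = private_mean(data, eps, rng)
            if abs(est - target) <= alpha:
                return eps, est
        return None, None

    rng = np.random.default_rng(0)
    data = rng.uniform(0.0, 1.0, size=5000)  # toy data in [0, 1]
    eps, est = search_privacy_levels(
        data, alpha=0.01,
        eps_grid=[2.0 ** k for k in range(-8, 3)],  # geometric grid
        rng=rng)
    print(f"strongest privacy level that met the target: eps = {eps}")

With a geometric grid, only about log2(eps_max / eps_min) candidate levels are examined, which mirrors the logarithmic overhead in the number of privacy levels searched that the summary mentions.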