Publications by Year: 2017

A new line of work [6, 9, 15, 2] demonstrates how differential privacy [8] can be used as a mathematical tool for guaranteeing generalization in adaptive data analysis. Specifically, if a differentially private analysis is applied to a sample S of i.i.d. examples to select a low-sensitivity function f, then w.h.p. f(S) is close to its expectation, even though f is chosen based on the data. Very recently, Steinke and Ullman [16] observed that these generalization guarantees can be used to prove concentration bounds in the non-adaptive setting, where the low-sensitivity function is fixed beforehand. In particular, they obtain alternative proofs of classical concentration bounds for low-sensitivity functions, such as the Chernoff bound and McDiarmid's inequality. In this work, we set out to examine the situation for high-sensitivity functions, for which differential privacy does not imply generalization guarantees under adaptive analysis. We show that differential privacy can nevertheless be used to prove concentration bounds for such functions in the non-adaptive setting.
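
For context, the notions this abstract relies on can be stated in their standard form (a sketch of textbook definitions, not text from the paper itself). A function f over samples of size n has sensitivity Δ if changing a single example changes its value by at most Δ; "low sensitivity" typically means Δ = O(1/n). The generalization guarantee referred to above then says, roughly, that if an (ε, δ)-differentially private mechanism M selects such an f based on a sample S drawn i.i.d. from a distribution D, then

\[
\bigl|\, f(S) - \mathbb{E}_{S' \sim \mathcal{D}^n}\bigl[f(S')\bigr] \,\bigr| \;\le\; O(\varepsilon)
\]

holds with high probability over the draw of S and the coins of M.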

Kobbi Nissim, Thomas Steinke, Alexandra Wood, Mark Bun, Marco Gaboardi, David O'Brien, and Salil Vadhan. 3/2017. “Differential Privacy: A Primer for a Non-technical Audience (Preliminary Version).” Cambridge, MA: a product of the “Bridging Privacy Definitions” working group, part of the Privacy Tools for Sharing Research Data project at Harvard University.

This document is a primer on differential privacy, a formal mathematical framework for guaranteeing privacy protection when analyzing or releasing statistical data. Having recently emerged from the theoretical computer science literature, differential privacy is now in the initial stages of implementation and use in various academic, industry, and government settings. Using intuitive illustrations and limited mathematical formalism, this document provides an introduction to differential privacy for non-technical practitioners, who are increasingly tasked with making decisions about differential privacy as its use grows more widespread. In particular, the examples in this document illustrate ways in which social scientists can conceptualize the guarantees provided by differential privacy with respect to the decisions they make when managing personal data about research subjects and informing them about the privacy protection they will be afforded.
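
For reference, the formal guarantee the primer builds intuition for is the standard definition of differential privacy (stated here for completeness; the primer itself avoids this formalism). A randomized algorithm M is (ε, δ)-differentially private if for every two datasets S and S' that differ in the data of a single individual, and every set T of possible outputs,

\[
\Pr[M(S) \in T] \;\le\; e^{\varepsilon} \cdot \Pr[M(S') \in T] + \delta .
\]

Smaller values of ε and δ mean stronger privacy: ε bounds how much any one person's data can shift the distribution of the algorithm's output.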

E. Cicek, G. Barthe, M. Gaboardi, D. Garg, and J. Hoffmann. 1/2017. “Relational Cost Analysis.” Symposium on Principles of Programming Languages (POPL), ACM.
A. Azevedo de Amorim, M. Gaboardi, J. Hsu, S. Katsumata, and I. Cherigui. 1/2017. “A Semantic Account of Metric Preservation.” Symposium on Principles of Programming Languages (POPL), ACM.
Mark Bun, Thomas Steinke, and Jonathan Ullman. 2017. “Make Up Your Mind: The Price of Online Queries in Differential Privacy.” Proceedings of the Twenty-Eighth Annual ACM-SIAM Symposium on Discrete Algorithms (SODA).
S. Kasiviswanathan, K. Nissim, and H. Jin. 2017. “Private Incremental Regression.” Symposium on Principles of Database Systems (PODS), ACM.