A new line of work [6, 9, 15, 2] demonstrates how differential privacy [8] can be used as a mathematical tool for guaranteeing generalization in adaptive data analysis. Specifically, if a differentially private analysis is applied to a sample S of i.i.d. examples to select a low-sensitivity function f, then w.h.p. f(S) is close to its expectation, even though f is chosen based on the data. Very recently, Steinke and Ullman [16] observed that these generalization guarantees can be used for proving concentration bounds in the non-adaptive setting, where the low-sensitivity function is fixed beforehand. In particular, they obtain alternative proofs of classical concentration bounds for low-sensitivity functions, such as the Chernoff bound and McDiarmid's Inequality. In this work, we set out to examine the situation for high-sensitivity functions, for which differential privacy does not imply generalization guarantees under adaptive analysis. We show that differential privacy can be used to prove concentration bounds for such functions in the non-adaptive setting.
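The low-sensitivity notion at the heart of this argument can be made concrete with a minimal simulation (not from the paper; the sample size, number of trials, and the choice of f as the empirical mean are illustrative assumptions). Changing one of the n examples moves the empirical mean of values in [0, 1] by at most 1/n, so McDiarmid's inequality bounds the deviation of f(S) from its expectation, and the deviations observed empirically are correspondingly small:

```python
import random

random.seed(0)

n = 1000        # sample size (illustrative)
trials = 200    # independent repetitions (illustrative)

def f(sample):
    # f(S) = empirical mean: a sensitivity-(1/n) function of the sample,
    # since replacing one example in [0, 1] shifts the mean by at most 1/n.
    return sum(sample) / len(sample)

expectation = 0.5  # E[f(S)] for i.i.d. uniform draws on [0, 1]

deviations = []
for _ in range(trials):
    S = [random.random() for _ in range(n)]  # fresh i.i.d. sample
    deviations.append(abs(f(S) - expectation))

# McDiarmid: P(|f(S) - E[f(S)]| > t) <= 2 * exp(-2 * t^2 * n)
# for sensitivity-(1/n) functions, so deviations of size 0.1 are
# astronomically unlikely at n = 1000.
print(max(deviations) < 0.1)
```

A high-sensitivity function (e.g., the maximum of the sample, or n times the mean) admits no such bound directly, which is the regime the paper targets.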

# Computing over Distributed Sensitive Data: Publications

2017. "Private Incremental Regression." ACM SIGMOD/PODS Conference (PODS 2017).

1/2017. "A Semantic Account of Metric Preservation." Symposium on the Principles of Programming Languages (POPL), ACM.

10/2016. "Differentially Private Bayesian Programming." 23rd ACM Conference on Computer and Communications Security (CCS).

12/2016. "Computer-Aided Verification in Mechanism Design." Conference on Internet and Economics (WINE).

10/2016. "Advanced Probabilistic Couplings for Differential Privacy." 23rd ACM Conference on Computer and Communications Security (CCS).

10/2016. "Generic Attacks on Secure Outsourced Databases." 23rd ACM Conference on Computer and Communications Security (CCS).