%0 Conference Paper %B Proceedings of the 36th Conference on Neural Information Processing Systems (NeurIPS ‘22) %D 2022 %T Hypothesis testing for differentially private linear regression %A Daniel Alabi %A Salil Vadhan %X In this work, we design differentially private hypothesis tests for the following problems in the general linear model: testing a linear relationship and testing for the presence of mixtures. The majority of our hypothesis tests are based on differentially private versions of the F-statistic for the general linear model framework, which are uniformly most powerful unbiased in the non-private setting. We also present other tests for these problems, one of which is based on the differentially private nonparametric tests of Couch, Kazan, Shi, Bray, and Groce (CCS 2019), which is especially suited for the small-dataset regime. We show that the differentially private F-statistic converges to the asymptotic distribution of its non-private counterpart. As a corollary, the statistical power of the differentially private F-statistic converges to the statistical power of the non-private F-statistic. Through a suite of Monte Carlo-based experiments, we show that our tests achieve desired significance levels and have a high power that approaches the power of the non-private tests as we increase sample sizes or the privacy-loss parameter. We also show when our tests outperform existing methods in the literature. %G eng %U https://arxiv.org/pdf/2206.14449.pdf