TY - GEN
T1 - Estimating Smooth GLM in Non-interactive Local Differential Privacy Model with Public Unlabeled Data
AU - Wang, Di
AU - Zhang, Huanyu
AU - Gaboardi, Marco
AU - Xu, Jinhui
N1 - KAUST Repository Item: Exported on 2023-07-04
Acknowledgements: Di Wang was supported in part by the baseline funding of King Abdullah University of Science and Technology (KAUST). Huanyu Zhang was supported in part by the National Science Foundation (NSF) under Grant No. 1815893 and 1704443. Jinhui Xu was supported in part by the National Science Foundation (NSF) under Grant No. CCF-1716400 and IIS-1919492. Part of the work was done when Di Wang and Marco Gaboardi were visiting the Simons Institute for the Theory of Computing.
PY - 2021/1/1
Y1 - 2021/1/1
AB - In this paper, we study the problem of estimating smooth Generalized Linear Models (GLM) in the Non-interactive Local Differential Privacy (NLDP) model. Different from its classical setting, our model allows the server to access some additional public but unlabeled data. By using Stein’s lemma and its variants, we first show that there is an (ε, δ)-NLDP algorithm for GLM (under some mild assumptions) if each data record is i.i.d. sampled from some sub-Gaussian distribution with bounded ℓ1-norm. Then, with high probability, the sample complexity of the public and private data for the algorithm to achieve an α estimation error (in the ℓ∞-norm) is O(p²α⁻²) and O(p²α⁻²ε⁻²), respectively, provided α is not too small (i.e., α ≥ Ω(1/√p)), where p is the dimensionality of the data. This is a significant improvement over the previously known sample complexity of GLM with no public data, which is exponential or quasi-polynomial in α⁻¹, or exponential in p. We then extend our idea to the non-linear regression problem and show a similar phenomenon for it. Finally, we demonstrate the effectiveness of our algorithms through experiments on both synthetic and real-world datasets. To the best of our knowledge, this is the first paper showing the existence of efficient and effective algorithms for GLM and non-linear regression in the NLDP model with public unlabeled data.
UR - http://hdl.handle.net/10754/692739
UR - http://proceedings.mlr.press/v132/wang21a.html
UR - http://www.scopus.com/inward/record.url?scp=85162232535&partnerID=8YFLogxK
M3 - Conference contribution
SP - 1207
EP - 1213
BT - 32nd International Conference on Algorithmic Learning Theory, ALT 2021
PB - ML Research Press
ER -