Estimating Smooth GLM in Non-interactive Local Differential Privacy Model with Public Unlabeled Data

Di Wang, Huanyu Zhang, Marco Gaboardi, Jinhui Xu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



In this paper, we study the problem of estimating smooth Generalized Linear Models (GLMs) in the Non-interactive Local Differential Privacy (NLDP) model. Unlike the classical setting, our model allows the server to access additional public but unlabeled data. Using Stein's lemma and its variants, we first show that there is an (ε, δ)-NLDP algorithm for GLMs (under some mild assumptions) if each data record is i.i.d. sampled from a sub-Gaussian distribution with bounded ℓ1-norm. Then, with high probability, the sample complexities of the public and private data required for the algorithm to achieve an α estimation error (in ℓ∞-norm) are O(p²α⁻²) and O(p²α⁻²ε⁻²), respectively, provided α is not too small (i.e., α ≥ Ω(1/√p)), where p is the dimensionality of the data. This is a significant improvement over the previously known sample complexities for GLMs without public data, which are exponential or quasi-polynomial in α⁻¹, or exponential in p. We then extend our idea to the non-linear regression problem and show a similar phenomenon. Finally, we demonstrate the effectiveness of our algorithms through experiments on both synthetic and real-world datasets. To the best of our knowledge, this is the first paper showing the existence of efficient and effective algorithms for GLMs and non-linear regression in the NLDP model with public unlabeled data.
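To make the abstract's approach concrete, the following is a minimal illustrative sketch of the kind of non-interactive protocol the Stein's-lemma identity E[y·x] ∝ Σβ enables, specialized to the linear-model case: each user releases a single Gaussian-perturbed report of y·x (the standard Gaussian mechanism for (ε, δ)-DP), and the server debiases the average using a covariance estimate built from public unlabeled data. The function name, the ℓ2 clipping step, and the noise calibration below are our own assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

def ldp_stein_estimator(private_X, private_y, public_X, eps, delta, clip=1.0):
    """Illustrative NLDP sketch (linear special case, not the paper's exact method).

    Each user reports y_i * x_i once, clipped and perturbed with Gaussian noise
    calibrated to (eps, delta)-DP. The server averages the reports and, using
    the identity E[y x] = c * Sigma * beta (up to a model-dependent scalar c),
    debiases with a covariance estimate from the public unlabeled data.
    """
    n, p = private_X.shape
    # Noise scale of the Gaussian mechanism for an L2 sensitivity of 2*clip.
    sigma = clip * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    reports = []
    for x, y in zip(private_X, private_y):
        z = y * x
        z = z * min(1.0, clip / max(np.linalg.norm(z), 1e-12))  # L2 clipping
        reports.append(z + np.random.normal(0.0, sigma, size=p))  # one-shot report
    # Server side: average the noisy reports, then invert the public covariance.
    mean_report = np.mean(reports, axis=0)
    Sigma_hat = public_X.T @ public_X / public_X.shape[0]
    return np.linalg.solve(Sigma_hat, mean_report)
```

Non-interactivity is visible in the structure: each user computes and sends exactly one message, and all remaining work (covariance estimation and debiasing) happens on the server using only public data.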
Original language: English (US)
Title of host publication: 32nd International Conference on Algorithmic Learning Theory, ALT 2021
Publisher: ML Research Press
Number of pages: 7
State: Published - Jan 1 2021


