Private least absolute deviations with heavy-tailed data

Di Wang*, Jinhui Xu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Scopus citation

Abstract

We study the problem of Differentially Private Stochastic Convex Optimization (DPSCO) with heavy-tailed data. Specifically, we focus on the problem of Least Absolute Deviations, i.e., ℓ1-norm linear regression, in the ϵ-DP model. While most previous work focuses on the case where the loss function is Lipschitz, in this paper we only assume that the variates have bounded moments. First, we study the case where the ℓ2 norm of the data has a bounded second-order moment. We propose an algorithm based on the exponential mechanism and show that it is possible to achieve an upper bound of Õ([Formula presented]) (with high probability). Next, we relax the assumption to a bounded θ-th order moment for some θ ∈ (1, 2) and show that it is possible to achieve an upper bound of Õ([Formula presented]). Our algorithms can also be extended to the more relaxed setting where only each coordinate of the data has bounded moments, and we obtain upper bounds of Õ([Formula presented]) in the second- and θ-th-moment cases, respectively.
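To make the abstract's high-level recipe concrete, below is a minimal Python sketch of an exponential-mechanism selection step for least absolute deviations over a finite candidate set. It is an illustrative assumption, not the paper's actual construction or analysis: the clipping threshold `tau`, the candidate grid, and the resulting sensitivity bound are all chosen here only to show how clipping keeps the utility's sensitivity finite under heavy-tailed data.

```python
import numpy as np

def clipped_lad_loss(theta, X, y, tau):
    """Empirical least absolute deviations loss with residuals clipped at tau.

    Clipping bounds each sample's contribution by tau, which in turn bounds the
    sensitivity of the utility to a single record (heavy-tailed data would
    otherwise make the sensitivity unbounded). Illustrative choice, not the
    paper's exact truncation scheme.
    """
    residuals = np.abs(X @ theta - y)
    return np.mean(np.minimum(residuals, tau))

def exponential_mechanism_lad(X, y, candidates, epsilon, tau, rng=None):
    """Select a candidate parameter vector via the exponential mechanism.

    The utility of a candidate is its negative clipped LAD loss; replacing one
    of the n records changes the utility by at most tau / n, which is the
    sensitivity used in the selection probabilities.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(y)
    sensitivity = tau / n
    utilities = np.array([-clipped_lad_loss(th, X, y, tau) for th in candidates])
    scores = epsilon * utilities / (2 * sensitivity)
    scores -= scores.max()                      # stabilize the softmax
    probs = np.exp(scores)
    probs /= probs.sum()
    return candidates[rng.choice(len(candidates), p=probs)]

# Toy usage: a coarse grid of 1-D slopes as the candidate set, heavy-tailed noise.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 1))
y = 2.0 * X[:, 0] + rng.standard_t(df=2, size=500)
candidates = [np.array([s]) for s in np.linspace(-5.0, 5.0, 101)]
theta_hat = exponential_mechanism_lad(X, y, candidates, epsilon=1.0, tau=10.0)
print(theta_hat)
```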

Original language: English (US)
Article number: 115071
Journal: Theoretical Computer Science
Volume: 1030
State: Published - Mar 13, 2025

Keywords

  • Differential privacy
  • Regression
  • Robust estimation

ASJC Scopus subject areas

  • Theoretical Computer Science
  • General Computer Science
