Faster constrained linear regression via two-step preconditioning

Di Wang, Jinhui Xu

Research output: Contribution to journal › Article › peer-review



In this paper, we study the large-scale constrained linear regression problem and propose a two-step preconditioning method that builds on recent developments in random projection, sketching techniques, and convex optimization. Combining this method with (accelerated) mini-batch SGD, we obtain an approximate solution with lower time complexity than state-of-the-art techniques in the low-precision case. Our idea also extends to the high-precision case, yielding an alternative implementation of the Iterative Hessian Sketch (IHS) method with significantly improved time complexity. Experiments on benchmark and synthetic datasets suggest that our methods considerably outperform existing ones in both the low- and high-precision cases.
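The abstract does not give the algorithm's details, but the general idea of sketching-based preconditioning for least squares can be illustrated as follows. This is a minimal sketch assuming a plain Gaussian sketch and an unconstrained problem; the paper's actual two-step method, its handling of constraints, and its (accelerated) mini-batch SGD solver are not reproduced here, and the function name and parameters are hypothetical.

```python
import numpy as np

def sketch_precondition_lstsq(A, b, sketch_size=None, seed=None):
    """Illustrative sketching-based preconditioning for least squares.

    NOT the paper's algorithm: a generic sketch-and-precondition scheme
    with a dense Gaussian sketch (the paper may use faster transforms
    such as SRHT or sparse sketches) and a direct solve in place of SGD.
    """
    n, d = A.shape
    rng = np.random.default_rng(seed)
    m = sketch_size or 4 * d  # sketch size, a common rule of thumb

    # Step 1: sketch the data matrix, S A, with a random Gaussian S.
    S = rng.standard_normal((m, n)) / np.sqrt(m)
    SA = S @ A

    # Step 2: QR of the sketch gives a right preconditioner R, so that
    # A R^{-1} is well-conditioned with high probability.
    _, R = np.linalg.qr(SA)

    # Solve the preconditioned problem min ||(A R^{-1}) y - b||_2
    # (here with a direct solver; a first-order method such as
    # mini-batch SGD would converge quickly on this system).
    AR = np.linalg.solve(R.T, A.T).T  # A @ R^{-1} without forming R^{-1}
    y, *_ = np.linalg.lstsq(AR, b, rcond=None)
    return np.linalg.solve(R, y)  # map back: x = R^{-1} y
```

Because preconditioning only changes the parameterization, the returned solution coincides with the ordinary least-squares solution; the benefit is that iterative solvers applied to the preconditioned system need far fewer iterations.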
Original language: English (US)
Pages (from-to): 280-296
Number of pages: 17
State: Published - Oct 28 2019
Externally published: Yes


