Abstract
In this paper, we study the large-scale constrained linear regression problem and propose a two-step preconditioning method that builds on recent developments in random projection, sketching techniques, and convex optimization. Combining the method with (accelerated) mini-batch SGD, we can obtain an approximate solution with a time complexity lower than that of the state-of-the-art techniques in the low-precision case. Our idea also extends to the high-precision case, where it yields an alternative implementation of the Iterative Hessian Sketch (IHS) method with significantly improved time complexity. Experiments on benchmark and synthetic datasets suggest that our methods indeed outperform existing ones considerably in both the low- and high-precision cases.
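The two-step idea summarized above — first precondition the system via a random sketch, then run mini-batch SGD on the well-conditioned problem — can be illustrated for the unconstrained least-squares case. This is a minimal sketch assuming a plain Gaussian sketching matrix; the problem sizes, sketch dimension, step size, and batch size below are illustrative choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic overdetermined least-squares problem (illustrative data):
# minimize ||A x - b||_2 over x, with A deliberately ill-conditioned.
n, d = 2000, 20
A = rng.standard_normal((n, d)) * np.linspace(1, 100, d)
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Step 1: sketch-based preconditioning. Project A with a Gaussian sketch S,
# take a QR factorization of S A, and use R^{-1} as a right preconditioner;
# A R^{-1} is then well-conditioned with high probability.
m = 5 * d  # sketch size; the theory requires m = O(d)
S = rng.standard_normal((m, n)) / np.sqrt(m)
_, R = np.linalg.qr(S @ A)
A_pre = A @ np.linalg.inv(R)  # well-conditioned system in y = R x

# Step 2: mini-batch SGD on the preconditioned problem min_y ||A_pre y - b||_2.
y = np.zeros(d)
batch, lr = 64, 0.2
for _ in range(1000):
    idx = rng.integers(0, n, size=batch)
    # Unbiased estimate of the full gradient A_pre^T (A_pre y - b).
    g = (n / batch) * A_pre[idx].T @ (A_pre[idx] @ y - b[idx])
    y -= lr * g

x_hat = np.linalg.solve(R, y)  # map back to the original coordinates
```

Because `A_pre` has condition number close to 1, plain SGD with a constant step size converges quickly, whereas on the original ill-conditioned `A` it would crawl; this is the source of the complexity gains the abstract claims. The paper's actual method additionally handles constraints, which would require a projection step inside the SGD loop.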
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 280-296 |
| Number of pages | 17 |
| Journal | Neurocomputing |
| Volume | 364 |
| DOIs | |
| State | Published - Oct 28 2019 |
| Externally published | Yes |