Abstract
We propose a fast stochastic Riemannian gradient eigensolver for a real symmetric matrix, and prove its local, eigengap-dependent, linear convergence. The speedup comes from a variance reduction technique originally developed for strongly convex problems in Euclidean space. In this paper, we generalize this technique to Riemannian manifolds to solve the geodesically non-convex problem of finding a set of top eigenvectors of such a matrix. We first derive a general variance-reduced form of the stochastic Riemannian gradient, giving rise to the stochastic variance reduced Riemannian gradient method (SVRRG). It turns out that, in addition to Riemannian gradients and retraction operations, a vector transport operation is necessary. We then specialize this method to the problem in question, resulting in our SVRRG-EIGS algorithm. We are among the first to propose and analyze a generalization of the stochastic variance reduced gradient (SVRG) to Riemannian manifolds. As an extension of the linearly convergent VR-PCA, it is significant and nontrivial for the proposed algorithm to achieve a further theoretical speedup and an empirical performance gain, which it does by respecting the inherent geometry of the problem.
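For concreteness, the following is a minimal sketch of the variance-reduced update the abstract describes, specialized to the top eigenvector (the single-eigenvector case) on the unit sphere. The direction takes the SVRG-style form v = g_i(x) - Gamma(g_i(x_snap)) + Gamma(grad f(x_snap)), where g_i is a stochastic Riemannian gradient, x_snap is a snapshot iterate, and Gamma is a vector transport into the tangent space at x. The projection-based transport, normalization retraction, function names, and step size below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def proj_tangent(x, v):
    """Project v onto the tangent space of the unit sphere at x."""
    return v - x * (x @ v)

def svrrg_top_eigvec(A_cols, x0, eta=0.05, epochs=20, m=None, seed=0):
    """Hypothetical sketch of an SVRRG-style loop for the top eigenvector
    of A = (1/n) * sum_i a_i a_i^T, maximizing x^T A x on the unit sphere.
    A_cols is a d-by-n array whose columns are the samples a_i."""
    rng = np.random.default_rng(seed)
    n = A_cols.shape[1]
    m = m or n
    x = x0 / np.linalg.norm(x0)
    for _ in range(epochs):
        x_snap = x.copy()
        # Full Riemannian gradient at the snapshot (ascent direction).
        full = proj_tangent(x_snap, A_cols @ (A_cols.T @ x_snap) / n)
        for _ in range(m):
            i = rng.integers(n)
            a = A_cols[:, i]
            # Stochastic Riemannian gradients at x and at the snapshot.
            g = proj_tangent(x, a * (a @ x))
            g_snap = proj_tangent(x_snap, a * (a @ x_snap))
            # Transport snapshot terms to T_x via projection (a valid
            # vector transport on the sphere), then form the SVRG-style
            # variance-reduced direction.
            v = g - proj_tangent(x, g_snap) + proj_tangent(x, full)
            # Retraction: take a step and renormalize onto the sphere.
            x = x + eta * v
            x = x / np.linalg.norm(x)
    return x

# Example usage: recover the leading eigenvector of a random covariance.
rng = np.random.default_rng(1)
A_cols = rng.standard_normal((50, 500))
x = svrrg_top_eigvec(A_cols, rng.standard_normal(50))
w, V = np.linalg.eigh(A_cols @ A_cols.T / 500)
print(abs(x @ V[:, -1]))  # approaches 1 as the iterate converges
```

The projection transport is a deliberate simplification here: on the sphere it is cheap and still a valid vector transport, and the full gradient is computed only once per epoch at the snapshot, which is what makes the variance-reduced inner loop inexpensive.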
| Original language | English (US) |
| --- | --- |
| State | Published - 2017 |
| Event | 33rd Conference on Uncertainty in Artificial Intelligence, UAI 2017 - Sydney, Australia. Duration: Aug 11 2017 → Aug 15 2017 |
Conference
| Conference | 33rd Conference on Uncertainty in Artificial Intelligence, UAI 2017 |
| --- | --- |
| Country/Territory | Australia |
| City | Sydney |
| Period | 08/11/17 → 08/15/17 |
ASJC Scopus subject areas
- Artificial Intelligence