A fast stochastic Riemannian eigensolver

Zhiqiang Xu, Yiping Ke, Xin Gao

Research output: Contribution to conference › Paper › peer-review

3 Scopus citations


We propose a fast stochastic Riemannian gradient eigensolver for a real symmetric matrix, and prove its local, eigengap-dependent, linear convergence. The fast convergence is achieved by deploying the variance reduction technique originally developed for strongly convex problems in Euclidean space. In this paper, the technique is generalized to Riemannian manifolds to solve the geodesically non-convex problem of finding a group of top eigenvectors of such a matrix. We first propose the general variance-reduced form of the stochastic Riemannian gradient, giving rise to the stochastic variance reduced Riemannian gradient method (SVRRG). It turns out that, in addition to Riemannian gradients and retraction operations, a vector transport operation is necessary. We then specialize the method to the problem in question, resulting in our SVRRGEIGS algorithm. We are among the first to propose and analyze the generalization of the stochastic variance reduced gradient (SVRG) to Riemannian manifolds. As an extension of the linearly convergent VR-PCA, it is significant and nontrivial for the proposed algorithm to theoretically achieve a further speedup and empirically make a difference, because it respects the inherent geometry of the problem.
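The paper itself provides no code here; as a rough illustrative sketch only (not the authors' SVRRGEIGS implementation), the SVRG-style variance reduction on the sphere can be written as follows for the top-1 eigenvector case. The function name `svrrg_top_eigvec`, the step size `eta`, the epoch lengths, and the use of tangent-space projection as both Riemannian gradient and vector transport are all assumptions made for this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: A = (1/n) X^T X is real and symmetric; we seek its top
# eigenvector by maximizing w^T A w over the unit sphere (a Riemannian manifold).
n, d = 500, 20
X = rng.standard_normal((n, d))
A = X.T @ X / n

def proj(w, v):
    # Orthogonal projection onto the tangent space of the sphere at w; it also
    # serves here as a simple vector transport between nearby tangent spaces.
    return v - (w @ v) * w

def retract(w):
    # Retraction back onto the sphere: renormalize.
    return w / np.linalg.norm(w)

def svrrg_top_eigvec(X, epochs=30, eta=0.02):
    n, d = X.shape
    w = retract(rng.standard_normal(d))
    for _ in range(epochs):
        w_snap = w.copy()
        full_grad = X.T @ (X @ w_snap) / n      # full Euclidean gradient A w_snap
        for _ in range(n):                      # one stochastic pass per epoch
            i = rng.integers(n)
            g = X[i] * (X[i] @ w)               # stochastic gradient at w
            g_snap = X[i] * (X[i] @ w_snap)     # same sample at the snapshot
            # Variance-reduced Riemannian direction: the snapshot correction is
            # transported (projected) into the tangent space at the current w.
            v = proj(w, g) - proj(w, g_snap - full_grad)
            w = retract(w + eta * v)            # ascent step followed by retraction
    return w

w = svrrg_top_eigvec(X)
top = np.linalg.eigh(A)[1][:, -1]               # reference top eigenvector
print(round(abs(w @ top), 4))                   # near 1.0 once converged
```

Note the role of `proj` in the update: without transporting `g_snap - full_grad` into the tangent space at the current iterate, the variance-reduction correction would not be a valid tangent direction, which is the geometric point the abstract makes about vector transport.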

Original language: English (US)
State: Published - 2017
Event: 33rd Conference on Uncertainty in Artificial Intelligence, UAI 2017 - Sydney, Australia
Duration: Aug 11, 2017 - Aug 15, 2017


Conference: 33rd Conference on Uncertainty in Artificial Intelligence, UAI 2017

ASJC Scopus subject areas

  • Artificial Intelligence
