TY - GEN
T1 - Variance Reduced Coordinate Descent with Acceleration: New Method With a Surprising Application to Finite-Sum Problems
AU - Hanzely, Filip
AU - Kovalev, Dmitry
AU - Richtárik, Peter
N1 - KAUST Repository Item: Exported on 2021-09-16
Acknowledgements: The authors would like to express their gratitude to Konstantin Mishchenko. In particular, Konstantin introduced us to the product space objective (13), and the idea that SAGA is a special case of SEGA was born during a discussion with him.
PY - 2020
Y1 - 2020
AB - We propose an accelerated version of stochastic variance reduced coordinate descent – ASVRCD. Like other variance reduced coordinate descent methods such as SEGA or SVRCD, our method can handle problems with a non-separable, non-smooth regularizer while accessing only a random block of partial derivatives in each iteration. However, ASVRCD incorporates Nesterov's momentum, which offers better iteration complexity guarantees than both SEGA and SVRCD. As a by-product of our theory, we show that a variant of Katyusha [1] is a special case of ASVRCD, recovering the optimal oracle complexity for the finite-sum objective.
UR - http://hdl.handle.net/10754/666019
UR - https://arxiv.org/pdf/2002.04670
M3 - Conference contribution
BT - International Conference on Machine Learning (ICML)
PB - arXiv
ER -