MURANA: A Generic Framework for Stochastic Variance-Reduced Optimization

Laurent Condat, Peter Richtárik

Research output: Contribution to conference › Paper › peer-review

5 Scopus citations

Abstract

We propose a generic variance-reduced algorithm, which we call MUltiple RANdomized Algorithm (MURANA), for minimizing a sum of several smooth functions plus a regularizer, in a sequential or distributed manner. Our method is formulated with general stochastic operators, which allow us to model various strategies for reducing the computational complexity. For example, MURANA supports sparse activation of the gradients, and also reduction of the communication load via compression of the update vectors. This versatility allows MURANA to cover many existing randomization mechanisms within a unified framework, which also makes it possible to design new methods as special cases.
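The setting described in the abstract, minimizing an average of smooth functions plus a regularizer with variance-reduced stochastic gradients and compressed updates, can be illustrated with a minimal sketch. The snippet below is not the MURANA recursion from the paper; it combines a SAGA-style gradient table (a standard variance-reduction mechanism) with a hypothetical unbiased random-sparsification operator `rand_sparsify` standing in for the paper's general stochastic operators, on a least-squares objective with the regularizer omitted for simplicity.

```python
import numpy as np

# Illustrative sketch only: NOT the MURANA recursion from the paper.
# A SAGA-style gradient table (variance reduction) combined with a
# hypothetical unbiased sparsification operator, for minimizing
# f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2 (regularizer omitted).

rng = np.random.default_rng(0)

def rand_sparsify(v, p=0.5):
    """Unbiased compression: keep each coordinate w.p. p, rescale by 1/p."""
    mask = rng.random(v.shape) < p
    return np.where(mask, v / p, 0.0)

n, d = 20, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true  # consistent system, so the optimal loss is zero

def grad_i(x, i):
    """Gradient of the i-th component f_i(x) = 0.5 * (a_i^T x - b_i)^2."""
    return A[i] * (A[i] @ x - b[i])

x = np.zeros(d)
table = np.array([grad_i(x, i) for i in range(n)])  # stored past gradients
g_avg = table.mean(axis=0)
lr = 0.01

for _ in range(3000):
    i = rng.integers(n)
    g_new = grad_i(x, i)
    # SAGA estimator: unbiased; its variance vanishes as the iterates converge
    estimate = g_new - table[i] + g_avg
    x -= lr * rand_sparsify(estimate)  # compressed/sparsified update step
    g_avg += (g_new - table[i]) / n
    table[i] = g_new
```

Because both the SAGA estimator and the sparsification noise shrink to zero at the optimum, the iterates converge with a constant stepsize; this is the kind of interplay between variance reduction and randomized compression that the paper's general operator framework is designed to capture.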

Original language: English (US)
Pages: 81-96
Number of pages: 16
State: Published - 2022
Event: 3rd Annual Conference on Mathematical and Scientific Machine Learning, MSML 2022 - Beijing, China
Duration: Aug 15, 2022 - Aug 17, 2022

Keywords

  • communication
  • compression
  • convex optimization
  • distributed optimization
  • randomized algorithm
  • sampling
  • stochastic gradient
  • variance reduction

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
