Server-Side Stepsizes and Sampling Without Replacement Provably Help in Federated Optimization

Grigory Malinovsky, Konstantin Mishchenko, Peter Richtárik

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

We present a theoretical study of server-side optimization in federated learning. Our results are the first to show that the widely popular heuristic of scaling the client updates with an extra parameter is very useful in the context of Federated Averaging (FedAvg) with local passes over the client data. Each local pass is performed without replacement using Random Reshuffling, which is a key reason we can show improved complexities. In particular, we prove that whenever the local stepsizes are small, and the update direction is given by FedAvg in conjunction with Random Reshuffling over all clients, one can take a big leap in the obtained direction and improve rates for convex, strongly convex, and non-convex objectives. Specifically, in the non-convex regime we improve the convergence rate from O(ϵ⁻³) to O(ϵ⁻²). This result is new even for Random Reshuffling performed on a single node. In contrast, if the local stepsizes are large, we prove that the noise of client sampling can be controlled by using a small server-side stepsize. To the best of our knowledge, this is the first time that local steps provably help to overcome the communication bottleneck. Together, our results on the advantages of large and small server-side stepsizes give a formal justification for the practice of adaptive server-side optimization in federated learning. Moreover, we consider a variant of our algorithm that supports partial client participation, which makes the method more practical.
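To illustrate the algorithmic structure the abstract describes, here is a minimal sketch of FedAvg where each client performs a local epoch via Random Reshuffling (sampling without replacement) and the server rescales the averaged client update with a separate server-side stepsize. This is not the authors' exact method or analysis setting; the toy least-squares data, the helper names (local_pass, fedavg_rr_round), and the stepsize values are all hypothetical choices made only for illustration.

```python
import numpy as np

# Hypothetical toy setup: each "client" holds a few (a_i, b_i) pairs, and the
# global objective is the average least-squares loss across all clients.
rng = np.random.default_rng(0)
num_clients, samples_per_client, dim = 4, 8, 5
clients = [
    (rng.normal(size=(samples_per_client, dim)),  # features A_m
     rng.normal(size=samples_per_client))         # targets b_m
    for _ in range(num_clients)
]

def local_pass(x, A, b, local_stepsize):
    """One local epoch of Random Reshuffling: visit every local sample exactly
    once, in a freshly drawn random order (sampling without replacement)."""
    x = x.copy()
    for i in rng.permutation(len(b)):
        grad_i = A[i] * (A[i] @ x - b[i])  # gradient of 0.5 * (a_i^T x - b_i)^2
        x -= local_stepsize * grad_i
    return x

def fedavg_rr_round(x, local_stepsize, server_stepsize):
    """One communication round: clients run local RR passes, and the server
    averages their net updates and rescales them by a server-side stepsize."""
    deltas = [local_pass(x, A, b, local_stepsize) - x for A, b in clients]
    return x + server_stepsize * np.mean(deltas, axis=0)

x = np.zeros(dim)
for _ in range(200):
    # A small local stepsize paired with a larger server-side step, loosely
    # mirroring the "small local / big server leap" regime discussed above.
    x = fedavg_rr_round(x, local_stepsize=0.01, server_stepsize=1.5)
```

Setting server_stepsize to 1.0 recovers plain FedAvg with Random Reshuffling; values other than 1.0 correspond to the server-side scaling whose benefits the paper analyzes.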

Original language: English (US)
Title of host publication: DistributedML 2023 - Proceedings of the 4th International Workshop on Distributed Machine Learning
Publisher: Association for Computing Machinery, Inc
Pages: 85-104
Number of pages: 20
ISBN (Electronic): 9798400704475
DOIs
State: Published - Dec 8 2023
Event: 4th International Workshop on Distributed Machine Learning, DistributedML 2023 - Paris, France
Duration: Dec 8 2023 → …

Publication series

Name: DistributedML 2023 - Proceedings of the 4th International Workshop on Distributed Machine Learning

Conference

Conference: 4th International Workshop on Distributed Machine Learning, DistributedML 2023
Country/Territory: France
City: Paris
Period: 12/8/23 → …

Keywords

  • distributed optimization
  • federated learning

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Computer Science Applications
  • Hardware and Architecture
