Abstract
We consider nonconvex stochastic optimization problems in the asynchronous centralized distributed setup, where the communication times from workers to a server cannot be ignored and the computation and communication times are potentially different across workers. Using an unbiased compression technique, we develop a new method, Shadowheart SGD, that provably improves the time complexities of all previous centralized methods. Moreover, we show that the time complexity of Shadowheart SGD is optimal in the family of centralized methods with compressed communication. We also consider the bidirectional setup, where broadcasting from the server to the workers is non-negligible, and develop a corresponding method.
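The abstract relies on the notion of an unbiased compressor, i.e., a randomized map C with E[C(x)] = x. The paper does not fix a specific compressor here, so as a hedged illustration the sketch below uses Rand-K sparsification, a standard example of an unbiased compressor from the compressed-communication literature; the function name `rand_k` and the parameters are illustrative, not from the paper.

```python
import numpy as np

def rand_k(x, k, rng):
    """Rand-K sparsification: a standard unbiased compressor.

    Keeps k of the d coordinates of x (chosen uniformly at random,
    without replacement) and rescales them by d/k, which makes the
    compressor unbiased: E[rand_k(x)] = x.
    """
    d = x.size
    idx = rng.choice(d, size=k, replace=False)
    out = np.zeros_like(x, dtype=float)
    out[idx] = x[idx] * (d / k)
    return out

# Empirical check of unbiasedness: the average of many compressed
# copies of x should be close to x itself.
rng = np.random.default_rng(0)
x = np.arange(1.0, 11.0)  # d = 10
avg = np.mean([rand_k(x, k=3, rng=rng) for _ in range(200_000)], axis=0)
print(np.max(np.abs(avg - x)))  # small deviation, consistent with E[C(x)] = x
```

A worker would send only the k nonzero coordinates (and their indices), reducing the per-message communication cost, which is the mechanism such methods exploit when worker-to-server communication time is non-negligible.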
| Original language | English (US) |
| --- | --- |
| State | Published - 2024 |
| Event | 38th Conference on Neural Information Processing Systems, NeurIPS 2024 - Vancouver, Canada |
| Duration | Dec 9 2024 → Dec 15 2024 |
Conference
| Conference | 38th Conference on Neural Information Processing Systems, NeurIPS 2024 |
| --- | --- |
| Country/Territory | Canada |
| City | Vancouver |
| Period | 12/9/24 → 12/15/24 |
ASJC Scopus subject areas
- Computer Networks and Communications
- Information Systems
- Signal Processing