TY - GEN
T1 - Unifying Framework for Accelerated Randomized Methods in Convex Optimization
AU - Dvurechensky, Pavel
AU - Gasnikov, Alexander
AU - Tyurin, Alexander
AU - Zholobov, Vladimir
N1 - KAUST Repository Item: Exported on 2023-09-05
Acknowledgements: The authors are very grateful to Yu. Nesterov and V. Spokoiny for fruitful discussions. Our interest in this field was initiated by the paper [48]. The research was supported by the Ministry of Science and Higher Education of the Russian Federation (Goszadaniye) No. 075-00337-20-03, project No. 0714-2020-0005.
PY - 2023/7/17
Y1 - 2023/7/17
N2 - In this paper, we consider smooth convex optimization problems with simple constraints and inexactness in the oracle information, such as the value, partial, or directional derivatives of the objective function. We introduce a unifying framework, which allows one to construct different types of accelerated randomized methods for such problems and to prove convergence rate theorems for them. We focus on accelerated random block-coordinate descent, accelerated random directional search, and an accelerated random derivative-free method, and, using our framework, provide their versions for problems with inexact oracle information. Our contribution also includes accelerated random block-coordinate descent with an inexact oracle and entropy proximal setup, as well as a derivative-free version of this method. Moreover, we present an extension of our framework to strongly convex optimization problems. We also discuss an extension to the case of an inexact model of the objective function.
AB - In this paper, we consider smooth convex optimization problems with simple constraints and inexactness in the oracle information, such as the value, partial, or directional derivatives of the objective function. We introduce a unifying framework, which allows one to construct different types of accelerated randomized methods for such problems and to prove convergence rate theorems for them. We focus on accelerated random block-coordinate descent, accelerated random directional search, and an accelerated random derivative-free method, and, using our framework, provide their versions for problems with inexact oracle information. Our contribution also includes accelerated random block-coordinate descent with an inexact oracle and entropy proximal setup, as well as a derivative-free version of this method. Moreover, we present an extension of our framework to strongly convex optimization problems. We also discuss an extension to the case of an inexact model of the objective function.
UR - http://hdl.handle.net/10754/694062
UR - https://link.springer.com/10.1007/978-3-031-30114-8_15
UR - http://www.scopus.com/inward/record.url?scp=85169059462&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-30114-8_15
DO - 10.1007/978-3-031-30114-8_15
M3 - Conference contribution
SN - 9783031301131
SP - 511
EP - 561
BT - Foundations of Modern Statistics
PB - Springer International Publishing
ER -