TY - JOUR
T1 - Proximal Splitting Algorithms for Convex Optimization: A Tour of Recent Advances, with New Twists
AU - Condat, Laurent Pierre
AU - Kitahara, Daichi
AU - Contreras, Andrés
AU - Hirabayashi, Akira
N1 - Acknowledgements: The first author did part of this work during a stay at Ritsumeikan University in 2018, hosted by the second and fourth authors, thanks to a fellowship from the Japan Society for the Promotion of Science (JSPS), L17565. The third author contributed to this work during his visit to the first author at GIPSA-lab, Grenoble, France, in 2019. He was supported by the CMM-ANID PIA grant AFB170001.
PY - 2023/5/9
Y1 - 2023/5/9
AB - Convex nonsmooth optimization problems, whose solutions live in very high dimensional spaces, have become ubiquitous. To solve them, the class of first-order algorithms known as proximal splitting algorithms is particularly adequate: they consist of simple operations, handling the terms in the objective function separately. In this overview, we demystify a selection of recent proximal splitting algorithms: we present them within a unified framework, which consists in applying splitting methods for monotone inclusions in primal-dual product spaces, with well-chosen metrics. Along the way, we easily derive new variants of the algorithms and revisit existing convergence results, extending the parameter ranges in several cases. In particular, we emphasize that when the smooth term in the objective function is quadratic, e.g., for least-squares problems, convergence is guaranteed with larger values of the relaxation parameter than previously known. Such larger values are usually beneficial for the convergence speed in practice.
UR - http://hdl.handle.net/10754/660732
UR - https://epubs.siam.org/doi/10.1137/20M1379344
UR - http://www.scopus.com/inward/record.url?scp=85163557767&partnerID=8YFLogxK
DO - 10.1137/20M1379344
M3 - Article
SN - 0036-1445
VL - 65
SP - 375
EP - 435
JO - SIAM Review
JF - SIAM Review
IS - 2
ER -