## Abstract

In this work we study the parallel coordinate descent method (PCDM) proposed by Richtárik and Takáč [Parallel coordinate descent methods for big data optimization, Math. Program. Ser. A (2015), pp. 1–52] for minimizing a regularized convex function. We adopt elements from the work of Lu and Xiao [On the complexity analysis of randomized block-coordinate descent methods, Math. Program. Ser. A 152(1–2) (2015), pp. 615–642] and combine them with several new insights to obtain sharper iteration complexity results for PCDM than those originally presented by Richtárik and Takáč. Moreover, we show that PCDM is monotonic in expectation, a property not established in the original work, and we derive the first high-probability iteration complexity result for the case where the initial level set is unbounded.
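To make the setting concrete, the following is a minimal, illustrative sketch of a parallel coordinate descent iteration on a composite objective f(x) = ½‖Ax − b‖² + λ‖x‖₁. It is not the paper's exact PCDM (the sampling scheme, the step-size parameter `beta`, and the serial simulation of the parallel update are simplifying assumptions); it only shows the basic mechanism of updating a random subset of coordinates per iteration with a prox-gradient step.

```python
import numpy as np

def pcdm_sketch(A, b, lam, tau=4, beta=None, iters=200, seed=0):
    """Toy parallel coordinate descent on 0.5*||Ax-b||^2 + lam*||x||_1.

    Each iteration picks a random subset of tau coordinates (simulating
    tau parallel processors) and applies a prox-gradient (soft-threshold)
    step to each. The parameter beta inflates the step denominator to
    compensate for coupling between simultaneously updated coordinates;
    beta = tau is a conservative, always-safe assumption.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    L = (A ** 2).sum(axis=0)          # per-coordinate Lipschitz constants
    if beta is None:
        beta = tau
    x = np.zeros(n)
    for _ in range(iters):
        S = rng.choice(n, size=tau, replace=False)  # random coordinate block
        g = A.T @ (A @ x - b)                       # gradient (full, for clarity)
        for i in S:
            step = 1.0 / (beta * L[i])
            z = x[i] - step * g[i]
            # soft-thresholding = prox of the l1 regularizer
            x[i] = np.sign(z) * max(abs(z) - step * lam, 0.0)
    return x
```

In the serial special case (tau = 1, beta = 1) this reduces to standard randomized coordinate descent; larger tau models more processors, at the price of a smaller per-coordinate step.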

| Original language | English (US) |
|---|---|
| Pages (from-to) | 372-395 |
| Number of pages | 24 |
| Journal | Optimization Methods and Software |
| Volume | 33 |
| Issue number | 2 |
| DOIs | |
| State | Published - Mar 4 2018 |

## Keywords

- block coordinate descent
- composite minimization
- convex optimization
- iteration complexity
- monotonic algorithm
- parallelization
- rate of convergence
- unbounded level set

## ASJC Scopus subject areas

- Software
- Control and Optimization
- Applied Mathematics