SDNA: Stochastic dual Newton ascent for empirical risk minimization

Zheng Qu, Peter Richtárik, Martin Takáč, Olivier Fercoq

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We propose a new algorithm for minimizing regularized empirical loss: Stochastic Dual Newton Ascent (SDNA). Our method is dual in nature: in each iteration we update a random subset of the dual variables. However, unlike existing methods such as stochastic dual coordinate ascent (SDCA), SDNA is capable of utilizing all local curvature information contained in the examples, which leads to striking improvements in both theory and practice, sometimes by orders of magnitude. In the special case when an L2-regularizer is used in the primal, the dual problem is a concave quadratic maximization problem plus a separable term. In this regime, SDNA in each step solves a proximal subproblem involving a random principal submatrix of the Hessian of the quadratic function; whence the name of the method.
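To make the mechanism concrete, below is a minimal sketch of an SDNA-style iteration in the setting the abstract singles out: squared loss with an L2 regularizer, where the dual is a concave quadratic plus a separable smooth term, so the subproblem over a sampled coordinate subset reduces to a small linear solve with a random principal submatrix of the Hessian. This is an illustrative reconstruction under those assumptions, not the authors' implementation; the function name sdna_ridge, the minibatch size tau, and all defaults are hypothetical choices.

```python
import numpy as np

def sdna_ridge(A, b, lam, tau=8, iters=500, seed=0):
    """Minimal SDNA-style sketch for the dual of ridge regression.

    Primal:  min_w (1/2n) * sum_i (a_i^T w - b_i)^2 + (lam/2) ||w||^2
    Dual:    max_alpha  (1/n) b^T alpha - (1/2n) ||alpha||^2
                        - (1/(2 lam n^2)) ||A alpha||^2
    Each step maximizes the dual exactly over a random subset S of
    tau coordinates, i.e. solves a tau-by-tau linear system with the
    corresponding principal submatrix of the Hessian.
    """
    rng = np.random.default_rng(seed)
    d, n = A.shape  # columns of A are the examples a_1, ..., a_n
    # Hessian of the negated dual: M = (1/(lam n^2)) A^T A + (1/n) I
    M = (A.T @ A) / (lam * n**2) + np.eye(n) / n
    alpha = np.zeros(n)
    g = -b / n  # gradient of the negated dual at alpha = 0
    for _ in range(iters):
        S = rng.choice(n, size=tau, replace=False)
        # Exact minimizer over the coordinates in S: a Newton step
        # restricted to the sampled principal submatrix M[S, S].
        delta = np.linalg.solve(M[np.ix_(S, S)], g[S])
        alpha[S] -= delta
        g -= M[:, S] @ delta  # maintain the full gradient incrementally
    w = (A @ alpha) / (lam * n)  # primal point recovered from the dual
    return w, alpha

if __name__ == "__main__":
    # Tiny synthetic run, purely to show the call signature.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((20, 100))
    b = rng.standard_normal(100)
    w, alpha = sdna_ridge(A, b, lam=0.1)
```

Because the full gradient is maintained incrementally, each iteration costs one tau-by-tau solve plus a rank-tau gradient update; the subset solve is where the method exploits curvature that a coordinate-wise SDCA update ignores.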

Original language: English (US)
Title of host publication: 33rd International Conference on Machine Learning, ICML 2016
Editors: Kilian Q. Weinberger, Maria Florina Balcan
Publisher: International Machine Learning Society (IMLS)
Pages: 2707-2725
Number of pages: 19
ISBN (Electronic): 9781510829008
State: Published - 2016
Externally published: Yes
Event: 33rd International Conference on Machine Learning, ICML 2016, New York City, United States
Duration: Jun 19 2016 - Jun 24 2016

Publication series

Name: 33rd International Conference on Machine Learning, ICML 2016
Volume: 4

Other

Other: 33rd International Conference on Machine Learning, ICML 2016
Country/Territory: United States
City: New York City
Period: 06/19/16 - 06/24/16

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Computer Networks and Communications
