A Convergence Theory for SVGD in the Population Limit under Talagrand's Inequality T1

Adil Salim*, Lukang Sun, Peter Richtárik

*Corresponding author for this work

Research output: Contribution to conference › Paper › peer-review

9 Scopus citations

Abstract

Stein Variational Gradient Descent (SVGD) is an algorithm for sampling from a target density known only up to a multiplicative constant. Although SVGD is popular in practice, its theoretical study is limited to a few recent works. We study the convergence of SVGD in the population limit (i.e., with an infinite number of particles) for sampling from a non-logconcave target distribution satisfying Talagrand's inequality T1. We first establish the convergence of the algorithm. Then, we establish a dimension-dependent complexity bound in terms of the Kernelized Stein Discrepancy (KSD). Unlike existing works, we do not assume that the KSD is bounded along the trajectory of the algorithm. Our approach relies on interpreting SVGD as gradient descent over a space of probability measures.
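For readers unfamiliar with the algorithm the paper analyzes, the following is a minimal sketch of the standard finite-particle SVGD update of Liu & Wang (2016), which the paper studies in its population (infinite-particle) limit. It assumes an RBF kernel; the bandwidth `h`, step size, and function names (`rbf_kernel`, `svgd_step`) are illustrative choices, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    """RBF kernel K[i, j] = exp(-||x_i - x_j||^2 / h) and its gradient
    grad_K[i, j] = d k(x_i, x_j) / d x_i, for particles X of shape (n, d)."""
    diff = X[:, None, :] - X[None, :, :]          # (n, n, d) pairwise differences
    K = np.exp(-np.sum(diff ** 2, axis=-1) / h)   # (n, n) kernel matrix
    grad_K = -2.0 / h * diff * K[:, :, None]      # (n, n, d)
    return K, grad_K

def svgd_step(X, grad_log_p, step=1e-2, h=1.0):
    """One SVGD update x_i <- x_i + step * phi(x_i), with
    phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]."""
    n = X.shape[0]
    K, grad_K = rbf_kernel(X, h)
    # First term attracts particles toward high density; the second
    # (sum of kernel gradients) repels particles from one another.
    phi = (K @ grad_log_p(X) + grad_K.sum(axis=0)) / n
    return X + step * phi

# Usage: sample from a standard 2D Gaussian, for which grad log p(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) * 3.0   # deliberately over-dispersed initialization
for _ in range(500):
    X = svgd_step(X, lambda X: -X, step=0.1)
```

The paper's analysis views the `phi` direction above, in the limit of infinitely many particles, as a gradient of the KL divergence over a space of probability measures; the code only illustrates the finite-particle recursion.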

Original language: English (US)
Pages: 19139-19152
Number of pages: 14
State: Published - 2022
Event: 39th International Conference on Machine Learning, ICML 2022 - Baltimore, United States
Duration: Jul 17, 2022 - Jul 23, 2022

Conference

Conference: 39th International Conference on Machine Learning, ICML 2022
Country/Territory: United States
City: Baltimore
Period: 07/17/22 - 07/23/22

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
