Neural Methods for Amortized Inference

Andrew Zammit-Mangion, Matthew Sainsbury-Dale, Raphaël Huser

Research output: Contribution to journal › Review article › peer-review

3 Scopus citations

Abstract

Simulation-based methods for statistical inference have evolved dramatically over the past 50 years, keeping pace with technological advancements. The field is undergoing a new revolution as it embraces the representational capacity of neural networks, optimization libraries, and graphics processing units for learning complex mappings between data and inferential targets. The resulting tools are amortized, in the sense that, after an initial setup cost, they allow rapid inference through fast feed-forward operations. In this article we review recent progress in the context of point estimation, approximate Bayesian inference, summary-statistic construction, and likelihood approximation. We also cover software and include a simple illustration to showcase the wide array of tools available for amortized inference and the benefits they offer over Markov chain Monte Carlo methods. The article concludes with an overview of relevant topics and an outlook on future research directions.
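To illustrate the idea of amortization described in the abstract, the following is a minimal sketch (not the authors' implementation, and not their accompanying software): pairs (theta, data) are simulated from the prior and the model, a small network is trained once to map data to theta under squared-error loss (whose minimizer is the posterior mean), and inference on new data then reduces to a single feed-forward pass. The model (a normal mean with unit variance), the uniform prior, and all network and training settings below are illustrative choices; practical estimators for exchangeable data would also use permutation-invariant architectures rather than the plain multilayer perceptron used here.

    # Minimal sketch of amortized neural point estimation (illustrative only).
    import jax
    import jax.numpy as jnp

    key = jax.random.PRNGKey(0)
    n = 30          # sample size per simulated dataset (illustrative)
    hidden = 64     # hidden-layer width (illustrative)

    def simulate(key, m):
        """Draw m parameters from the prior and one dataset per parameter."""
        k1, k2 = jax.random.split(key)
        theta = jax.random.uniform(k1, (m, 1), minval=-3.0, maxval=3.0)  # prior
        data = theta + jax.random.normal(k2, (m, n))                     # N(theta, 1)
        return theta, data

    def init_params(key):
        k1, k2 = jax.random.split(key)
        return {"W1": 0.1 * jax.random.normal(k1, (n, hidden)),
                "b1": jnp.zeros(hidden),
                "W2": 0.1 * jax.random.normal(k2, (hidden, 1)),
                "b2": jnp.zeros(1)}

    def net(params, x):
        """A plain one-hidden-layer network mapping a dataset to an estimate."""
        h = jnp.tanh(x @ params["W1"] + params["b1"])
        return h @ params["W2"] + params["b2"]

    def loss(params, theta, data):
        # Squared-error loss: its minimizer is the posterior mean.
        return jnp.mean((net(params, data) - theta) ** 2)

    @jax.jit
    def step(params, theta, data, lr=0.05):
        grads = jax.grad(loss)(params, theta, data)
        return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

    # Amortization: the training loop is a one-off setup cost.
    params = init_params(key)
    for _ in range(3000):
        key, sub = jax.random.split(key)
        theta, data = simulate(sub, 128)
        params = step(params, theta, data)

    # Inference on new data is now a single fast feed-forward pass (no MCMC).
    key, sub = jax.random.split(key)
    y_obs = 1.5 + jax.random.normal(sub, (1, n))
    print("estimate:", float(net(params, y_obs)[0, 0]))

The one-off training cost is incurred before any data are observed; every subsequent dataset of the same size is handled by the forward pass alone, which is the benefit over per-dataset Markov chain Monte Carlo that the abstract highlights.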

Original language: English (US)
Pages (from-to): 311-335
Number of pages: 25
Journal: Annual Review of Statistics and Its Application
Volume: 12
Issue number: 1
DOIs
State: Published - Mar 7, 2025

Keywords

  • approximate Bayesian inference
  • likelihood approximation
  • likelihood-free inference
  • neural networks
  • simulation-based inference
  • variational Bayes

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
