TY - JOUR
T1 - A Monte Carlo Metropolis-Hastings Algorithm for Sampling from Distributions with Intractable Normalizing Constants
AU - Liang, Faming
AU - Jin, Ick-Hoon
N1 - KAUST Repository Item: Exported on 2020-10-01
Acknowledged KAUST grant number(s): KUS-C1-016-04
Acknowledgements: We thank the editor, associate editor, and two referees for their comments, which have led to significant improvement of this letter. F: L.'s research was partially supported by grants from the National Science Foundation (DMS-1007457 and DMS-1106494) and the award (KUS-C1-016-04) made by King Abdullah University of Science and Technology (KAUST).
This publication acknowledges KAUST support, but has no KAUST affiliated authors.
PY - 2013/8
Y1 - 2013/8
N2  - Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio by a Monte Carlo estimate in simulations, while still converging, as shown in the letter, to the desired target distribution under mild conditions. The MCMH algorithm is illustrated with spatial autologistic models and exponential random graph models. Unlike other auxiliary variable Markov chain Monte Carlo (MCMC) algorithms, such as the Møller and exchange algorithms, the MCMH algorithm avoids the requirement for perfect sampling, and thus can be applied to many statistical models for which perfect sampling is not available or very expensive. The MCMH algorithm can also be applied to Bayesian inference for random effect models and missing data problems that involve simulations from a distribution with intractable integrals. © 2013 Massachusetts Institute of Technology.
AB  - Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio by a Monte Carlo estimate in simulations, while still converging, as shown in the letter, to the desired target distribution under mild conditions. The MCMH algorithm is illustrated with spatial autologistic models and exponential random graph models. Unlike other auxiliary variable Markov chain Monte Carlo (MCMC) algorithms, such as the Møller and exchange algorithms, the MCMH algorithm avoids the requirement for perfect sampling, and thus can be applied to many statistical models for which perfect sampling is not available or very expensive. The MCMH algorithm can also be applied to Bayesian inference for random effect models and missing data problems that involve simulations from a distribution with intractable integrals. © 2013 Massachusetts Institute of Technology.
UR - http://hdl.handle.net/10754/597314
UR - http://www.mitpressjournals.org/doi/10.1162/NECO_a_00466
UR - http://www.scopus.com/inward/record.url?scp=84880992075&partnerID=8YFLogxK
U2 - 10.1162/NECO_a_00466
DO - 10.1162/NECO_a_00466
M3 - Article
C2 - 23607562
SN - 0899-7667
VL - 25
SP - 2199
EP - 2234
JO - Neural Computation
JF - Neural Computation
IS - 8
ER -
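
The abstract describes the core MCMH idea: plug a Monte Carlo estimate of the unknown normalizing-constant ratio into the Metropolis-Hastings acceptance probability. The sketch below illustrates that idea on a hypothetical toy one-parameter exponential family f(x | theta) ∝ exp(theta*x - x^4), whose normalizing constant Z(theta) has no closed form; the model, prior, function names, and tuning constants are all illustrative assumptions, not material from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Toy intractable model (assumption, not from the paper):
# f(x | theta) = exp(theta*x - x**4) / Z(theta), Z(theta) has no closed form.
def log_unnormalized(x, theta):
    return theta * x - x**4

def sample_aux(theta, m, burn=200, thin=5, step=0.8):
    """Approximate draws from f(. | theta) via a short random-walk Metropolis chain."""
    x, draws = 0.0, []
    for t in range(burn + m * thin):
        prop = x + step * rng.normal()
        if np.log(rng.uniform()) < log_unnormalized(prop, theta) - log_unnormalized(x, theta):
            x = prop
        if t >= burn and (t - burn) % thin == 0:
            draws.append(x)
    return np.array(draws[:m])

def mcmh(x_obs, n_iter=2000, m=50, prop_sd=0.3):
    """MCMH-style sampler for the posterior of theta under a N(0, 1) prior."""
    theta, chain = 0.0, []
    for _ in range(n_iter):
        theta_new = theta + prop_sd * rng.normal()
        # Monte Carlo estimate of Z(theta_new)/Z(theta) using auxiliary draws
        # from f(. | theta): E_theta[exp((theta_new - theta) * x)].
        y = sample_aux(theta, m)
        log_ratio_Z = np.log(np.mean(np.exp((theta_new - theta) * y)))
        # Metropolis-Hastings acceptance with the estimated ratio plugged in.
        log_prior_diff = -0.5 * (theta_new**2 - theta**2)
        log_lik_diff = (theta_new - theta) * np.sum(x_obs) - len(x_obs) * log_ratio_Z
        if np.log(rng.uniform()) < log_prior_diff + log_lik_diff:
            theta = theta_new
        chain.append(theta)
    return np.array(chain)

# Example usage with made-up data:
# x_obs = np.array([0.5, -0.2, 0.8])
# chain = mcmh(x_obs)
# print(chain[1000:].mean())

Note that the auxiliary draws come from an ordinary (imperfect) MCMC chain rather than a perfect sampler, which reflects the advantage the abstract claims over the Møller and exchange algorithms; the specific importance-sampling form of the ratio estimate here is an assumption for illustration.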