High Dimensional Inference with Random Maximum A-Posteriori Perturbations

Tamir Hazan, Francesco Orabona, Anand D. Sarwate, Subhransu Maji, Tommi S. Jaakkola

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

This paper presents a new approach, called perturb-max, for high-dimensional statistical inference in graphical models that is based on applying random perturbations followed by optimization. This framework injects randomness into maximum a-posteriori (MAP) predictors by randomly perturbing the potential function for the input. A classic result from extreme value statistics asserts that perturb-max operations generate unbiased samples from the Gibbs distribution using high-dimensional perturbations. Unfortunately, the computational cost of generating so many high-dimensional random variables can be prohibitive. However, when the perturbations are of low dimension, sampling the perturb-max prediction is as efficient as MAP optimization. This paper shows that the expected value of perturb-max inference with low-dimensional perturbations can be used sequentially to generate unbiased samples from the Gibbs distribution. Furthermore, the expected value of the maximal perturbations is a natural bound on the entropy of such perturb-max models. A measure concentration result for perturb-max values shows that the deviation of their sampled average from its expectation decays exponentially in the number of samples, allowing effective approximation of the expectation.
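The classic extreme-value result referenced in the abstract is often called the Gumbel-max trick: adding i.i.d. Gumbel noise to every potential value and taking the argmax yields an exact sample from the Gibbs distribution. A minimal NumPy sketch over a small discrete state space (the potential vector `theta` below is an illustrative assumption, not from the paper) is:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical potential function over a small discrete state space.
theta = np.array([1.0, 2.0, 0.5, -1.0])

def gibbs_probs(theta):
    """Exact Gibbs distribution p(x) proportional to exp(theta[x])."""
    z = np.exp(theta - theta.max())  # shift for numerical stability
    return z / z.sum()

def perturb_max_sample(theta, rng):
    """Perturb-max sample: add i.i.d. Gumbel noise to every potential
    and return the argmax (the Gumbel-max trick)."""
    gumbel = rng.gumbel(size=theta.shape)
    return int(np.argmax(theta + gumbel))

# Empirical frequencies of perturb-max samples match the Gibbs distribution.
n = 100_000
samples = [perturb_max_sample(theta, rng) for _ in range(n)]
counts = np.bincount(samples, minlength=theta.size)
print(counts / n)        # empirical frequencies
print(gibbs_probs(theta))  # exact Gibbs probabilities
```

Note that this direct version requires one Gumbel variable per joint configuration, which is exactly the exponential cost the paper addresses by using low-dimensional perturbations instead.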
Original language: English (US)
Title of host publication: IEEE Transactions on Information Theory
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 6539-6560
Number of pages: 22
State: Published - Oct 1 2019
Externally published: Yes

