TY - JOUR
T1 - MAP estimators and their consistency in Bayesian nonparametric inverse problems
AU - Dashti, M.
AU - Law, K. J. H.
AU - Stuart, A. M.
AU - Voss, J.
N1 - KAUST Repository Item: Exported on 2020-10-01
PY - 2013/9/2
Y1 - 2013/9/2
N2 - We consider the inverse problem of estimating an unknown function u from noisy measurements y of a known, possibly nonlinear, map applied to u. We adopt a Bayesian approach to the problem and work in a setting where the prior measure is specified as a Gaussian random field μ0. We work under a natural set of conditions on the likelihood which implies the existence of a well-posed posterior measure, μy. Under these conditions, we show that the maximum a posteriori (MAP) estimator is well defined as the minimizer of an Onsager-Machlup functional defined on the Cameron-Martin space of the prior; thus, we link a problem in probability with a problem in the calculus of variations. We then consider the case where the observational noise vanishes and establish a form of Bayesian posterior consistency for the MAP estimator. We also prove a similar result for the case where the observation of u can be repeated as many times as desired with independent identically distributed noise. The theory is illustrated with examples from an inverse problem for the Navier-Stokes equation, motivated by problems arising in weather forecasting, and from the theory of conditioned diffusions, motivated by problems arising in molecular dynamics. © 2013 IOP Publishing Ltd.
AB - We consider the inverse problem of estimating an unknown function u from noisy measurements y of a known, possibly nonlinear, map applied to u. We adopt a Bayesian approach to the problem and work in a setting where the prior measure is specified as a Gaussian random field μ0. We work under a natural set of conditions on the likelihood which implies the existence of a well-posed posterior measure, μy. Under these conditions, we show that the maximum a posteriori (MAP) estimator is well defined as the minimizer of an Onsager-Machlup functional defined on the Cameron-Martin space of the prior; thus, we link a problem in probability with a problem in the calculus of variations. We then consider the case where the observational noise vanishes and establish a form of Bayesian posterior consistency for the MAP estimator. We also prove a similar result for the case where the observation of u can be repeated as many times as desired with independent identically distributed noise. The theory is illustrated with examples from an inverse problem for the Navier-Stokes equation, motivated by problems arising in weather forecasting, and from the theory of conditioned diffusions, motivated by problems arising in molecular dynamics. © 2013 IOP Publishing Ltd.
UR - http://hdl.handle.net/10754/594280
UR - https://iopscience.iop.org/article/10.1088/0266-5611/29/9/095017
UR - http://www.scopus.com/inward/record.url?scp=84884126479&partnerID=8YFLogxK
U2 - 10.1088/0266-5611/29/9/095017
DO - 10.1088/0266-5611/29/9/095017
M3 - Article
SN - 0266-5611
VL - 29
SP - 095017
JO - Inverse Problems
JF - Inverse Problems
IS - 9
ER -