TY - JOUR
T1 - A multilevel approach for sequential inference on partially observed deterministic systems
AU - Jasra, Ajay
AU - Law, Kody J.H.
AU - Xu, Yi
N1 - Generated from Scopus record by KAUST IRTS on 2019-11-20
PY - 2019/1/1
Y1 - 2019/1/1
N2 - In this article we consider sequential inference on partially observed deterministic systems. Examples include inference on the expected position of a dynamical system with random initial position, and Bayesian static parameter inference for unobserved partial differential equations (PDEs), both associated with sequentially observed real data. Such statistical models are found in a wide variety of real applications, including weather prediction. In many practical scenarios one must discretize the system, but even under such discretization, it is not possible to compute the associated expected value (integral) required for inference. Such quantities are then approximated by Monte Carlo methods, and the associated cost to achieve a given level of error in this context can be substantially reduced by using multilevel Monte Carlo (MLMC). MLMC relies upon exact sampling of the model of interest, which is not always possible. We devise a sequential Monte Carlo (SMC) method, which does not require exact sampling, to leverage the MLMC method. We prove that, for some models with n data points, to achieve a mean square error (MSE) in estimation of O(ɛ²) (for some 0 < ɛ < 1), our MLSMC method has a cost of O(n²ɛ⁻²), versus a cost of O(n²ɛ⁻³) for an SMC method that approximates only the most precise discretization. This is illustrated on two numerical examples.
AB - In this article we consider sequential inference on partially observed deterministic systems. Examples include inference on the expected position of a dynamical system with random initial position, and Bayesian static parameter inference for unobserved partial differential equations (PDEs), both associated with sequentially observed real data. Such statistical models are found in a wide variety of real applications, including weather prediction. In many practical scenarios one must discretize the system, but even under such discretization, it is not possible to compute the associated expected value (integral) required for inference. Such quantities are then approximated by Monte Carlo methods, and the associated cost to achieve a given level of error in this context can be substantially reduced by using multilevel Monte Carlo (MLMC). MLMC relies upon exact sampling of the model of interest, which is not always possible. We devise a sequential Monte Carlo (SMC) method, which does not require exact sampling, to leverage the MLMC method. We prove that, for some models with n data points, to achieve a mean square error (MSE) in estimation of O(ɛ²) (for some 0 < ɛ < 1), our MLSMC method has a cost of O(n²ɛ⁻²), versus a cost of O(n²ɛ⁻³) for an SMC method that approximates only the most precise discretization. This is illustrated on two numerical examples.
UR - http://www.dl.begellhouse.com/journals/52034eb04b657aea,5a3895a14afb242f,6d73b999639a73f3.html
UR - http://www.scopus.com/inward/record.url?scp=85073424532&partnerID=8YFLogxK
U2 - 10.1615/Int.J.UncertaintyQuantification.2019027245
DO - 10.1615/Int.J.UncertaintyQuantification.2019027245
M3 - Article
SN - 2152-5099
VL - 9
JO - International Journal for Uncertainty Quantification
JF - International Journal for Uncertainty Quantification
IS - 4
ER -