A multilevel approach for sequential inference on partially observed deterministic systems

Ajay Jasra, Kody J.H. Law, Yi Xu

Research output: Contribution to journal › Article › peer-review

Abstract

In this article we consider sequential inference on partially observed deterministic systems. Examples include inference on the expected position of a dynamical system with a random initial position, and Bayesian static parameter inference for unobserved partial differential equations (PDEs), both associated with sequentially observed real data. Such statistical models arise in a wide variety of real applications, including weather prediction. In many practical scenarios one must discretize the system, but even under such discretization it is not possible to compute the associated expected value (integral) required for inference. Such quantities are then approximated by Monte Carlo methods, and the associated cost to achieve a given level of error in this context can be substantially reduced by using multilevel Monte Carlo (MLMC). MLMC relies upon exact sampling of the model of interest, which is not always possible. We devise a multilevel sequential Monte Carlo (MLSMC) method, which does not require exact sampling, to leverage the MLMC approach. We prove, for some models with n data points, that to achieve a mean square error (MSE) in estimation of O(ɛ²) (for some 0 < ɛ < 1), our MLSMC method has a cost of O(n²ɛ⁻²), versus a cost of O(n²ɛ⁻³) for an SMC method that simply approximates the most precise discretization. This is illustrated on two numerical examples.
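To illustrate the multilevel idea the abstract builds on, the following is a minimal, generic MLMC telescoping-sum sketch in Python. It is not the paper's MLSMC algorithm (which avoids exact sampling via SMC); the helpers sample_coupled and toy_coupled are hypothetical and purely illustrative.

```python
import numpy as np

def mlmc_estimate(sample_coupled, num_levels, samples_per_level):
    """Generic multilevel Monte Carlo telescoping estimator.

    sample_coupled(l, n) must return two arrays of n coupled samples of the
    quantity of interest at discretization levels l and l-1 (at level 0 the
    coarse array is all zeros, so the first term is a plain MC estimate).
    """
    estimate = 0.0
    for l in range(num_levels + 1):
        fine, coarse = sample_coupled(l, samples_per_level[l])
        # Accumulate the mean of the level-l increment f(X^l) - f(X^{l-1}).
        estimate += np.mean(fine - coarse)
    return estimate


# Toy usage (hypothetical model): the "discretization" at level l adds a
# bias of 2**-l to a standard normal draw; shared randomness couples levels.
def toy_coupled(l, n):
    x = np.random.randn(n)
    fine = (x + 2.0 ** -l) ** 2
    coarse = (x + 2.0 ** -(l - 1)) ** 2 if l > 0 else np.zeros(n)
    return fine, coarse


if __name__ == "__main__":
    # Fewer samples are needed at finer (more expensive) levels, which is
    # where the MLMC cost savings over single-level Monte Carlo come from.
    budget = [4000 // (2 ** l) + 100 for l in range(6)]
    print(mlmc_estimate(toy_coupled, num_levels=5, samples_per_level=budget))
```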
Original language: English (US)
Journal: International Journal for Uncertainty Quantification
Volume: 9
Issue number: 4
DOIs
State: Published - Jan 1 2019
Externally published: Yes
