TY - JOUR
T1 - Machine learning-based conditional mean filter: A generalization of the ensemble Kalman filter for nonlinear data assimilation
AU - Hoang, Truong-Vinh
AU - Krumscheid, Sebastian
AU - Matthies, Hermann G.
AU - Tempone, Raul
N1 - KAUST Repository Item: Exported on 2022-09-14
Acknowledged KAUST grant number(s): URF/1/2281-01-01, URF/1/2584-01-01
Acknowledgements: This publication was supported by funding from the Alexander von Humboldt Foundation and King Abdullah University of Science and Technology (KAUST) Office of Sponsored Research (OSR) under award numbers URF/1/2281-01-01 and URF/1/2584-01-01 in the KAUST Competitive Research Grants Programs, respectively. R. Tempone is a member of the KAUST SRI Center for Uncertainty Quantification in Computational Science and Engineering. Simulations were performed with computing resources granted by RWTH Aachen University under project rwth0632.
PY - 2022/8/1
Y1 - 2022/8/1
AB - This paper presents the machine learning-based ensemble conditional mean filter (ML-EnCMF), a filtering method based on the conditional mean filter (CMF) previously introduced in the literature. The updated mean of the CMF matches that of the posterior obtained by applying Bayes' rule to the filter's forecast distribution. Moreover, we show that the CMF's updated covariance coincides with the expected conditional covariance. Implementing the EnCMF requires computing the conditional mean (CM). A likelihood-based estimator is prone to significant errors for small ensemble sizes, causing filter divergence. We develop a systematic methodology for integrating machine learning into the EnCMF based on the CM's orthogonal projection property. First, we use a combination of an artificial neural network (ANN) and a linear function, derived from the ensemble Kalman filter (EnKF), to approximate the CM, enabling the ML-EnCMF to inherit the EnKF's advantages. Second, we apply a suitable variance reduction technique to reduce statistical errors when estimating the loss function. Finally, we propose a model selection procedure that selects, element-wise, the applied filter, i.e., either the EnKF or the ML-EnCMF, at each updating step. We demonstrate the performance of the ML-EnCMF using the Lorenz-63 and Lorenz-96 systems and show that the ML-EnCMF outperforms the EnKF and the likelihood-based EnCMF.
UR - http://hdl.handle.net/10754/669687
UR - https://www.aimsciences.org/article/doi/10.3934/fods.2022016
DO - 10.3934/fods.2022016
M3 - Article
SN - 2639-8001
JO - Foundations of Data Science
JF - Foundations of Data Science
ER -