TY - JOUR
T1 - Hierarchical-block conditioning approximations for high-dimensional multivariate normal probabilities
AU - Cao, Jian
AU - Genton, Marc G.
AU - Keyes, David E.
AU - Turkiyyah, George
N1 - Acknowledgements: This research was supported by King Abdullah University of Science and Technology (KAUST).
PY - 2018/7/30
Y1 - 2018/7/30
AB - This paper presents a new method to estimate large-scale multivariate normal probabilities. The approach combines a hierarchical representation with processing of the covariance matrix that decomposes the n-dimensional problem into a sequence of smaller m-dimensional ones. It also includes a d-dimensional conditioning method that further decomposes the m-dimensional problems into smaller d-dimensional problems. The resulting two-level hierarchical-block conditioning method requires Monte Carlo simulations to be performed only in d dimensions, with d ≪ n, and allows the complexity of the algorithm’s major cost to be O(n log n). The run-time cost of the method depends on two parameters, m and d, where m represents the diagonal block size and controls the sizes of the blocks of the covariance matrix that are replaced by low-rank approximations, and d allows a trade-off of accuracy for expensive computations in the evaluation of the probabilities of m-dimensional blocks. We also introduce an inexpensive block reordering strategy to provide improved accuracy in the overall probability computation. The downside of this method, as with other such conditioning approximations, is the absence of an internal estimate of its error to use in tuning the approximation. Numerical simulations on problems from 2D spatial statistics with dimensions up to 16,384 indicate that the algorithm achieves a 1% error level and improves the run time over a one-level hierarchical quasi-Monte Carlo method by a factor between 10 and 15.
UR - http://hdl.handle.net/10754/630440
UR - https://link.springer.com/article/10.1007%2Fs11222-018-9825-3
UR - http://www.scopus.com/inward/record.url?scp=85050991526&partnerID=8YFLogxK
U2 - 10.1007/s11222-018-9825-3
DO - 10.1007/s11222-018-9825-3
M3 - Article
AN - SCOPUS:85050991526
SN - 0960-3174
VL - 29
SP - 585
EP - 598
JO - Statistics and Computing
JF - Statistics and Computing
IS - 3
ER -