TY - JOUR
T1 - On the stability of sequential Monte Carlo methods in high dimensions
AU - Beskos, Alexandros
AU - Crisan, Dan
AU - Jasra, Ajay
N1 - Generated from Scopus record by KAUST IRTS on 2019-11-20
PY - 2014/1/1
Y1 - 2014/1/1
N2 - We investigate the stability of a Sequential Monte Carlo (SMC) method applied to the problem of sampling from a target distribution on R^d for large d. It is well known [Bengtsson, Bickel and Li, In Probability and Statistics: Essays in Honor of David A. Freedman, D. Nolan and T. Speed, eds. (2008) 316-334 IMS; see also Pushing the Limits of Contemporary Statistics (2008) 318-329 IMS, Mon. Weather Rev. 136 (2008) 4629-4640] that using a single importance sampling step, one produces an approximation for the target that deteriorates as the dimension d increases, unless the number of Monte Carlo samples N increases at an exponential rate in d. We show that this degeneracy can be avoided by introducing a sequence of artificial targets, starting from a "simple" density and moving to the one of interest, and using an SMC method to sample from the sequence; see, for example, Chopin [Biometrika 89 (2002) 539-551]; see also [J. R. Stat. Soc. Ser. B Stat. Methodol. 68 (2006) 411-436, Phys. Rev. Lett. 78 (1997) 2690-2693, Stat. Comput. 11 (2001) 125-139]. Using this class of SMC methods with a fixed number of samples, one can produce an approximation for which the effective sample size (ESS) converges to a random variable ε_N as d → ∞, with 1 < ε_N < N. The convergence is achieved with a computational cost proportional to Nd^2. If ε_N ≪ N, we can raise its value by introducing a number of resampling steps, say m (where m is independent of d). In this case, the ESS converges to a random variable ε_{N,m} as d → ∞, and lim_{m→∞} ε_{N,m} = N. We also show that the Monte Carlo error for estimating a fixed-dimensional marginal expectation is of order 1/√N uniformly in d. The results imply that, in high dimensions, SMC algorithms can efficiently control the variability of the importance sampling weights and estimate fixed-dimensional marginals at a cost that is less than exponential in d; they also indicate that resampling leads to a reduction in the Monte Carlo error and an increase in the ESS. All of our analysis is carried out under the assumption that the target density has an i.i.d. (product-form) structure. © 2014 Institute of Mathematical Statistics.
UR - http://projecteuclid.org/euclid.aoap/1400073653
UR - http://www.scopus.com/inward/record.url?scp=84901774280&partnerID=8YFLogxK
U2 - 10.1214/13-AAP951
DO - 10.1214/13-AAP951
M3 - Article
SN - 1050-5164
VL - 24
JO - Annals of Applied Probability
JF - Annals of Applied Probability
IS - 4
ER -
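
The record's abstract describes the stabilizing construction in words only: bridge from a simple density pi_0 to the target pi through a sequence of tempered intermediate targets, monitor the effective sample size (ESS) of the importance weights, and resample when it degrades. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation; the Gaussian choices of pi_0 and pi, the linear tempering schedule, the ESS threshold of 0.5*N and the random-walk Metropolis move are all illustrative assumptions.

import numpy as np

def log_pi0(x):
    # "simple" initial density: standard normal, i.i.d. over the d coordinates
    return -0.5 * np.sum(x ** 2, axis=1)

def log_pi(x):
    # illustrative target density: N(1, 0.5^2), i.i.d. over the d coordinates
    return -0.5 * np.sum(((x - 1.0) / 0.5) ** 2, axis=1)

def ess(logw):
    # effective sample size of normalized importance weights
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return 1.0 / np.sum(w ** 2)

def tempered_smc(N=500, d=50, n_steps=100, ess_frac=0.5, seed=0):
    rng = np.random.default_rng(seed)
    phis = np.linspace(0.0, 1.0, n_steps + 1)   # tempering schedule pi_0 -> pi
    x = rng.standard_normal((N, d))             # particles drawn from pi_0
    logw = np.zeros(N)
    for phi_prev, phi in zip(phis[:-1], phis[1:]):
        # incremental weight for moving from temperature phi_prev to phi
        logw += (phi - phi_prev) * (log_pi(x) - log_pi0(x))
        if ess(logw) < ess_frac * N:
            # multinomial resampling; resampled particles carry equal weights
            w = np.exp(logw - logw.max())
            w /= w.sum()
            idx = rng.choice(N, size=N, p=w)
            x, logw = x[idx], np.zeros(N)
        # one random-walk Metropolis move per particle, targeting the current
        # tempered density proportional to pi_0^(1-phi) * pi^phi
        prop = x + 0.3 * rng.standard_normal((N, d))
        log_acc = ((1.0 - phi) * (log_pi0(prop) - log_pi0(x))
                   + phi * (log_pi(prop) - log_pi(x)))
        accept = np.log(rng.uniform(size=N)) < log_acc
        x[accept] = prop[accept]
    return x, logw, ess(logw)

if __name__ == "__main__":
    particles, logw, final_ess = tempered_smc()
    print("final ESS: %.1f of N = 500" % final_ess)

Resetting the log-weights to zero after resampling reflects the equal weighting of the resampled particles; the ESS returned at the end is the quantity whose d → ∞ behaviour the paper analyses.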