TY - JOUR
T1 - Annealing evolutionary stochastic approximation Monte Carlo for global optimization
AU - Liang, Faming
N1 - KAUST Repository Item: Exported on 2020-10-01
Acknowledged KAUST grant number(s): KUS-C1-016-04
Acknowledgements: The author's research was supported in part by the grant (DMS-0607755) made by the National Science Foundation and the award (KUS-C1-016-04) made by King Abdullah University of Science and Technology (KAUST). The author thanks the editor, the associate editor and the referees for their comments, which have led to significant improvement of this paper.
This publication acknowledges KAUST support, but has no KAUST affiliated authors.
PY - 2010/4/8
Y1 - 2010/4/8
N2 - In this paper, we propose a new algorithm, annealing evolutionary stochastic approximation Monte Carlo (AESAMC), as a general optimization technique, and study its convergence. AESAMC possesses a self-adjusting mechanism: its target distribution can be adapted at each iteration according to the current samples. Thus, AESAMC falls into the class of adaptive Monte Carlo methods. This mechanism also makes AESAMC less prone to becoming trapped in local energy minima than nonadaptive MCMC algorithms. Under mild conditions, we show that AESAMC can converge weakly toward a neighboring set of the global minima in the space of energy. AESAMC is tested on multiple optimization problems. The numerical results indicate that AESAMC can potentially outperform simulated annealing, the genetic algorithm, annealing stochastic approximation Monte Carlo, and some other metaheuristics in function optimization. © 2010 Springer Science+Business Media, LLC.
AB - In this paper, we propose a new algorithm, annealing evolutionary stochastic approximation Monte Carlo (AESAMC), as a general optimization technique, and study its convergence. AESAMC possesses a self-adjusting mechanism: its target distribution can be adapted at each iteration according to the current samples. Thus, AESAMC falls into the class of adaptive Monte Carlo methods. This mechanism also makes AESAMC less prone to becoming trapped in local energy minima than nonadaptive MCMC algorithms. Under mild conditions, we show that AESAMC can converge weakly toward a neighboring set of the global minima in the space of energy. AESAMC is tested on multiple optimization problems. The numerical results indicate that AESAMC can potentially outperform simulated annealing, the genetic algorithm, annealing stochastic approximation Monte Carlo, and some other metaheuristics in function optimization. © 2010 Springer Science+Business Media, LLC.
UR - http://hdl.handle.net/10754/597576
UR - http://link.springer.com/10.1007/s11222-010-9176-1
UR - http://www.scopus.com/inward/record.url?scp=79958195868&partnerID=8YFLogxK
U2 - 10.1007/s11222-010-9176-1
DO - 10.1007/s11222-010-9176-1
M3 - Article
SN - 0960-3174
VL - 21
SP - 375
EP - 393
JO - Statistics and Computing
JF - Statistics and Computing
IS - 3
ER -