Annealing evolutionary stochastic approximation Monte Carlo for global optimization

Faming Liang

Research output: Contribution to journal › Article › peer-review

14 Scopus citations


In this paper, we propose a new algorithm, the so-called annealing evolutionary stochastic approximation Monte Carlo (AESAMC) algorithm, as a general optimization technique, and study its convergence. AESAMC possesses a self-adjusting mechanism: its target distribution can be adapted at each iteration according to the current samples. Thus, AESAMC falls into the class of adaptive Monte Carlo methods. This mechanism also makes AESAMC less prone to becoming trapped in local energy minima than nonadaptive MCMC algorithms. Under mild conditions, we show that AESAMC can converge weakly toward a neighboring set of the global minima in the space of energy. AESAMC is tested on multiple optimization problems. The numerical results indicate that AESAMC can potentially outperform simulated annealing, the genetic algorithm, annealing stochastic approximation Monte Carlo, and some other metaheuristics in function optimization. © 2010 Springer Science+Business Media, LLC.
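The abstract does not spell out AESAMC's update rules, but it names simulated annealing as one of the baselines it is compared against. For context, a minimal simulated annealing sketch might look like the following; the double-well energy function, cooling schedule, and all tuning constants here are illustrative choices, not taken from the paper:

```python
import math
import random

def simulated_annealing(energy, x0, n_iter=20000, step=0.5,
                        t0=10.0, cooling=0.999, seed=0):
    """Minimize `energy` by simulated annealing with geometric cooling.

    Gaussian random-walk proposals are accepted with the Metropolis
    probability min(1, exp(-dE / T)); the temperature T shrinks by a
    constant factor each iteration, so uphill moves become rarer over time.
    """
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    t = t0
    for _ in range(n_iter):
        cand = x + rng.gauss(0.0, step)
        e_cand = energy(cand)
        # Always accept downhill moves; accept uphill moves with prob exp(-dE/T).
        if e_cand <= e or rng.random() < math.exp(-(e_cand - e) / t):
            x, e = cand, e_cand
            if e < best_e:
                best_x, best_e = x, e
        t *= cooling  # geometric cooling schedule
    return best_x, best_e

# Illustrative double-well energy with minima near x = +/-2.
x_star, e_star = simulated_annealing(lambda x: (x * x - 4.0) ** 2 + 0.1 * x, 3.0)
```

A fixed cooling schedule like this is exactly what nonadaptive methods rely on; AESAMC's distinguishing feature, per the abstract, is that its target distribution is instead adapted at each iteration from the current samples.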
Original language: English (US)
Pages (from-to): 375-393
Number of pages: 19
Journal: Statistics and Computing
Issue number: 3
State: Published - Apr 8 2010
Externally published: Yes
