Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities

Frank Nielsen, Ke Sun

Research output: Contribution to journal › Article › peer-review

55 Scopus citations

Abstract

Information-theoretic measures, such as the entropy, the cross-entropy and the Kullback-Leibler divergence between two mixture models, are core primitives in many signal processing tasks. Since the Kullback-Leibler divergence of mixtures provably does not admit a closed-form formula, it is in practice either estimated using costly Monte Carlo stochastic integration, or approximated or bounded using various techniques. We present a fast and generic method that algorithmically builds closed-form lower and upper bounds on the entropy, the cross-entropy, the Kullback-Leibler and the α-divergences of mixtures. We illustrate this versatile method by reporting our experiments on approximating the Kullback-Leibler and the α-divergences between univariate exponential mixtures, Gaussian mixtures, Rayleigh mixtures and Gamma mixtures.
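The bounds in the paper are built from piecewise refinements of the log-sum-exp inequality. As a minimal sketch (not the paper's piecewise construction), the code below applies only the basic inequality max_i a_i ≤ log Σ_i exp(a_i) ≤ max_i a_i + log k to bound the log-density of a univariate Gaussian mixture; the mixture weights and parameters are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm
from scipy.special import logsumexp

# Illustrative two-component univariate Gaussian mixture (assumed values).
weights = np.array([0.3, 0.7])
means = np.array([-1.0, 2.0])
stds = np.array([0.5, 1.5])

def log_mixture_bounds(x):
    """Return (lower, exact, upper) for log p(x) of the mixture.

    With a_i = log w_i + log p_i(x), the log-sum-exp inequality gives
    max_i a_i <= log sum_i exp(a_i) <= max_i a_i + log k.
    """
    a = np.log(weights) + norm.logpdf(x, loc=means, scale=stds)
    exact = logsumexp(a)            # log of the mixture density
    lower = np.max(a)               # lse >= max
    upper = lower + np.log(len(weights))  # lse <= max + log k
    return lower, exact, upper

for x in (-1.0, 0.5, 3.0):
    lo, ex, up = log_mixture_bounds(x)
    print(f"x={x:+.1f}: {lo:.4f} <= {ex:.4f} <= {up:.4f}")
```

These coarse bounds hold everywhere but can be loose; the paper's contribution is to tighten them by splitting the domain into pieces on which the dominating component is known.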
Original language: English (US)
Pages (from-to): 442
Journal: Entropy
Volume: 18
Issue number: 12
DOIs
State: Published - Dec 9 2016

