TY - JOUR
T1 - Hierarchical Infinite Divisibility for Multiscale Shrinkage
AU - Yuan, Xin
AU - Rao, Vinayak
AU - Han, Shaobo
AU - Carin, Lawrence
N1 - Generated from Scopus record by KAUST IRTS on 2021-02-09
PY - 2014/9/1
Y1 - 2014/9/1
N2 - A new shrinkage-based construction is developed for a compressible vector x ∈ ℝ^n, for cases in which the components of x are naturally associated with a tree structure. Important examples are when x corresponds to the coefficients of a wavelet or block-DCT representation of data. The method we consider in detail, and for which numerical results are presented, is based on the gamma distribution. The gamma distribution is a heavy-tailed distribution that is infinitely divisible, and these characteristics are leveraged within the model. We further demonstrate that the general framework is appropriate for many other types of infinitely divisible heavy-tailed distributions. Bayesian inference is carried out by approximating the posterior with samples from an MCMC algorithm, as well as by constructing a variational approximation to the posterior. We also consider expectation-maximization (EM) for a MAP (point) solution. State-of-the-art results are manifested for compressive sensing and denoising applications, the latter with spiky (non-Gaussian) noise.
AB - A new shrinkage-based construction is developed for a compressible vector x ∈ ℝ^n, for cases in which the components of x are naturally associated with a tree structure. Important examples are when x corresponds to the coefficients of a wavelet or block-DCT representation of data. The method we consider in detail, and for which numerical results are presented, is based on the gamma distribution. The gamma distribution is a heavy-tailed distribution that is infinitely divisible, and these characteristics are leveraged within the model. We further demonstrate that the general framework is appropriate for many other types of infinitely divisible heavy-tailed distributions. Bayesian inference is carried out by approximating the posterior with samples from an MCMC algorithm, as well as by constructing a variational approximation to the posterior. We also consider expectation-maximization (EM) for a MAP (point) solution. State-of-the-art results are manifested for compressive sensing and denoising applications, the latter with spiky (non-Gaussian) noise.
UR - http://ieeexplore.ieee.org/document/6847180/
UR - http://www.scopus.com/inward/record.url?scp=84939494755&partnerID=8YFLogxK
U2 - 10.1109/TSP.2014.2334557
DO - 10.1109/TSP.2014.2334557
M3 - Article
SN - 1053-587X
VL - 62
SP - 4363
EP - 4374
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
IS - 17
ER -