Dictionary learning for noisy and incomplete hyperspectral images

Zhengming Xing, Mingyuan Zhou, Alexey Castrodad, Guillermo Sapiro, Lawrence Carin

Research output: Contribution to journal › Article › peer-review



We consider analysis of noisy and incomplete hyperspectral imagery, with the objective of removing the noise and inferring the missing data. The noise statistics may be wavelength dependent, and the fraction of data missing (at random) may be substantial, including potentially entire bands, offering the potential to significantly reduce the quantity of data that need be measured. To achieve this objective, the imagery is divided into contiguous three-dimensional (3D) spatio-spectral blocks of spatial dimension much less than the image dimension. It is assumed that each such 3D block may be represented as a linear combination of dictionary elements of the same dimension, plus noise, and the dictionary elements are learned in situ based on the observed data (no a priori training). The number of dictionary elements needed for representation of any particular block is typically small relative to the block dimensions, and all the image blocks are processed jointly ("collaboratively") to infer the underlying dictionary. We address dictionary learning from a Bayesian perspective, considering two distinct means of imposing sparse dictionary usage. These models allow inference of the number of dictionary elements needed as well as the underlying wavelength-dependent noise statistics. It is demonstrated that drawing the dictionary elements from a Gaussian process prior, imposing structure on the wavelength dependence of the dictionary elements, yields significant advantages, relative to the more conventional approach of using an independent and identically distributed Gaussian prior for the dictionary elements; this advantage is particularly evident in the presence of noise. The framework is demonstrated by processing hyperspectral imagery with a significant number of voxels missing uniformly at random, with imagery at specific wavelengths missing entirely, and in the presence of substantial additive noise. © 2012 Society for Industrial and Applied Mathematics.
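As a minimal illustration of the representation the abstract describes, the sketch below vectorizes contiguous 3D spatio-spectral blocks of a toy hyperspectral cube, drops voxels uniformly at random, and recovers each block as a linear combination of dictionary elements fit to the observed voxels only. The dictionary here is known and the fit is masked least squares; the sizes, the dictionary, and the noiseless setting are all assumptions for illustration, whereas the paper learns the dictionary in situ with Bayesian inference and handles wavelength-dependent noise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy hyperspectral cube: 8x8 spatial pixels, 16 wavelength bands,
# divided into 4x4 spatial blocks -> 3D blocks of 4*4*16 = 256 voxels.
# All sizes are illustrative, not taken from the paper.
nx, ny, nb, bs = 8, 8, 16, 4
K = 5  # hypothetical (small) number of dictionary elements per block

# Assumed ground-truth dictionary; each block is a linear combination
# of its columns, as in the paper's model (here without noise).
D = rng.standard_normal((bs * bs * nb, K))
blocks = []
for i in range(0, nx, bs):
    for j in range(0, ny, bs):
        w = rng.standard_normal(K)
        blocks.append(D @ w)
X = np.stack(blocks, axis=1)  # columns are vectorized 3D blocks

# Remove 40% of voxels uniformly at random.
mask = rng.random(X.shape) > 0.4

# Fit each block's weights from its observed voxels only
# (masked least squares stands in for the paper's Bayesian inference),
# then reconstruct the full block, filling in the missing voxels.
X_hat = np.empty_like(X)
for n in range(X.shape[1]):
    m = mask[:, n]
    w_hat, *_ = np.linalg.lstsq(D[m], X[m, n], rcond=None)
    X_hat[:, n] = D @ w_hat

err = np.linalg.norm(X_hat - X) / np.linalg.norm(X)
print(f"relative reconstruction error: {err:.2e}")
```

Because far more voxels are observed per block (~150) than there are dictionary elements (5), the noiseless blocks are recovered essentially exactly; with noise or entirely missing bands, the structured Gaussian-process prior on the dictionary elements is what the paper shows to be advantageous.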
Original language: English (US)
Pages (from-to): 33-56
Number of pages: 24
Journal: SIAM Journal on Imaging Sciences
Issue number: 1
State: Published - Feb 13 2012
Externally published: Yes

