Scalable Bayesian non-negative tensor factorization for massive count data

Changwei Hu, Piyush Rai, Changyou Chen, Matthew Harding, Lawrence Carin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

We present a Bayesian non-negative tensor factorization model for count-valued tensor data, and develop scalable inference algorithms (both batch and online) for dealing with massive tensors. Our generative model can handle overdispersed counts as well as infer the rank of the decomposition. Moreover, leveraging a reparameterization of the Poisson distribution as a multinomial facilitates conjugacy in the model and enables simple and efficient Gibbs sampling and variational Bayes (VB) inference updates, with a computational cost that depends only on the number of nonzeros in the tensor. The model also yields interpretable factors: each factor corresponds to a “topic”. We develop a set of online inference algorithms that allow the model to scale further to massive tensors for which batch inference methods may be infeasible. We apply our framework to diverse real-world applications, such as multiway topic modeling on a scientific publications database, analyzing a political science data set, and analyzing a massive household transactions data set.
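As an illustrative sketch of the Poisson-to-multinomial reparameterization the abstract refers to (the notation below is ours, not taken from the paper), consider a $K$-way count tensor with a rank-$R$ non-negative CP decomposition:

$$ y_{\mathbf{i}} \sim \mathrm{Poisson}\Big(\sum_{r=1}^{R} \prod_{k=1}^{K} u^{(k)}_{i_k r}\Big), \qquad \mathbf{i} = (i_1, \dots, i_K). $$

Introducing latent counts $y_{\mathbf{i}r} \sim \mathrm{Poisson}(\lambda_{\mathbf{i}r})$ with $\lambda_{\mathbf{i}r} = \prod_{k} u^{(k)}_{i_k r}$ and $y_{\mathbf{i}} = \sum_{r} y_{\mathbf{i}r}$, the latent counts conditioned on the observed count are multinomial:

$$ (y_{\mathbf{i}1}, \dots, y_{\mathbf{i}R}) \mid y_{\mathbf{i}} \sim \mathrm{Multinomial}\Big(y_{\mathbf{i}},\ \big(\tfrac{\lambda_{\mathbf{i}1}}{\sum_{r}\lambda_{\mathbf{i}r}}, \dots, \tfrac{\lambda_{\mathbf{i}R}}{\sum_{r}\lambda_{\mathbf{i}r}}\big)\Big). $$

With gamma priors on the factor entries this augmentation restores conjugacy, and since entries with $y_{\mathbf{i}} = 0$ contribute no latent counts, the Gibbs/VB updates need to touch only the nonzero entries of the tensor.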
Original language: English (US)
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Publisher: Springer
Pages: 53-70
Number of pages: 18
ISBN (Print): 9783319235240
DOIs
State: Published - Jan 1 2015
Externally published: Yes
