Multi-task classification with infinite local experts

Chunping Wang, Qi An, Lawrence Carin, David B. Dunson

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

We propose a multi-task learning (MTL) framework for nonlinear classification, based on an infinite set of local experts in feature space. The use of local experts enables sharing at the expert level, encouraging the borrowing of information even when tasks are similar only in subregions of feature space. A kernel stick-breaking process (KSBP) prior is imposed on the underlying distribution of class labels, so that the number of experts is inferred in the posterior and model selection issues are thus avoided. MTL is implemented by imposing a Dirichlet process (DP) prior on a layer above the task-dependent KSBPs. ©2009 IEEE.
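The covariate-dependent weighting that the abstract describes can be made concrete with a small sketch. The following is a hypothetical Python illustration (not code from the paper) of truncated kernel stick-breaking weights: each expert h has a location Γ_h, a stick length V_h ~ Beta(1, λ), and a kernel K bounded in [0, 1], so that π_h(x) = V_h K(x, Γ_h) ∏_{l<h} (1 − V_l K(x, Γ_l)). The Gaussian kernel, the truncation level H, and all variable names are assumptions made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def ksbp_weights(x, locations, sticks, psi):
    """Truncated kernel stick-breaking weights at covariate x (illustrative sketch)."""
    # Gaussian kernel K(x, Gamma_h) = exp(-psi * ||x - Gamma_h||^2), bounded in (0, 1].
    k = np.exp(-psi * np.sum((locations - x) ** 2, axis=1))
    broken = sticks * k  # V_h * K(x, Gamma_h)
    # Stick-breaking: pi_h(x) = broken_h * prod_{l<h} (1 - broken_l)
    leftover = np.cumprod(np.concatenate(([1.0], 1.0 - broken[:-1])))
    w = broken * leftover
    return w / w.sum()  # fold the truncation's leftover mass back in by renormalizing

# Hypothetical setup: truncation level H experts in a 2-D feature space.
H, d = 20, 2
locations = rng.uniform(-3.0, 3.0, size=(H, d))  # expert locations Gamma_h
sticks = rng.beta(1.0, 1.0, size=H)              # V_h ~ Beta(1, lambda), lambda = 1 here

x_near = np.array([0.5, -1.0])
x_far = np.array([2.5, 2.5])
print(ksbp_weights(x_near, locations, sticks, psi=2.0))
print(ksbp_weights(x_far, locations, sticks, psi=2.0))
```

Because the kernel shrinks V_h K(x, Γ_h) for covariates far from Γ_h, nearby inputs concentrate their weight on nearby experts; this locality is what lets tasks share an expert in one subregion of feature space while differing elsewhere.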
Original language: English (US)
Title of host publication: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Pages: 1569-1572
Number of pages: 4
State: Published - Sep 23, 2009
Externally published: Yes
