TY - GEN
T1 - Multi-task classification with infinite local experts
AU - Wang, Chunping
AU - An, Qi
AU - Carin, Lawrence
AU - Dunson, David B.
N1 - Generated from Scopus record by KAUST IRTS on 2021-02-09
PY - 2009/9/23
Y1 - 2009/9/23
N2 - We propose a multi-task learning (MTL) framework for nonlinear classification, based on an infinite set of local experts in feature space. The usage of local experts enables sharing at the expert-level, encouraging the borrowing of information even if tasks are similar only in subregions of feature space. A kernel stick-breaking process (KSBP) prior is imposed on the underlying distribution of class labels, so that the number of experts is inferred in the posterior and thus model selection issues are avoided. The MTL is implemented by imposing a Dirichlet process (DP) prior on a layer above the task-dependent KSBPs. ©2009 IEEE.
AB - We propose a multi-task learning (MTL) framework for nonlinear classification, based on an infinite set of local experts in feature space. The usage of local experts enables sharing at the expert-level, encouraging the borrowing of information even if tasks are similar only in subregions of feature space. A kernel stick-breaking process (KSBP) prior is imposed on the underlying distribution of class labels, so that the number of experts is inferred in the posterior and thus model selection issues are avoided. The MTL is implemented by imposing a Dirichlet process (DP) prior on a layer above the task-dependent KSBPs. ©2009 IEEE.
UR - http://ieeexplore.ieee.org/document/4959897/
UR - http://www.scopus.com/inward/record.url?scp=70349192873&partnerID=8YFLogxK
U2 - 10.1109/ICASSP.2009.4959897
DO - 10.1109/ICASSP.2009.4959897
M3 - Conference contribution
SN - 9781424423545
SP - 1569
EP - 1572
BT - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
ER -