Graph-Driven Generative Models for Heterogeneous Multi-Task Learning

Wenlin Wang, Hongteng Xu, Zhe Gan, Bai Li, Guoyin Wang, Liqun Chen, Qian Yang, Wenqi Wang, Lawrence Carin

Research output: Contribution to journal › Article › peer-review


Abstract

We propose a novel graph-driven generative model that unifies multiple heterogeneous learning tasks within the same framework. The proposed model is based on the observation that heterogeneous learning tasks, which correspond to different generative processes, often rely on data with a shared graph structure. Accordingly, our model combines a graph convolutional network (GCN) with multiple variational autoencoders, thus embedding the nodes of the graph (i.e., samples for the tasks) in a uniform manner while specializing their organization and usage to different tasks. With a focus on healthcare applications (tasks), including clinical topic modeling, procedure recommendation, and admission-type prediction, we demonstrate that our method successfully leverages information across different tasks, boosting performance in all tasks and outperforming existing state-of-the-art approaches.
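
The sketch below is a minimal, illustrative PyTorch rendering of the architecture the abstract describes, not the authors' released code: a shared GCN encoder embeds graph nodes as Gaussian latent variables, and several task-specific VAE decoder heads reuse those shared embeddings. All layer sizes, the toy adjacency matrix, and the three hypothetical task heads (topic, procedure, admission) are assumptions made for illustration.

# Minimal sketch (assumed dimensions and toy data, not the paper's implementation):
# a shared GCN encoder produces one node embedding per sample, and multiple
# task-specific variational decoder heads consume the same latent code.
import torch
import torch.nn as nn


class GCNLayer(nn.Module):
    """One graph-convolution step: H' = ReLU(A_hat @ H @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, a_hat, h):
        return torch.relu(self.linear(a_hat @ h))


class SharedGCNEncoder(nn.Module):
    """Maps node features to the mean/log-variance of a shared latent embedding."""
    def __init__(self, in_dim, hid_dim, z_dim):
        super().__init__()
        self.gcn = GCNLayer(in_dim, hid_dim)
        self.mu = nn.Linear(hid_dim, z_dim)
        self.logvar = nn.Linear(hid_dim, z_dim)

    def forward(self, a_hat, x):
        h = self.gcn(a_hat, x)
        return self.mu(h), self.logvar(h)


class TaskHead(nn.Module):
    """Task-specific decoder; each heterogeneous task has its own output size."""
    def __init__(self, z_dim, out_dim):
        super().__init__()
        self.decode = nn.Linear(z_dim, out_dim)

    def forward(self, z):
        return self.decode(z)


def reparameterize(mu, logvar):
    # Standard VAE reparameterization trick.
    std = torch.exp(0.5 * logvar)
    return mu + std * torch.randn_like(std)


if __name__ == "__main__":
    n_nodes, in_dim, z_dim = 8, 16, 4
    x = torch.randn(n_nodes, in_dim)            # node features (e.g., patient records)
    a_hat = torch.eye(n_nodes)                  # toy normalized adjacency (placeholder)
    encoder = SharedGCNEncoder(in_dim, 32, z_dim)
    heads = {"topic": TaskHead(z_dim, 10),      # hypothetical clinical-topic logits
             "procedure": TaskHead(z_dim, 20),  # hypothetical procedure scores
             "admission": TaskHead(z_dim, 3)}   # hypothetical admission-type classes
    mu, logvar = encoder(a_hat, x)
    z = reparameterize(mu, logvar)
    outputs = {name: head(z) for name, head in heads.items()}
    for name, out in outputs.items():
        print(name, out.shape)

In this reading of the abstract, the shared encoder is what lets information flow across tasks, while the separate decoder heads keep the heterogeneous generative processes specialized.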
Original language: English (US)
Journal: arXiv preprint
State: Published - Nov 20, 2019
Externally published: Yes

Keywords

  • cs.LG
  • stat.ML
