Dynamic Embedding on Textual Networks via a Gaussian Process

Pengyu Cheng, Yitong Li, Xinyuan Zhang, Liqun Cheng, David Carlson, Lawrence Carin

Research output: Contribution to journal › Article › peer-review



Textual network embedding aims to learn low-dimensional representations of text-annotated nodes in a graph. Prior work in this area has typically focused on fixed graph structures; however, real-world networks are often dynamic. We address this challenge with a novel end-to-end node-embedding model, called Dynamic Embedding for Textual Networks with a Gaussian Process (DetGP). After training, DetGP can be applied efficiently to dynamic graphs without retraining or backpropagation. The learned representation of each node combines a textual embedding with a structural embedding. Because the structure is allowed to be dynamic, the structural embedding uses a Gaussian process to exploit its non-parametric properties. To capture both local and global graph structure, diffusion is used to model multiple hops between neighbors, and the relative importance of global versus local structure is learned automatically. Thanks to the non-parametric nature of the Gaussian process, updating the embeddings for a changed graph structure requires only a forward pass through the learned model. Experiments on link prediction and node classification demonstrate the empirical effectiveness of our method relative to baseline approaches, and we further show that DetGP can be applied straightforwardly and efficiently to dynamic textual networks.
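The multi-hop diffusion idea in the abstract can be sketched concretely: a node's structural context is a weighted mixture of its k-hop neighborhood averages, with the weights learned to balance local versus global structure. The snippet below is a minimal illustrative sketch, not the paper's implementation; the function and parameter names (`diffuse`, `alphas`) are hypothetical, and the Gaussian-process component is omitted.

```python
import numpy as np

def diffuse(A, X, alphas):
    """Mix k-hop neighborhood averages of node features X.

    A      : (n, n) adjacency matrix of the current graph
    X      : (n, d) per-node (e.g. textual) embeddings
    alphas : logits over hop counts 0..K, softmax-normalized so the
             relative weight of local vs. global structure is learned
    """
    # Row-normalize A into a transition matrix (guard against zero degree).
    deg = A.sum(axis=1, keepdims=True)
    P = A / np.maximum(deg, 1.0)

    # Softmax over hop weights.
    w = np.exp(alphas - alphas.max())
    w = w / w.sum()

    out = w[0] * X
    H = X
    for k in range(1, len(alphas)):
        H = P @ H          # one more hop of diffusion
        out = out + w[k] * H
    return out

# Toy example: a 3-node path graph with 2-dim node features.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.eye(3, 2)
Z = diffuse(A, X, alphas=np.zeros(3))  # equal weight on 0-, 1-, 2-hop views
print(Z.shape)  # → (3, 2)
```

Because the hop weights and any downstream parameters are fixed after training, handling a changed graph amounts to re-running this forward pass with the new adjacency matrix, which mirrors the "no re-training or backpropagation" property claimed for DetGP.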
Original language: English (US)
Journal: arXiv preprint
State: Published - Oct 5, 2019
Externally published: Yes


  • cs.LG
  • cs.CL
  • cs.SI
  • stat.ML


