Well-conditioned Spectral Transforms for Dynamic Graph Representation

Bingxin Zhou, Xinliang Liu, Yuehua Liu, Yunying Huang, Pietro Liò, Yu Guang Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This work establishes a fully spectral framework for capturing informative long-range temporal interactions in a dynamic system. We connect the spectral transform to low-rank self-attention mechanisms and investigate its energy-balancing effect and computational efficiency. Based on these observations, we leverage the adaptive power-method SVD and global graph framelet convolution to encode time-dependent features and graph structure for continuous-time dynamic graph representation learning. The former serves as an efficient high-order linear self-attention with determined propagation rules, and the latter establishes a scalable and transferable geometric characterization for property prediction. Empirically, the proposed model learns well-conditioned hidden representations on a variety of online learning tasks, achieving top performance with fewer learnable parameters and faster propagation.
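The abstract's link between a power-method SVD and low-rank self-attention can be made concrete with a short sketch. The numpy code below is a minimal illustration under stated assumptions, not the authors' implementation: block power iteration with QR re-orthogonalization recovers the leading singular subspace of the attention matrix, and the rank-k factors then stand in for the full quadratic attention product. All names (`power_method_svd`, `low_rank_attention`, `Wq`, `Wk`, `Wv`, `rank`) and the hyperparameters are hypothetical.

```python
import numpy as np

def power_method_svd(A, rank, n_iter=10, seed=0):
    """Approximate the top-`rank` singular triplets of A via
    block power iteration with QR re-orthogonalization."""
    rng = np.random.default_rng(seed)
    V = rng.standard_normal((A.shape[1], rank))
    for _ in range(n_iter):
        # Multiply by A^T A and re-orthogonalize so the columns of V
        # converge to the leading right singular vectors of A.
        V, _ = np.linalg.qr(A.T @ (A @ V))
    U = A @ V                       # A V ~ U * diag(s)
    s = np.linalg.norm(U, axis=0)   # approximate singular values
    U = U / s
    return U, s, V

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def low_rank_attention(X, Wq, Wk, Wv, rank=4):
    """Hypothetical low-rank self-attention: replace the dense
    softmax(Q K^T / sqrt(d)) V product with its rank-k surrogate."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    attn = softmax(Q @ K.T / np.sqrt(Q.shape[1]))
    U, s, Vr = power_method_svd(attn, rank)
    # (U * s) scales columns of U, i.e. U @ diag(s); the grouped
    # products keep every intermediate at rank-k width.
    return (U * s) @ (Vr.T @ V)

# Toy usage: 64 events with 16-dimensional features.
rng = np.random.default_rng(1)
X = rng.standard_normal((64, 16))
Wq, Wk, Wv = (rng.standard_normal((16, 16)) for _ in range(3))
out = low_rank_attention(X, Wq, Wk, Wv, rank=4)
print(out.shape)  # (64, 16)
```

For clarity the sketch still materializes the full attention matrix before factoring it; a practical implementation would presumably avoid forming that n-by-n product explicitly, which is where the computational-efficiency claim would come from.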
Original language: English (US)
Title of host publication: 1st Learning on Graphs Conference, LOG 2022
Publisher: ML Research Press
State: Published - Jan 1 2022
