Adversarial Self-Paced Learning for Mixture Models of Hawkes Processes

Dixin Luo, Hongteng Xu, Lawrence Carin

Research output: Contribution to journal › Article › peer-review


Abstract

We propose a novel adversarial learning strategy for mixture models of Hawkes processes, leveraging data augmentation techniques for Hawkes processes within the framework of self-paced learning. Instead of learning a mixture model directly from a set of event sequences drawn from different Hawkes processes, the proposed method learns the target model iteratively, generating "easy" sequences and using them in an adversarial and self-paced manner. In each iteration, we first generate a set of augmented sequences from the original observed sequences. Based on the fact that an easy sample of the target model can be an adversarial sample of a misspecified model, we apply maximum likelihood estimation with an adversarial self-paced mechanism. In this manner the target model is updated, and the augmented sequences that obey it are employed in the next learning iteration. Experimental results show that the proposed method consistently outperforms traditional methods.
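
The abstract describes an iterative loop: augment the observed sequences, weight them by how "easy" they are under the current mixture, and refit the components by weighted maximum likelihood. Below is a minimal, self-contained sketch of such a loop for exponential-kernel Hawkes processes. All names (hawkes_loglik, augment, fit_component, adversarial_self_paced_em), the subsampling-based augmentation, and the quantile pacing schedule are illustrative assumptions for exposition, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def hawkes_loglik(seq, mu, alpha, beta, T):
    # Log-likelihood of a univariate Hawkes process with intensity
    # lambda(t) = mu + sum_{t_i < t} alpha * beta * exp(-beta * (t - t_i)).
    ll = 0.0
    for i, t in enumerate(seq):
        lam = mu + np.sum(alpha * beta * np.exp(-beta * (t - seq[:i])))
        ll += np.log(lam)
    # Subtract the compensator, i.e. the integral of the intensity on [0, T].
    ll -= mu * T + np.sum(alpha * (1.0 - np.exp(-beta * (T - seq))))
    return ll

def augment(seq, rng):
    # Toy augmentation: keep a random subsequence of events. This is only a
    # stand-in for the paper's Hawkes-specific data augmentation.
    return seq[rng.random(len(seq)) < 0.8]

def fit_component(seqs, weights, T):
    # Weighted MLE for one mixture component; optimizing log-parameters
    # keeps (mu, alpha, beta) positive.
    def nll(log_params):
        mu, alpha, beta = np.exp(log_params)
        return -sum(w * hawkes_loglik(s, mu, alpha, beta, T)
                    for w, s in zip(weights, seqs) if w > 0)
    res = minimize(nll, np.log([0.5, 0.5, 1.0]), method="Nelder-Mead")
    return np.exp(res.x)

def adversarial_self_paced_em(seqs, K, T, n_iter=5):
    params = [np.array([0.5, 0.5, 1.0]) * (k + 1) for k in range(K)]
    for it in range(n_iter):
        aug = [augment(s, rng) for s in seqs]
        # Per-event log-likelihood of each augmented sequence under each component.
        L = np.array([[hawkes_loglik(s, *p, T) / max(len(s), 1) for p in params]
                      for s in aug])
        best = L.max(axis=1)
        # Self-paced weights: train only on sequences that are currently "easy"
        # (high likelihood under their best-fitting component), loosening the
        # threshold as iterations proceed. Easy samples of the target model act
        # adversarially against misspecified components.
        threshold = np.quantile(best, max(0.0, 0.5 - 0.1 * it))
        w = (best >= threshold).astype(float)
        # Soft responsibilities over mixture components.
        R = np.exp(L - L.max(axis=1, keepdims=True))
        R /= R.sum(axis=1, keepdims=True)
        params = [fit_component(aug, w * R[:, k], T) for k in range(K)]
    return params

# Usage with synthetic event sequences (uniform event times as a stand-in
# for sequences drawn from different Hawkes processes):
T = 10.0
seqs = [np.sort(rng.uniform(0.0, T, size=rng.integers(20, 40))) for _ in range(30)]
models = adversarial_self_paced_em(seqs, K=2, T=T)
```

The sketch uses hard 0/1 self-paced weights and a fixed pacing schedule for brevity; the paper's mechanism couples the pacing with the adversarial view of augmented samples, which a full implementation would need to reproduce.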
Original language: English (US)
Journal: arXiv preprint
State: Published - Jun 20 2019
Externally published: Yes

Keywords

  • stat.ML
  • cs.LG
