Improving Adversarial Text Generation by Modeling the Distant Future

Ruiyi Zhang*, Changyou Chen, Zhe Gan, Wenlin Wang, Dinghan Shen, Guoyin Wang, Zheng Wen, Lawrence Carin

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

10 Scopus citations

Abstract

Auto-regressive text generation models usually focus on local fluency, which can lead to semantically inconsistent long-form text. Further, automatically generating words with similar semantics is challenging, and hand-crafted linguistic rules are difficult to apply. We consider a text-planning scheme and present a model-based imitation-learning approach to alleviate these issues. Specifically, we propose a novel guider network that focuses on the generative process over a longer horizon, assisting next-word prediction and providing intermediate rewards for generator optimization. Extensive experiments demonstrate that the proposed method leads to improved performance.
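The abstract's core idea, a guider that looks ahead and scores the generator with intermediate rewards, can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not the paper's actual architecture: the guider is a toy linear-tanh predictor, the generator is a random recurrent rollout, and the reward is cosine similarity between the guider's predicted future state and the state actually reached `c` steps later.

```python
import numpy as np

rng = np.random.default_rng(0)

def guider_predict(state, horizon_weights):
    # Hypothetical guider: maps the current hidden state to a
    # prediction of the hidden state several steps ahead.
    return np.tanh(horizon_weights @ state)

def intermediate_reward(predicted_future, actual_future):
    # Reward the generator when the realized future state aligns
    # with the guider's prediction (cosine similarity in [-1, 1]).
    num = float(predicted_future @ actual_future)
    den = float(np.linalg.norm(predicted_future) * np.linalg.norm(actual_future)) + 1e-8
    return num / den

# Toy rollout: generator hidden states over T steps (assumed dynamics).
dim, T, c = 8, 10, 4  # c = lookahead horizon (an assumed hyperparameter)
W = rng.normal(size=(dim, dim))
states = [rng.normal(size=dim)]
for _ in range(T):
    states.append(np.tanh(W @ states[-1] + rng.normal(scale=0.1, size=dim)))

# One intermediate reward per step that has a valid c-step lookahead;
# in RL-based text generation these would densify the sparse end-of-
# sequence reward used by adversarial training.
rewards = [
    intermediate_reward(guider_predict(states[t], W), states[t + c])
    for t in range(T - c)
]
print(len(rewards))
```

In the paper's setting the rewards would feed a policy-gradient update of the generator; here they are just computed to show where a dense, plan-aware signal could come from.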

Original language: English
Title of host publication: 58th Annual Meeting of the Association for Computational Linguistics (ACL 2020)
Publisher: Association for Computational Linguistics (ACL)
Pages: 2516-2531
Number of pages: 16
State: Published - 2020
Externally published: Yes
Event: 58th Annual Meeting of the Association for Computational Linguistics (ACL)
Duration: Jul 5 2020 - Jul 10 2020

Conference

Conference: 58th Annual Meeting of the Association for Computational Linguistics (ACL)
Period: 07/5/20 - 07/10/20
