Learning context-aware convolutional filters for text processing

Dinghan Shen, Martin Renqiang Min, Yitong Li, Lawrence Carin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

15 Scopus citations


Convolutional neural networks (CNNs) have recently emerged as a popular building block for natural language processing (NLP). Despite their success, most existing CNN models employed in NLP share the same learned (and static) set of filters for all input sentences. In this paper, we propose using a small meta network to learn context-aware convolutional filters for text processing. The role of the meta network is to abstract the contextual information of a sentence or document into a set of input-aware filters. We further generalize this framework to model sentence pairs, where a bidirectional filter generation mechanism is introduced to encapsulate co-dependent sentence representations. In our benchmarks on four different tasks, including ontology classification, sentiment analysis, answer sentence selection, and paraphrase identification, our proposed model, a modified CNN with context-aware filters, consistently outperforms the standard CNN and attention-based CNN baselines. By visualizing the learned context-aware filters, we further validate and rationalize the effectiveness of the proposed framework.
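The core idea can be illustrated with a minimal sketch: a meta network (here reduced to a single linear map over a mean-pooled context vector, a stand-in for the paper's filter-generation module) produces a filter bank conditioned on the input sentence, and that bank is then used as the filters of an ordinary 1-D convolution. All shapes and names below are illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def context_aware_conv(sentence, k=3, n_filters=4):
    """sentence: (T, d) array of word embeddings.

    A toy 'meta network' (one random linear layer on the mean
    embedding; the paper uses a learned network) maps the sentence's
    context vector to a flattened filter bank, so different inputs
    yield different convolutional filters.
    """
    T, d = sentence.shape
    # Meta network: context vector -> input-aware filter bank.
    context = sentence.mean(axis=0)                        # (d,)
    W_meta = rng.standard_normal((n_filters * k * d, d)) * 0.1
    filters = (W_meta @ context).reshape(n_filters, k, d)  # (n_filters, k, d)
    # Standard 1-D convolution, but with the generated filters.
    out = np.empty((T - k + 1, n_filters))
    for t in range(T - k + 1):
        window = sentence[t:t + k]                         # (k, d)
        out[t] = np.tensordot(filters, window, axes=([1, 2], [0, 1]))
    return np.maximum(out, 0.0)                            # ReLU feature map

sent = rng.standard_normal((10, 8))  # toy "sentence": 10 tokens, dim 8
feats = context_aware_conv(sent)
print(feats.shape)                   # (8, 4): (T - k + 1) positions x n_filters
```

In a trained model the meta-network weights would be learned jointly with the rest of the network; the point of the sketch is only that the convolution's filters are a function of the input rather than fixed parameters.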
Original language: English (US)
Title of host publication: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, EMNLP 2018
Publisher: Association for Computational Linguistics
Number of pages: 10
ISBN (Print): 9781948087841
State: Published - Jan 1 2020
Externally published: Yes

