Does Head Label Help for Long-Tailed Multi-Label Text Classification

Lin Xiao, Xiangliang Zhang, Liping Jing, Chi Huang, Mingyang Song

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

34 Scopus citations

Abstract

Multi-label text classification (MLTC) aims to annotate documents with the most relevant labels from a large set of candidate labels. In real applications, the distribution of label frequency often exhibits a long tail, i.e., a few labels are associated with a large number of documents (a.k.a. head labels), while a large fraction of labels are associated with only a small number of documents (a.k.a. tail labels). To address the challenge of insufficient training data for tail-label classification, we propose a Head-to-Tail Network (HTTN) that transfers meta-knowledge from the data-rich head labels to the data-poor tail labels. The meta-knowledge is the mapping from few-shot network parameters to many-shot network parameters, which aims to promote the generalizability of tail classifiers. Extensive experimental results on three benchmark datasets demonstrate that HTTN consistently outperforms state-of-the-art methods. The code and hyper-parameter settings are released for reproducibility.
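To make the transfer idea concrete, the following is a minimal sketch (not the authors' released code) of learning a mapping from classifier parameters estimated with few examples to parameters estimated with many examples, supervised on head labels and then applied to tail labels. All names, dimensions, and the simple mean-embedding classifier are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

emb_dim, k_shot = 64, 5          # document-embedding size and few-shot count (assumed)
transfer = nn.Sequential(        # meta-network: few-shot params -> many-shot params
    nn.Linear(emb_dim, emb_dim),
    nn.ReLU(),
    nn.Linear(emb_dim, emb_dim),
)
opt = torch.optim.Adam(transfer.parameters(), lr=1e-3)

def prototype(doc_embs):
    """Classifier parameters for one label, estimated as the mean document embedding."""
    return doc_embs.mean(dim=0)

# Meta-training on head labels, each with many positive documents (toy stand-in data).
head_label_docs = [torch.randn(200, emb_dim) for _ in range(50)]
for _ in range(100):
    loss = 0.0
    for docs in head_label_docs:
        few = docs[torch.randperm(len(docs))[:k_shot]]   # simulate a data-poor label
        w_few = prototype(few)                           # few-shot parameters
        w_many = prototype(docs)                         # many-shot target parameters
        loss = loss + (transfer(w_few) - w_many).pow(2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

# Apply the learned mapping to a tail label with only a handful of documents.
tail_docs = torch.randn(k_shot, emb_dim)
w_tail = transfer(prototype(tail_docs)).detach()          # promoted tail classifier
score = torch.sigmoid(torch.randn(emb_dim) @ w_tail)      # score a new document embedding
```

The design choice illustrated here is that head labels provide paired (few-shot, many-shot) parameter examples for free, so the mapping can be trained without any extra annotation and reused on labels where only the few-shot estimate exists.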
Original language: English (US)
Title of host publication: 35th AAAI Conference on Artificial Intelligence / 33rd Conference on Innovative Applications of Artificial Intelligence / 11th Symposium on Educational Advances in Artificial Intelligence
Publisher: arXiv
Pages: 14103-14111
Number of pages: 9
State: Published - 2021
