Enhancing both Local and Global Entity Linking Models with Attention

Jinliang Li, Haoyu Liu, Yulong Zhang, Li Zhang, Qiang Yang, Jianfeng Qu, Zhixu Li

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Entity linking aims to map the mentions in a document to their corresponding entities in a given knowledge base. It involves two consecutive steps: a local step, which models the semantic meaning of the context around each mention, and a global step, which optimizes the coherence of the referred entities across the document. Building on existing work on both steps, this paper enhances both local and global entity linking models with several attention mechanisms. In particular, we leverage a self-attention mechanism and an LSTM-based attention mechanism to better capture the inter-dependencies between tokens in the mention context for local entity linking models. We also adopt a hierarchical attention network with a multi-head attention layer to better represent documents with one or multiple topics for global entity linking models, which helps alleviate the side effect of error accumulation. An extensive empirical study on standard benchmarks demonstrates the effectiveness of the proposed models.
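
The abstract does not include implementation details; the following is a minimal, hypothetical sketch of the local-step idea it describes — self-attention over mention-context tokens used to score a candidate entity — written in PyTorch. The class name, dimensions, pooling choice, and scoring function are illustrative assumptions, not the authors' code.

```python
# Hypothetical sketch (not the authors' implementation): a local entity-linking
# scorer that applies self-attention over mention-context tokens and compares the
# pooled context representation with a candidate entity embedding.
import torch
import torch.nn as nn


class LocalAttentionScorer(nn.Module):
    def __init__(self, dim: int = 300, num_heads: int = 4):
        super().__init__()
        # Self-attention captures inter-dependencies between context tokens.
        self.self_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Bilinear form scores a candidate entity against the pooled context.
        self.bilinear = nn.Bilinear(dim, dim, 1)

    def forward(self, context_tokens: torch.Tensor, candidate: torch.Tensor) -> torch.Tensor:
        # context_tokens: (batch, seq_len, dim) token embeddings of the mention context
        # candidate:      (batch, dim) candidate entity embedding
        attended, _ = self.self_attn(context_tokens, context_tokens, context_tokens)
        context_repr = attended.mean(dim=1)  # simple mean pooling over the context
        return self.bilinear(context_repr, candidate).squeeze(-1)  # (batch,) scores


if __name__ == "__main__":
    scorer = LocalAttentionScorer()
    ctx = torch.randn(2, 16, 300)   # two mentions, 16 context tokens each
    cand = torch.randn(2, 300)      # one candidate entity per mention
    print(scorer(ctx, cand).shape)  # torch.Size([2])
```

In the global step described by the abstract, an analogous attention stack (hierarchical attention with a multi-head layer) would be applied over sentence- and document-level representations rather than a single mention context; the same scoring pattern can be reused at that level.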
Original language: English (US)
Title of host publication: Web Information Systems Engineering – WISE 2021
Publisher: Springer International Publishing
Pages: 290-304
Number of pages: 15
ISBN (Print): 9783030908874
DOIs
State: Published - Jan 1 2022

