TY - GEN
T1 - Enhancing both Local and Global Entity Linking Models with Attention
AU - Li, Jinliang
AU - Liu, Haoyu
AU - Zhang, Yulong
AU - Zhang, Li
AU - Yang, Qiang
AU - Qu, Jianfeng
AU - Li, Zhixu
N1 - KAUST Repository Item: Exported on 2022-01-11
Acknowledgements: This work was supported by the National Key R&D Program of China (No. 2018AAA0101900), the Priority Academic Program Development of Jiangsu Higher Education Institutions, National Natural Science Foundation of China (Grant No. 62072323, 61632016, 62102276), Natural Science Foundation of Jiangsu Province (No. BK20191420).
PY - 2022/1/1
Y1 - 2022/1/1
N2 - Entity linking aims at mapping the mentions in a document to their corresponding entities in a given knowledge base. It involves two consecutive steps: a local step, which models the semantic meaning of the context around each mention, and a global step, which optimizes the coherence among the referred entities in the document. Building on existing efforts on both steps, this paper enhances both local and global entity linking models with several attention mechanisms. In particular, we propose to leverage a self-attention mechanism and an LSTM-based attention mechanism to better capture the inter-dependencies between tokens in the mention context for the local entity linking models. We also adopt a hierarchical attention network with a multi-head attention layer to better represent documents with one or multiple topics for the global entity linking models, which helps alleviate the side effect of error accumulation. Extensive empirical studies on standard benchmarks demonstrate the effectiveness of the proposed models.
AB - Entity linking aims at mapping the mentions in a document to their corresponding entities in a given knowledge base. It involves two consecutive steps: a local step, which models the semantic meaning of the context around each mention, and a global step, which optimizes the coherence among the referred entities in the document. Building on existing efforts on both steps, this paper enhances both local and global entity linking models with several attention mechanisms. In particular, we propose to leverage a self-attention mechanism and an LSTM-based attention mechanism to better capture the inter-dependencies between tokens in the mention context for the local entity linking models. We also adopt a hierarchical attention network with a multi-head attention layer to better represent documents with one or multiple topics for the global entity linking models, which helps alleviate the side effect of error accumulation. Extensive empirical studies on standard benchmarks demonstrate the effectiveness of the proposed models.
UR - http://hdl.handle.net/10754/674903
UR - https://link.springer.com/10.1007/978-3-030-90888-1_23
UR - http://www.scopus.com/inward/record.url?scp=85121935825&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-90888-1_23
DO - 10.1007/978-3-030-90888-1_23
M3 - Conference contribution
SN - 9783030908874
SP - 290
EP - 304
BT - Web Information Systems Engineering – WISE 2021
PB - Springer International Publishing
ER -