TY - GEN
T1 - Few-Shot Multi-Hop Relation Reasoning over Knowledge Bases
AU - Zhang, Chuxu
AU - Yu, Lu
AU - Saebi, Mandana
AU - Jiang, Meng
AU - Chawla, Nitesh
N1 - KAUST Repository Item: Exported on 2021-04-14
Acknowledgements: This work was supported in part by National Science Foundation grants CCI-1925607 and IIS1849816. We also thank the anonymous reviewers for their valuable comments and helpful suggestions.
PY - 2020
Y1 - 2020
N2 - Multi-hop relation reasoning over knowledge bases aims to generate effective and interpretable relation predictions through reasoning paths. Current methods usually require sufficient training data (fact triples) for each query relation, impairing their performance on few-shot relations (those with limited triples), which are common in knowledge bases. To this end, we propose FIRE, a novel few-shot multi-hop relation learning model. FIRE applies reinforcement learning to model the sequential steps of multi-hop reasoning, and additionally performs heterogeneous structure encoding and knowledge-aware search-space pruning. A meta-learning technique is employed to optimize model parameters so that they can quickly adapt to few-shot relations. Empirical studies on two datasets demonstrate that FIRE outperforms state-of-the-art methods.
AB - Multi-hop relation reasoning over knowledge bases aims to generate effective and interpretable relation predictions through reasoning paths. Current methods usually require sufficient training data (fact triples) for each query relation, impairing their performance on few-shot relations (those with limited triples), which are common in knowledge bases. To this end, we propose FIRE, a novel few-shot multi-hop relation learning model. FIRE applies reinforcement learning to model the sequential steps of multi-hop reasoning, and additionally performs heterogeneous structure encoding and knowledge-aware search-space pruning. A meta-learning technique is employed to optimize model parameters so that they can quickly adapt to few-shot relations. Empirical studies on two datasets demonstrate that FIRE outperforms state-of-the-art methods.
UR - http://hdl.handle.net/10754/667823
UR - https://www.aclweb.org/anthology/2020.findings-emnlp.51
U2 - 10.18653/v1/2020.findings-emnlp.51
DO - 10.18653/v1/2020.findings-emnlp.51
M3 - Conference contribution
BT - Findings of the Association for Computational Linguistics: EMNLP 2020
PB - Association for Computational Linguistics (ACL)
ER -