TY - JOUR
T1 - Exploring attention mechanism for graph similarity learning
AU - Tan, Wenhui
AU - Gao, Xin
AU - Li, Yiyang
AU - Wen, Guangqi
AU - Cao, Peng
AU - Yang, Jinzhu
AU - Li, Weiping
AU - Zaiane, Osmar R.
N1 - Generated from Scopus record by KAUST IRTS on 2023-09-21
PY - 2023/9/27
Y1 - 2023/9/27
AB - Graph similarity estimation is a challenging task due to the complex graph structure. Though the problem is important and well studied, three key aspects have yet to be fully handled in a unified framework: (i) how to exploit the node embedding by leveraging both local spatial neighborhood information and the global context, (ii) how to effectively learn richer cross-graph interactions from a pairwise node perspective, and (iii) how to map the similarity matrix into a similarity score by exploiting the inherent structure in the similarity matrix. To address these issues, we explore multiple attention mechanisms for graph similarity learning in this work. More specifically, we propose a unified graph similarity learning framework involving (i) a hybrid of graph convolution and graph self-attention for node embedding learning, (ii) a cross-graph co-attention (GCA) module for graph interaction modeling, (iii) a similarity-wise self-attention (SSA) module for graph similarity matrix alignment, and (iv) graph similarity matrix learning for predicting the similarity scores. Extensive experimental results on three challenging benchmarks, LINUX, AIDS, and IMDBMulti, demonstrate that the proposed NA-GSL performs favorably against state-of-the-art graph similarity estimation methods. The code is available at https://github.com/AlbertTan404/NA-GSL.
UR - https://linkinghub.elsevier.com/retrieve/pii/S0950705123004896
UR - http://www.scopus.com/inward/record.url?scp=85164685701&partnerID=8YFLogxK
U2 - 10.1016/j.knosys.2023.110739
DO - 10.1016/j.knosys.2023.110739
M3 - Article
SN - 0950-7051
VL - 276
JO - Knowledge-Based Systems
JF - Knowledge-Based Systems
ER -