TY - CONF
T1 - Autoencoder-based Node Embedding Regeneration and Prediction in Dynamic Graphs
AU - Darghouthi, Mohamed
AU - Ghazzai, Hakim
AU - Sboui, Lokman
AU - Massoud, Yehia Mahmoud
PY - 2022/11/16
Y1 - 2022/11/16
AB - In this paper, we introduce a neural network model capable of generating embeddings for graph nodes using only their initial features, without any prior knowledge of their neighbors. To this end, we start by generating embeddings from a model trained on the entire graph knowledge (node features, node neighbors, and random walks). We then build a deep autoencoder model that learns to map node features to embeddings similar to those generated by the model trained on the entire graph knowledge. By using a tuned loss function that acts as a strong regularizer for the autoencoder, our model is designed to generate embeddings with minimal error relative to those produced by the first model. It learns to generate embeddings that reflect the positioning of the nodes in the graph even though it uses only the nodes' initial features. We evaluate the model on the Amazon and Alibaba graph datasets by computing the mean squared error, cosine similarity, and mean absolute percentage error between the embeddings generated by our model and the reference embeddings. We also evaluate the ability to reconstruct the initial graph by re-predicting the neighbors of each node using the cosine similarity between a node's generated embedding and its neighbors' reference embeddings. Results show an efficiency higher than 80% in graph generation and neighborhood detection.
UR - http://hdl.handle.net/10754/686736
UR - https://ieeexplore.ieee.org/document/9988220/
U2 - 10.1109/iceccme55909.2022.9988220
DO - 10.1109/iceccme55909.2022.9988220
M3 - Conference contribution
SN - 978-1-6654-7096-4
BT - 2022 International Conference on Electrical, Computer, Communications and Mechatronics Engineering (ICECCME)
PB - IEEE
ER -