TY - GEN
T1 - Decentralized Embedding Framework for Large-Scale Networks
AU - Imran, Mubashir
AU - Yin, Hongzhi
AU - Chen, Tong
AU - Shao, Yingxia
AU - Zhang, Xiangliang
AU - Zhou, Xiaofang
N1 - Acknowledgements: This work is supported by the Australian Research Council (Grant Nos. DP190101985 and DP170103954) and the National Natural Science Foundation of China (Grant Nos. U1936104 and 61702015).
PY - 2020/9/21
Y1 - 2020/9/21
N2 - Network embedding aims to learn vector representations of vertices that preserve both network structure and properties. However, most existing embedding methods fail to scale to large networks. A few frameworks extend existing methods to cope with network embedding on large-scale networks, either by iteratively updating global parameters or by compressing the network while learning vector representations. Such schemes inevitably incur either high communication overhead or sub-optimal embedding quality. In this paper, we propose DeLNE, a novel decentralized large-scale network embedding framework. As the name suggests, DeLNE divides a network into smaller partitions and learns vector representations in a distributed fashion, avoiding unnecessary communication overhead. First, our framework uses Variational Graph Convolution Auto-Encoders to embed the structure and properties of each sub-network. Second, we propose an embedding aggregation mechanism that captures the global properties of each node. Third, we propose an alignment function that reconciles all sub-network embeddings into the same vector space. Owing to its parallel nature, DeLNE scales well in large clustered environments. Through extensive experiments on real-world datasets, we show that DeLNE produces high-quality embeddings and outperforms existing large-scale network embedding frameworks in terms of both efficiency and effectiveness.
AB - Network embedding aims to learn vector representations of vertices that preserve both network structure and properties. However, most existing embedding methods fail to scale to large networks. A few frameworks extend existing methods to cope with network embedding on large-scale networks, either by iteratively updating global parameters or by compressing the network while learning vector representations. Such schemes inevitably incur either high communication overhead or sub-optimal embedding quality. In this paper, we propose DeLNE, a novel decentralized large-scale network embedding framework. As the name suggests, DeLNE divides a network into smaller partitions and learns vector representations in a distributed fashion, avoiding unnecessary communication overhead. First, our framework uses Variational Graph Convolution Auto-Encoders to embed the structure and properties of each sub-network. Second, we propose an embedding aggregation mechanism that captures the global properties of each node. Third, we propose an alignment function that reconciles all sub-network embeddings into the same vector space. Owing to its parallel nature, DeLNE scales well in large clustered environments. Through extensive experiments on real-world datasets, we show that DeLNE produces high-quality embeddings and outperforms existing large-scale network embedding frameworks in terms of both efficiency and effectiveness.
UR - http://hdl.handle.net/10754/665603
UR - http://link.springer.com/10.1007/978-3-030-59419-0_26
UR - http://www.scopus.com/inward/record.url?scp=85092083445&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-59419-0_26
DO - 10.1007/978-3-030-59419-0_26
M3 - Conference contribution
SN - 9783030594183
SP - 425
EP - 441
BT - Database Systems for Advanced Applications
PB - Springer International Publishing
ER -