A Graph Embedding Technique for Weighted Graphs Based on LSTM Autoencoders
Citations: Web of Science 7, Scopus 8

Abstract

A graph is a data structure consisting of nodes and the edges between them. Graph embedding generates a low-dimensional vector for a given graph that best represents the characteristics of the graph. Recently, there have been studies on graph embedding, especially ones using deep learning techniques. However, most deep learning-based graph embedding techniques to date have focused on unweighted graphs. Therefore, in this paper, we propose a graph embedding technique for weighted graphs based on long short-term memory (LSTM) autoencoders. Given weighted graphs, we traverse each graph to extract node-weight sequences from it. Each node-weight sequence represents a path in the graph consisting of nodes and the weights of the edges between them. We then train an LSTM autoencoder on the extracted node-weight sequences and encode each node-weight sequence into a fixed-length vector using the trained LSTM autoencoder. Finally, for each graph, we collect the encoding vectors obtained from the graph and combine them to generate the final embedding vector for the graph. These embedding vectors can be used to classify weighted graphs or to search for similar weighted graphs. Experiments on synthetic and real datasets show that the proposed method is effective in measuring the similarity between weighted graphs.
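The first step of the pipeline described above, extracting node-weight sequences from a weighted graph, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the traversal is a fixed-length random walk and represents the graph as an adjacency dictionary, both of which are choices of this sketch rather than details confirmed by the abstract.

```python
import random


def extract_node_weight_sequences(graph, walk_length=4, walks_per_node=2, seed=0):
    """Extract node-weight sequences from a weighted graph via random walks.

    `graph` maps each node to a dict {neighbor: edge_weight}. Each returned
    sequence alternates nodes and edge weights, [n0, w01, n1, w12, n2, ...],
    i.e. a path consisting of nodes and the weights between them.
    (Hypothetical sketch; the paper's exact traversal strategy may differ.)
    """
    rng = random.Random(seed)
    sequences = []
    for start in graph:
        for _ in range(walks_per_node):
            seq = [start]
            current = start
            for _ in range(walk_length - 1):
                neighbors = graph.get(current, {})
                if not neighbors:
                    break  # dead end: stop this walk early
                nxt = rng.choice(sorted(neighbors))
                seq.append(neighbors[nxt])  # edge weight on the traversed edge
                seq.append(nxt)             # next node on the path
                current = nxt
            sequences.append(seq)
    return sequences


# Toy weighted graph: A --0.5-- B --2.0-- C
g = {"A": {"B": 0.5}, "B": {"A": 0.5, "C": 2.0}, "C": {"B": 2.0}}
seqs = extract_node_weight_sequences(g)
```

The resulting sequences would then be fed to an LSTM autoencoder, whose encoder output gives one fixed-length vector per sequence; pooling those vectors (e.g., averaging) per graph yields the graph's embedding.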

Keywords

Graph Embedding, Graph Similarity, LSTM Autoencoder, Weighted Graph Embedding, Weighted Graph
Title
A Graph Embedding Technique for Weighted Graphs Based on LSTM Autoencoders
Authors
서민지, 이기용
DOI
10.3745/JIPS.04.0197
Publication Date
2020-12
Journal
JIPS (Journal of Information Processing Systems), Vol. 16, No. 6
Pages
1407-1423