Detail View
- 서민지
- 이기용
WEB OF SCIENCE
SCOPUS
Abstract
A graph is a data structure consisting of nodes and edges between those nodes. Graph embedding generates a low-dimensional vector for a given graph that best represents the characteristics of the graph. Recently, there have been studies on graph embedding, especially using deep learning techniques. However, until now, most deep learning-based graph embedding techniques have focused on unweighted graphs. Therefore, in this paper, we propose a graph embedding technique for weighted graphs based on long short-term memory (LSTM) autoencoders. Given weighted graphs, we traverse each graph to extract node-weight sequences from it. Each node-weight sequence represents a path in the graph consisting of nodes and the weights between these nodes. We then train an LSTM autoencoder on the extracted node-weight sequences and encode each node-weight sequence into a fixed-length vector using the trained LSTM autoencoder. Finally, for each graph, we collect the encoding vectors obtained from the graph and combine them to generate the final embedding vector for the graph. These embedding vectors can be used to classify weighted graphs or to search for similar weighted graphs. Experiments on synthetic and real datasets show that the proposed method is effective in measuring the similarity between weighted graphs.
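The first step of the pipeline described in the abstract, extracting node-weight sequences from a weighted graph, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the traversal strategy (uniform random walks of fixed length), the adjacency-dict graph representation, and the function name `extract_node_weight_sequences` are all assumptions for the sake of the example.

```python
import random

def extract_node_weight_sequences(graph, walk_length, num_walks, seed=0):
    """Extract node-weight sequences from a weighted graph.

    `graph` maps each node to a list of (neighbor, weight) pairs.
    Uniform random walks are an assumed traversal strategy; the paper's
    exact traversal method is not specified in the abstract.
    Each returned sequence interleaves nodes and edge weights, e.g.
    [n0, w01, n1, w12, n2], i.e. a path with the weights between nodes.
    """
    rng = random.Random(seed)
    sequences = []
    for start in graph:
        for _ in range(num_walks):
            seq = [start]
            node = start
            for _ in range(walk_length - 1):
                neighbors = graph.get(node)
                if not neighbors:  # dead end: stop this walk early
                    break
                node, weight = rng.choice(neighbors)
                seq.extend([weight, node])  # weight between the two nodes
            sequences.append(seq)
    return sequences

# Toy weighted graph: 0 --0.5-- 1 --2.0-- 2
g = {0: [(1, 0.5)], 1: [(0, 0.5), (2, 2.0)], 2: [(1, 2.0)]}
seqs = extract_node_weight_sequences(g, walk_length=3, num_walks=2)
```

These sequences would then be fed to an LSTM autoencoder, whose encoder output for each sequence is a fixed-length vector; the per-graph vectors are combined into the final graph embedding.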
Keywords
- Title
- A Graph Embedding Technique for Weighted Graphs Based on LSTM Autoencoders
- Authors
- 서민지; 이기용
- Publication Date
- 2020-12
- Volume
- 16
- Issue
- 6
- Pages
- 1407 ~ 1423