Detail View
- 박시연; 이기용
- Web of Science citations: 0
- SCOPUS citations: 0

Abstract
This paper proposes Ppormer (PPR-based Diffusion Transformer), a novel graph neural network designed to improve node representation learning on weighted graphs. Ppormer incorporates three complementary types of information into the node representation process: (1) Global contextual information, captured via diffusion-based self-attention mechanisms that model semantic relationships across the entire graph; (2) Local structural information, obtained through Graph Convolutional Network (GCN)-based message passing over immediate neighbors; and (3) Global structural information, derived from Personalized PageRank (PPR) to reflect long-range topological relevance based on edge strength. These heterogeneous signals are adaptively fused using the proposed FusionAttention module. Experiments conducted on the Cora dataset, with edge weights computed using Jaccard, Canberra, and Euclidean similarities, demonstrate that Ppormer achieves accuracy scores of 84.00%, 83.94%, and 85.54%, respectively, outperforming all baseline models. These results validate the effectiveness and generalizability of Ppormer in various weighted graph scenarios.
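The paper's own implementation is not shown in this record. As a rough illustration of the PPR signal the abstract describes (long-range topological relevance weighted by edge strength), the following is a minimal power-iteration sketch on a weighted adjacency matrix; the function name, parameters, and toy graph are all hypothetical and not taken from the paper.

```python
import numpy as np

def personalized_pagerank(W, seed, alpha=0.15, iters=100):
    """Approximate Personalized PageRank scores on a weighted graph.

    W     : (n, n) non-negative weighted adjacency matrix
    seed  : index of the node to personalize toward
    alpha : teleport (restart) probability
    """
    n = W.shape[0]
    # Row-normalize edge weights into a transition matrix,
    # so stronger edges carry proportionally more random-walk mass.
    deg = W.sum(axis=1, keepdims=True)
    P = np.divide(W, deg, out=np.zeros_like(W, dtype=float), where=deg > 0)
    # Restart distribution concentrated on the seed node.
    e = np.zeros(n)
    e[seed] = 1.0
    pi = e.copy()
    for _ in range(iters):
        pi = alpha * e + (1 - alpha) * P.T @ pi
    return pi

# Toy weighted graph: nodes 0-1 share a strong edge, 1-2 a weak one.
W = np.array([[0.0, 2.0, 0.0],
              [2.0, 0.0, 0.5],
              [0.0, 0.5, 0.0]])
scores = personalized_pagerank(W, seed=0)
```

With the seed at node 0, the resulting scores rank node 0 highest, its strongly connected neighbor 1 next, and the weakly attached node 2 last, which is the kind of edge-strength-aware global structural signal the abstract attributes to the PPR component.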
Keywords
- Title
- Ppormer: 가중 그래프에 대한 효과적인 노드 표현 학습을 위한 PPR 기반 확산 트랜스포머
- Title (other language)
- Ppormer: A PPR-based Diffusion Transformer for Effective Node Representation Learning on Weighted Graphs
- Authors
- 박시연; 이기용
- Publication date
- 2025-05
- Journal
- 정보처리학회 논문지 (KIPS Transactions)
- Volume
- 14
- Issue
- 5
- Pages
- 372–378