- 송지연;
- 이기용
Abstract
Graph Neural Networks (GNNs) are typically trained on static graphs whose structures remain unchanged. However, in real-world scenarios, graphs expand as new nodes and edges are added, requiring the model to be retrained on the entire graph to reflect these updates. The fine-tuning method, often used to mitigate the inefficiency of full retraining, also faces limitations when applied to GNNs: due to the message passing mechanism, improvements in computational efficiency are restricted, and performance on the original nodes often declines. To address these challenges, this paper proposes an efficient Graph Convolutional Network (GCN) update for expanded single large graphs. The proposed method maximizes computational efficiency by decomposing the message passing operation into two components: information pre-computed during pre-training and newly required computations. Furthermore, it alleviates performance degradation on the original nodes by setting all nodes in the expanded graph as the training target. Experimental results on real-world datasets demonstrate that the proposed method significantly reduces training time compared to full retraining and fine-tuning, while maintaining a performance level comparable to that of full retraining.
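The decomposition described in the abstract can be sketched for a single GCN-style aggregation step `A @ H`. This is a minimal, hypothetical NumPy illustration (the function name, shapes, and one-layer setup are assumptions, not the paper's actual code): for original nodes, the cached old-graph aggregation is reused and only the messages arriving from new nodes are added, while new nodes are aggregated from scratch.

```python
import numpy as np

def aggregate_incremental(S_old, A_full, H_full, n_old):
    """Recompute A_full @ H_full while reusing S_old = A_old @ H_old.

    Assumes the original node features are unchanged and that the
    top-left n_old x n_old block of A_full equals A_old (the original
    structure is preserved when the graph expands).
    """
    d = H_full.shape[1]
    S = np.empty((A_full.shape[0], d))
    # Old rows: cached old-to-old messages plus messages from new nodes only.
    S[:n_old] = S_old + A_full[:n_old, n_old:] @ H_full[n_old:]
    # New rows: no cached result exists, so compute them in full.
    S[n_old:] = A_full[n_old:] @ H_full
    return S

# Toy example: 3 original nodes, 2 newly added nodes.
rng = np.random.default_rng(0)
n_old, n_new, d = 3, 2, 4
A_old = rng.integers(0, 2, (n_old, n_old)).astype(float)
H_old = rng.normal(size=(n_old, d))
S_old = A_old @ H_old                       # cached during pre-training

A_full = rng.integers(0, 2, (n_old + n_new, n_old + n_new)).astype(float)
A_full[:n_old, :n_old] = A_old              # original edges are preserved
H_full = np.vstack([H_old, rng.normal(size=(n_new, d))])

S_inc = aggregate_incremental(S_old, A_full, H_full, n_old)
assert np.allclose(S_inc, A_full @ H_full)  # matches full recomputation
```

The point of the split is that the `A_full[:n_old, n_old:]` block is typically very sparse (few new edges touch old nodes), so the update cost scales with the expansion rather than with the whole graph.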
Keywords
- Title
- 확장된 단일 대형그래프를 위한 효율적 그래프 합성곱 신경망 업데이트
- Title (other language)
- Efficient Graph Convolutional Networks Update for Expanded Single Large Graph
- Authors
- 송지연; 이기용
- Publication date
- 2025-12
- Type
- Y
- Journal
- 정보처리학회 논문지
- Volume
- 14
- Issue
- 12
- Pages
- 1084 ~ 1090