- Seo, Jungwon;
- Catak, Ferhat Ozgur;
- Rong, Chunming;
- Hong, Kibeom;
- Kim, Minhoe
Abstract
Federated Learning (FL) enables privacy-preserving multi-source information fusion (MSIF) but suffers from client drift in highly heterogeneous data settings. Many existing approaches mitigate drift by providing clients with common reference points, typically derived from past information, to align objectives or gradient directions. However, under severe partial participation, such history-dependent references may become unreliable, as the set of client data distributions participating in each round can vary drastically. To overcome this limitation, we propose a method that mitigates client drift without relying on past information by constraining the update space through Gradient Centralization (GC). Specifically, we introduce Local GC and Global GC, which apply GC at the local and global update stages, respectively, and further present GC-Fed, a hybrid formulation that generalizes both. Theoretical analysis and extensive experiments on benchmark FL tasks demonstrate that GC-Fed effectively alleviates client drift and achieves up to 20% accuracy improvement under heterogeneous data and partial participation conditions.
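The core operation behind the abstract, Gradient Centralization, subtracts the mean over a weight gradient's input dimensions so each centered slice sums to zero. The sketch below is a minimal illustration, not the paper's implementation; the function name, toy client gradients, and the Local/Global placement are assumptions for demonstration:

```python
import numpy as np

def gradient_centralization(grad):
    """Center a weight gradient by subtracting its mean over the input
    dimensions (Gradient Centralization). For a 2-D gradient of shape
    (out_features, in_features), each centered row then sums to zero,
    constraining updates to a zero-mean subspace."""
    if grad.ndim < 2:
        return grad  # GC is typically applied only to weight matrices
    axes = tuple(range(1, grad.ndim))
    return grad - grad.mean(axis=axes, keepdims=True)

# Hypothetical illustration: "Local GC" centers each client's gradient
# before aggregation; "Global GC" centers the server-side aggregate.
rng = np.random.default_rng(0)
client_grads = [rng.standard_normal((4, 8)) for _ in range(3)]

# Local GC: center each client gradient, then average (FedAvg-style)
local_gc = np.mean([gradient_centralization(g) for g in client_grads], axis=0)

# Global GC: average the raw gradients, then center the aggregate
global_gc = gradient_centralization(np.mean(client_grads, axis=0))

# Centering is linear, so on a one-shot average the two coincide; the
# paper's Local/Global distinction matters once clients take multiple
# local steps and only a subset of clients participates per round.
assert np.allclose(local_gc, global_gc)
```

Because centering commutes with averaging, the interesting behavioral differences between Local GC, Global GC, and the hybrid GC-Fed arise only under multi-step local training and partial participation, which this toy example does not model.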
Keywords
- Title
- GC-Fed: Gradient centralized federated learning with partial client participation
- Authors
- Seo, Jungwon; Catak, Ferhat Ozgur; Rong, Chunming; Hong, Kibeom; Kim, Minhoe
- Publication Date
- 2026-07
- Type
- Article
- Volume
- 131