Publications
* indicates equal contribution
Selected publications
FedSRD: Sparsify-Reconstruct-Decompose for Communication-Efficient Federated Large Language Models Fine-Tuning (WWW 2026)
Guochen Yan, Luyuan Xie, Qingni Shen, Yuejian Fang, Zhonghai Wu
ACM arXiv Code
FedSRD (Sparsify-Reconstruct-Decompose) makes federated fine-tuning of LLMs communication-efficient: clients prune their low-rank updates with an importance-aware strategy, the server reconstructs the updates in full-rank space for robust aggregation under non-IID data, and the aggregate is decomposed back into sparse low-rank updates for broadcast.
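The three-step round can be sketched in numpy. This is a simplified illustration, not the paper's implementation: magnitude pruning stands in for the importance-aware strategy, and plain SVD stands in for the decomposition step; function names and the `keep_ratio`/`rank` parameters are assumptions.

```python
import numpy as np

def sparsify(delta, keep_ratio=0.1):
    """Keep only the largest-magnitude entries (a toy stand-in for the
    paper's importance-aware pruning criterion)."""
    flat = np.abs(delta).ravel()
    k = max(1, int(keep_ratio * flat.size))
    thresh = np.partition(flat, -k)[-k]
    return np.where(np.abs(delta) >= thresh, delta, 0.0)

def fedsrd_round(client_lora_updates, rank=4, keep_ratio=0.1):
    """One hypothetical round over clients' LoRA factors (B, A)."""
    # 1) Sparsify: prune each client's reconstructed update B @ A.
    sparse_updates = [sparsify(B @ A, keep_ratio) for B, A in client_lora_updates]
    # 2) Reconstruct/aggregate in full-rank space (robust under non-IID data).
    avg = np.mean(sparse_updates, axis=0)
    # 3) Decompose back to a rank-r update for broadcast.
    U, S, Vt = np.linalg.svd(avg, full_matrices=False)
    B_new = U[:, :rank] * S[:rank]
    A_new = Vt[:rank]
    return B_new, A_new
```

Aggregating in full-rank space before re-decomposing avoids averaging incompatible low-rank factorizations directly.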
FedVCK: Non-IID Robust and Communication-Efficient Federated Learning via Valuable Condensed Knowledge for Medical Image Analysis (AAAI 2025)
Guochen Yan, Luyuan Xie, Xinyi Gao, Wentao Zhang, Qingni Shen, Yuejian Fang, Zhonghai Wu
AAAI arXiv Code
FedVCK aggregates condensed knowledge distilled from each client’s local dataset. It enforces latent distribution constraints and uses hard-sample selection to improve robustness and communication efficiency under severe non-IID settings.
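A toy sketch of the client-side idea, under loud assumptions: the real method learns synthetic condensed samples under latent-distribution constraints, whereas this stand-in only mirrors the hard-sample-selection component by picking, per class, the samples the current model finds hardest.

```python
import numpy as np

def condense_client(X, y, model_probs, per_class=2):
    """Toy stand-in for FedVCK's client step: for each class, select the
    hardest local samples (lowest predicted probability for the true
    class) as the condensed knowledge sent to the server. The actual
    method optimizes synthetic samples; this only illustrates selection."""
    out_X, out_y = [], []
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        hardness = -model_probs[idx, c]          # low true-class prob = hard
        pick = idx[np.argsort(hardness)[-per_class:]]
        out_X.append(X[pick])
        out_y.append(y[pick])
    return np.concatenate(out_X), np.concatenate(out_y)
```

The server then trains on the union of clients' condensed sets, so communication scales with the (small) condensed set rather than with model size.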
NPA: Improving Large-scale Graph Neural Networks with Non-parametric Attention (SIGMOD/PODS 2024)
Wentao Zhang*, Guochen Yan*, Yu Shen, Yang Ling, Yangyu Tao, Bin Cui, Jian Tang
ACM Code
NPA is a plug-and-play, non-parametric attention module for scalable GNNs. It models feature relations within each neighborhood and across propagation steps, improving feature propagation and mitigating over-smoothing.
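The flavor of a non-parametric attention over propagation steps can be sketched as follows. This is an assumption-laden simplification, not the paper's exact scoring: each node weights its k-hop representations by cosine similarity to its original features, with no learned parameters.

```python
import numpy as np

def normalize_adj(A):
    """Symmetrically normalized adjacency with self-loops:
    D^{-1/2} (A + I) D^{-1/2}, as in common scalable GNNs."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return (A_hat * d_inv_sqrt[:, None]) * d_inv_sqrt[None, :]

def npa_like_propagate(A, X, k=3):
    """Sketch of non-parametric attention across propagation steps:
    score each hop's features per node by cosine similarity to the
    node's original features, softmax over hops, and combine. The
    paper's actual attention may differ."""
    P = normalize_adj(A)
    hops = [X]
    for _ in range(k):
        hops.append(P @ hops[-1])
    H = np.stack(hops)                                   # (k+1, n, d)
    Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-12)
    Hn = H / (np.linalg.norm(H, axis=2, keepdims=True) + 1e-12)
    scores = np.einsum('knd,nd->kn', Hn, Xn)             # cosine per hop/node
    w = np.exp(scores) / np.exp(scores).sum(axis=0)      # softmax over hops
    return np.einsum('kn,knd->nd', w, H)                 # weighted combination
```

Because the weights are computed, not learned, the module adds no parameters and composes with precomputation-based scalable GNN pipelines.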
A novel open-set clustering algorithm (Information Sciences 2023)
Qi Li*, Guochen Yan*, Shuliang Wang, Boxiang Zhao
ScienceDirect Code
This work recasts cluster identification as irregular-set identification. The algorithm is robust across data distributions, handles overlapping and Gaussian-distributed clusters, detects outliers, and performs strongly on real-world datasets.
