Publications
FedVCK: Non-IID Robust and Communication-Efficient Federated Learning via Valuable Condensed Knowledge for Medical Image Analysis
Published in AAAI 2025
The paper designs a novel federated learning method called FedVCK, which is non-IID robust and communication-efficient. Departing from the conventional paradigm of model aggregation, FedVCK aggregates condensed knowledge distilled from each client's local dataset. The quality of this condensed knowledge is ensured through a distribution matching mechanism enforced by Latent Distribution Constraints. To further increase the value and diversity of the collective knowledge, each client strategically selects hard samples, i.e., instances that are challenging for the current global model, thereby addressing deficiencies in the global model's knowledge. Experimental results show that FedVCK outperforms competitive baselines under severe non-IID scenarios and limited communication budgets; a sketch of the client-side idea is given below.
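The following is a minimal, illustrative sketch of the two client-side ingredients summarized above: selecting samples that are hard for the current global model, and condensing them into a small synthetic set by matching latent distributions. All function names and the simple mean-matching loss are assumptions for illustration, not the paper's exact formulation.

```python
# Illustrative sketch only; names and losses are assumed, not FedVCK's exact method.
import torch
import torch.nn.functional as F

def select_hard_samples(global_model, images, labels, num_hard):
    """Pick local samples the current global model handles poorly (low confidence
    on the true class), so the condensed knowledge targets deficiencies in the
    global model's knowledge."""
    with torch.no_grad():
        probs = F.softmax(global_model(images), dim=1)
        true_class_conf = probs[torch.arange(len(labels)), labels]
    hard_idx = torch.argsort(true_class_conf)[:num_hard]  # lowest confidence first
    return images[hard_idx], labels[hard_idx]

def condense_local_knowledge(encoder, real_images, synthetic_images, steps=200, lr=0.1):
    """Optimize a small synthetic set so its latent distribution matches that of the
    selected real samples (a simple mean-matching stand-in for the paper's latent
    distribution constraints)."""
    synthetic_images = synthetic_images.clone().requires_grad_(True)
    opt = torch.optim.SGD([synthetic_images], lr=lr)
    for _ in range(steps):
        real_feat = encoder(real_images).detach()
        syn_feat = encoder(synthetic_images)
        # Match the first moments of the two latent distributions.
        loss = F.mse_loss(syn_feat.mean(dim=0), real_feat.mean(dim=0))
        opt.zero_grad()
        loss.backward()
        opt.step()
    # Only this small synthetic set is communicated to the server,
    # instead of full model updates, which keeps communication cheap.
    return synthetic_images.detach()
```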
NPA: Improving Large-scale Graph Neural Networks with Non-parametric Attention
Published in SIGMOD/PODS 2024
The paper designs a plug-and-play non-parametric attention module called NPA to improve the performance of scalable GNNs. The key ideas are to 1) consider the feature relationship between a node itself and its neighborhood to support better propagation, and 2) consider the feature relationships between different propagation steps in a node-adaptive manner to alleviate over-smoothing, as sketched below. Experiments show that NPA is compatible with most scalable GNN models and enables better performance, deeper architectures and high scalability.
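Below is a minimal sketch of this kind of non-parametric, node-adaptive weighting applied on top of SGC-style pre-propagation. The cosine-similarity weighting is an assumption chosen to illustrate the idea of parameter-free, per-node attention over propagation steps; it is not the paper's exact formulation.

```python
# Illustrative sketch only; the similarity-based weights are an assumed stand-in.
import torch
import torch.nn.functional as F

def propagate(adj_norm, features, num_hops):
    """Pre-compute multi-hop propagated features X, AX, A^2X, ... (scalable-GNN style)."""
    hops = [features]
    for _ in range(num_hops):
        hops.append(torch.sparse.mm(adj_norm, hops[-1]))
    return hops  # list of (N, d) tensors, one per propagation step

def npa_combine(hops):
    """Combine propagation steps per node with non-parametric weights: each hop is
    weighted by its cosine similarity to the node's own hop-0 features, so nodes whose
    smoothed features drift away from their own representation down-weight deep hops,
    alleviating over-smoothing without introducing learnable parameters."""
    anchor = hops[0]
    sims = torch.stack(
        [F.cosine_similarity(h, anchor, dim=1) for h in hops], dim=1
    )                                   # (N, num_hops + 1)
    weights = F.softmax(sims, dim=1)    # node-adaptive, parameter-free attention
    stacked = torch.stack(hops, dim=1)  # (N, num_hops + 1, d)
    return (weights.unsqueeze(-1) * stacked).sum(dim=1)  # (N, d)
```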
A novel open-set clustering algorithm
Published in Information Sciences in 2023
The paper transforms cluster identification into irregular set identification. The proposed clustering algorithm is robust to various data distributions, more adaptive to overlapping and Gaussian clusters, and more stable under different parameters, while preserving the ability to detect outliers. It also outperforms baseline methods on real-world datasets in terms of accuracy, running time and parameter sensitivity.