This paper introduces FedDBP, a federated prototype learning method designed to improve performance in heterogeneous federated learning scenarios. FedDBP employs a dual-branch feature projector on the client side, using L2 alignment and contrastive learning to balance feature fidelity and discriminability. On the server side, it uses Fisher information to personalize the fusion of global prototypes, emphasizing important channels from local prototypes. Experiments show FedDBP outperforms ten existing federated learning methods.
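The dual-branch idea can be sketched as two complementary loss terms: an L2 alignment term that keeps local features faithful to the class prototype, and a contrastive term that keeps them separable across classes. The following is a minimal NumPy sketch under assumed shapes (features `(N, D)`, class prototypes `(C, D)`); the function names, the InfoNCE-style formulation, the temperature `tau`, and the balancing weight `lam` are illustrative assumptions, not the paper's exact objective.

```python
import numpy as np

def l2_alignment_loss(local_feat, class_protos):
    # Fidelity branch: pull each local feature toward its class prototype (L2 alignment).
    return np.mean(np.sum((local_feat - class_protos) ** 2, axis=1))

def contrastive_loss(feat, protos, labels, tau=0.5):
    # Discriminability branch: InfoNCE-style contrast of features against all class prototypes.
    feat = feat / np.linalg.norm(feat, axis=1, keepdims=True)
    protos = protos / np.linalg.norm(protos, axis=1, keepdims=True)
    logits = feat @ protos.T / tau                 # cosine similarity to every class prototype
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(log_probs[np.arange(len(labels)), labels])

def dual_branch_loss(feat, protos, labels, lam=1.0):
    # Combined client objective: fidelity (L2) plus discriminability (contrastive).
    return l2_alignment_loss(feat, protos[labels]) + lam * contrastive_loss(feat, protos, labels)
```

The L2 term alone can collapse features onto prototypes without keeping classes apart; the contrastive term alone can distort features away from the shared representation. Summing them trades off the two, which is the balance the abstract describes.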
Stop averaging prototypes blindly: FedDBP uses Fisher information to intelligently fuse local prototypes, significantly boosting performance in heterogeneous federated learning.
Federated prototype learning (FPL), as a solution to heterogeneous federated learning (HFL), effectively alleviates the challenges of data and model heterogeneity. However, existing FPL methods fail to balance the fidelity and discriminability of features, and are limited by a single global prototype. In this paper, we propose FedDBP, a novel FPL method to address the above issues. On the client side, we design a Dual-Branch feature projector that employs L2 alignment and contrastive learning simultaneously, thereby ensuring both the fidelity and discriminability of local features. On the server side, we introduce a Personalized global prototype fusion approach that leverages Fisher information to identify the important channels of local prototypes. Extensive experiments demonstrate the superiority of FedDBP over ten existing advanced methods.
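The server-side fusion described above can be sketched as a channel-wise weighted average: each client's local prototype contributes more on the channels where its diagonal Fisher information is high. This is a minimal NumPy sketch under assumptions; the diagonal (squared-gradient) Fisher approximation and the normalization scheme are illustrative choices, not necessarily the paper's exact procedure.

```python
import numpy as np

def fisher_channel_importance(grads):
    # Diagonal Fisher approximation: mean squared per-sample gradient for each channel.
    # grads: (num_samples, dim) gradients of the loss w.r.t. the feature channels.
    return np.mean(grads ** 2, axis=0)

def personalized_fusion(local_protos, fishers):
    # Fuse one class's local prototypes channel-wise, weighting each client by its
    # normalized Fisher score so informative channels dominate the global prototype.
    fishers = np.stack(fishers)                                  # (num_clients, dim)
    weights = fishers / (fishers.sum(axis=0, keepdims=True) + 1e-12)
    return np.sum(weights * np.stack(local_protos), axis=0)     # (dim,)
```

Compared with plain averaging, this lets a client whose data barely activates a channel defer to clients that actually inform it, which is the "personalized" aspect of the fusion.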