BDeMo/pFedBreD_public


Overview

This repository contains the code of pFedBreD, accepted at NeurIPS 2023. We provide three implementations of pFedBreD under Jaynes's rule (pFedBreD_ns) and three baselines from the main table of the comparative experiments.

Abstract

Classical federated learning (FL) enables training machine learning models without sharing data, preserving privacy, but heterogeneous data characteristics degrade the performance of localized models. Personalized FL (PFL) addresses this by synthesizing personalized models from a global model via training on local data. However, such a global model may overlook the specific information of the clients that have been sampled. In this paper, we propose a novel scheme that injects personalized prior knowledge into the global model on each client, attempting to mitigate the incomplete-information problem introduced in PFL. At the heart of our proposed approach is a framework, PFL with Bregman divergence (pFedBreD), which decouples the personalized prior from the local objective function regularized by Bregman divergence, for greater adaptability in personalized scenarios. We also relax the mirror descent (RMD) to extract the prior explicitly and provide optional strategies...
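To make the Bregman-regularized local objective concrete, here is a minimal sketch (not from this repository; the function names, the quadratic choice of the mirror map phi, and the regularization weight are illustrative assumptions). With phi(x) = 0.5 * ||x||^2, the Bregman divergence reduces to the familiar squared-Euclidean proximal term used by prox-based PFL methods.

```python
import numpy as np

def bregman_divergence(phi, grad_phi, x, y):
    """D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>."""
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# Illustrative mirror map: phi(x) = 0.5 * ||x||^2, so that
# D_phi(x, y) = 0.5 * ||x - y||^2 (squared Euclidean distance).
phi = lambda x: 0.5 * np.dot(x, x)
grad_phi = lambda x: x

def local_objective(theta, prior_mean, local_loss, lam=1.0):
    """Toy local objective: empirical loss plus Bregman-divergence
    regularization pulling theta toward a personalized prior mean."""
    return local_loss(theta) + lam * bregman_divergence(phi, grad_phi, theta, prior_mean)
```

For example, with a zero local loss and a zero prior mean, the objective at theta = (1, 2) is just the divergence 0.5 * (1 + 4) = 2.5.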

Preprint

Arxiv Preprint

OpenReview (not available yet)


Others About Paper

Poster:


Zhihu

Tutorial

Requirements and Dataset

To install requirements:

pip install -r requirements.txt

  1. If you have no data yet, first run data/*/generate_... .
  2. Then try the default hyperparameter setting with whatever --totalepoch you want by running main_fl.

More scripts will be released soon.

PS: if you encounter any weird identifiers (e.g., fo/mfo/mg), they are respectively old names of, or unrelated (or additional) variants of, the current methods (e.g., lg/meg/mh).

Optional Baselines

FedAvg, FedProx, pFedMe, Per-FedAvg, FedEM, FedAMP, pFedBayes, Ditto, FedFomo, FedHN, FedPAC

Citation

If you use our code or wish to refer to our results, please use the following BibTeX entry:

@inproceedings{shi2023prior,
  title     = {PRIOR: Personalized Prior for Reactivating the Information Overlooked in Federated Learning},
  author    = {Mingjia Shi and Yuhao Zhou and Kai Wang and Huaizheng Zhang and Shudong Huang and Qing Ye and Jiancheng Lv},
  booktitle = {Proceedings of the 37th NeurIPS},
  year      = {2023}
}
