Awesome Federated Machine Learning

Federated Learning (FL) is a machine learning framework that enables multiple devices to collaboratively train a shared model without compromising data privacy or security.


This repository will continue to collect and update everything about federated learning, including research papers, conferences, blogs and beyond.
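
Most of the papers collected below build on the same basic recipe: each client trains on its own private data and only model updates travel to a server, which aggregates them (as in FedAvg-style weighted averaging). The following is a minimal, self-contained sketch of that loop on synthetic data; all names, data, and hyperparameters are illustrative and are not taken from any library listed in this repository.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(w, X, y, lr=0.1, epochs=5):
    """One client's local training: a few gradient steps of linear
    regression using only that client's private data (X, y)."""
    w = w.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# Simulated private datasets: each client holds its own (X, y) and never shares it.
true_w = np.array([2.0, -3.0])
clients = []
for _ in range(5):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.1 * rng.normal(size=50)
    clients.append((X, y))

# FedAvg-style rounds: clients send model weights (not data); the server
# aggregates them, weighted by local dataset size.
global_w = np.zeros(2)
for rnd in range(20):
    local_ws, sizes = [], []
    for X, y in clients:
        local_ws.append(local_update(global_w, X, y))
        sizes.append(len(y))
    sizes = np.array(sizes, dtype=float)
    global_w = np.average(local_ws, axis=0, weights=sizes / sizes.sum())

print("learned global model:", global_w)  # close to true_w
```

In real deployments this averaging step is typically combined with client sampling, secure aggregation, or differential privacy, which many of the papers below study.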

Table of Contents

Top Machine Learning conferences

In this section, we summarize the Federated Learning papers accepted by top machine learning conferences, including NeurIPS, ICML, and ICLR.

ICML

| Conference | Title | Affiliation | Slide & Code |
|---|---|---|---|
| ICML 2021 | Gradient Disaggregation: Breaking Privacy in Federated Learning by Reconstructing the User Participant Matrix | Harvard University | code |
| | FL-NTK: A Neural Tangent Kernel-based Framework for Federated Learning Analysis | Peking University; Yale University | |
| | Personalized Federated Learning using Hypernetworks | Bar-Ilan University; NVIDIA Research | code; materials |
| | Federated Composite Optimization | Stanford University; Google | code |
| | Exploiting Shared Representations for Personalized Federated Learning | University of Texas at Austin; University of Pennsylvania | code |
| | Data-Free Knowledge Distillation for Heterogeneous Federated Learning | Michigan State University | code |
| | Federated Continual Learning with Weighted Inter-client Transfer | KAIST | code |
| | Federated Deep AUC Maximization for Heterogeneous Data with a Constant Communication Complexity | The University of Iowa | |
| | Bias-Variance Reduced Local SGD for Less Heterogeneous Federated Learning | The University of Tokyo | |
| | Federated Learning of User Verification Models Without Sharing Embeddings | Qualcomm AI Research | |
| | Clustered Sampling: Low-Variance and Improved Representativity for Clients Selection in Federated Learning | Accenture | code |
| | Ditto: Fair and Robust Federated Learning Through Personalization | CMU; Facebook | code |
| | Heterogeneity for the Win: One-Shot Federated Clustering | CMU | code |
| | The Distributed Discrete Gaussian Mechanism for Federated Learning with Secure Aggregation | Google | |
| | Debiasing Model Updates for Improving Personalized Federated Training | Boston University; Arm Research | |
| | One for One, or All for All: Equilibria and Optimality of Collaboration in Federated Learning | Toyota Technological Institute of Chicago; University of California, Berkeley; Cornell University | |
| | CRFL: Certifiably Robust Federated Learning against Backdoor Attacks | University of Illinois at Urbana-Champaign; IBM Research; Zhejiang University | code |
| | Federated Learning under Arbitrary Communication Patterns | Indiana University, Bloomington; Amazon | |
| ICML 2020 | FedBoost: A Communication-Efficient Algorithm for Federated Learning | Google Research | Video |
| | FetchSGD: Communication-Efficient Federated Learning with Sketching | UC Berkeley; Johns Hopkins University; Amazon | Video; Code |
| | SCAFFOLD: Stochastic Controlled Averaging for Federated Learning | EPFL; Google Research | Video |
| | Federated Learning with Only Positive Labels | Google Research | Video |
| | From Local SGD to Local Fixed-Point Methods for Federated Learning | Moscow Institute of Physics and Technology; KAUST | Slide; Video |
| | Acceleration for Compressed Gradient Descent in Distributed and Federated Optimization | KAUST | Slide; Video |
| ICML 2019 | Bayesian Nonparametric Federated Learning of Neural Networks | IBM Research | Code |
| | Analyzing Federated Learning through an Adversarial Lens | Princeton University; IBM Research | Code |
| | Agnostic Federated Learning | Google Research | |

ICLR

| Conference | Title | Affiliation | Slide & Code |
|---|---|---|---|
| ICLR 2021 | Federated Learning Based on Dynamic Regularization | Boston University; ARM | |
| | Achieving Linear Speedup with Partial Worker Participation in Non-IID Federated Learning | The Ohio State University | |
| | HeteroFL: Computation and Communication Efficient Federated Learning for Heterogeneous Clients | Duke University | code |
| | FedMix: Approximation of Mixup under Mean Augmented Federated Learning | KAIST | |
| | Federated Learning via Posterior Averaging: A New Perspective and Practical Algorithms | CMU; Google | code |
| | Adaptive Federated Optimization | Google | code |
| | Personalized Federated Learning with First Order Model Optimization | Stanford University; NVIDIA | |
| | FedBN: Federated Learning on Non-IID Features via Local Batch Normalization | Princeton University | code |
| | FedBE: Making Bayesian Model Ensemble Applicable to Federated Learning | The Ohio State University | |
| | Federated Semi-Supervised Learning with Inter-Client Consistency & Disjoint Learning | KAIST | code |
| ICLR 2020 | Federated Adversarial Domain Adaptation | Boston University; Columbia University; Rutgers University | |
| | DBA: Distributed Backdoor Attacks against Federated Learning | Zhejiang University; IBM Research | Code |
| | Fair Resource Allocation in Federated Learning | CMU; Facebook AI | Code |
| | Federated Learning with Matched Averaging | University of Wisconsin-Madison; IBM Research | Code |
| | Differentially Private Meta-Learning | CMU | |
| | Generative Models for Effective ML on Private, Decentralized Datasets | Google | Code |
| | On the Convergence of FedAvg on Non-IID Data | Peking University | Code |

NeurIPS

| Conference | Title | Affiliation | Slide & Code |
|---|---|---|---|
| NeurIPS 2020 | Differentially-Private Federated Linear Bandits | MIT | code |
| | Federated Principal Component Analysis | University of Cambridge; Quine Technologies | code |
| | FedSplit: an algorithmic framework for fast federated optimization | UC Berkeley | |
| | Federated Bayesian Optimization via Thompson Sampling | NUS; MIT | |
| | Lower Bounds and Optimal Algorithms for Personalized Federated Learning | KAUST | |
| | Robust Federated Learning: The Case of Affine Distribution Shifts | UC Santa Barbara; MIT | |
| | An Efficient Framework for Clustered Federated Learning | UC Berkeley; DeepMind | Code |
| | Distributionally Robust Federated Averaging | Pennsylvania State University | Code |
| | Personalized Federated Learning with Moreau Envelopes | The University of Sydney | code |
| | Personalized Federated Learning with Theoretical Guarantees: A Model-Agnostic Meta-Learning Approach | MIT; UT Austin | |
| | Group Knowledge Transfer: Federated Learning of Large CNNs at the Edge | University of Southern California | code |
| | Tackling the Objective Inconsistency Problem in Heterogeneous Federated Optimization | CMU; Princeton University | |
| | Attack of the Tails: Yes, You Really Can Backdoor Federated Learning | University of Wisconsin-Madison | |
| | Federated Accelerated Stochastic Gradient Descent | Stanford University | code |
| | Inverting Gradients - How easy is it to break privacy in federated learning? | University of Siegen | code |
| | Ensemble Distillation for Robust Model Fusion in Federated Learning | EPFL | |
| | Throughput-Optimal Topology Design for Cross-Silo Federated Learning | INRIA | code |

Others

| Conference | Title | Affiliation | Slide & Code |
|---|---|---|---|
| AISTATS 2020 | FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization | UC Santa Barbara; UT Austin | Supplementary |
| | How To Backdoor Federated Learning | Cornell Tech | Supplementary |
| | Federated Heavy Hitters Discovery with Differential Privacy | RPI; Google | Supplementary |

Top CV conferences

In this section, we summarize the Federated Learning papers accepted by top computer vision conferences, including CVPR, ICCV, and ECCV.

CVPR

| Conference | Title | Affiliation | Slide & Code |
|---|---|---|---|
| CVPR 2021 | Multi-Institutional Collaborations for Improving Deep Learning-Based Magnetic Resonance Image Reconstruction Using Federated Learning | Johns Hopkins University | code |
| | Model-Contrastive Federated Learning | National University of Singapore; UC Berkeley | code |
| | FedDG: Federated Domain Generalization on Medical Image Segmentation via Episodic Learning in Continuous Frequency Space | The Chinese University of Hong Kong | code |
| | Soteria: Provable Defense Against Privacy Leakage in Federated Learning From Representation Perspective | Duke University | code |

ECCV

| Conference | Title | Affiliation | Slide & Code |
|---|---|---|---|
| ECCV 2020 | Federated Visual Classification with Real-World Data Distribution | MIT; Google | Video |

Books

Papers

1. Personalization

Personalized federated learning refers to training a model for each client, based on the client’s own dataset and the datasets of other clients. There are two major motivations for personalized federated learning (a toy sketch of one common recipe follows the list below):

  • Due to statistical heterogeneity across clients, a single global model may not be a good choice for every client. Sometimes a local model trained solely on a client's private data performs better than the shared global model.
  • Different clients need models customized to their own environment. As an example of model heterogeneity, consider the sentence “I live in .....”: the next-word prediction task applied to this sentence needs to predict a different answer for each user. More generally, different clients may assign different labels to the same data.
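
As a concrete illustration of the fine-tuning flavour of personalization (compare the transfer-learning rows in the table below), here is a toy sketch: a shared global model is trained with simple federated averaging and then adapted locally on each client's heterogeneous data. Everything here (data, model, hyperparameters) is synthetic and illustrative, not taken from any paper in this list.

```python
import numpy as np

rng = np.random.default_rng(1)

def sgd(w, X, y, lr=0.1, epochs=5):
    """A few gradient steps of linear regression on one client's data."""
    w = w.copy()
    for _ in range(epochs):
        w -= lr * X.T @ (X @ w - y) / len(y)
    return w

# Statistically heterogeneous clients: each has its own "true" model.
client_truths = [np.array([2.0, -3.0]), np.array([-1.0, 4.0]), np.array([0.5, 0.5])]
clients = []
for w_true in client_truths:
    X = rng.normal(size=(40, 2))
    clients.append((X, X @ w_true + 0.1 * rng.normal(size=40)))

# Step 1: train a single global model with FedAvg-style averaging.
global_w = np.zeros(2)
for _ in range(20):
    global_w = np.mean([sgd(global_w, X, y) for X, y in clients], axis=0)

# Step 2: personalize by fine-tuning the global model locally on each client.
personalized = [sgd(global_w, X, y, epochs=20) for X, y in clients]

for w_true, w_p in zip(client_truths, personalized):
    print("true:", w_true, "-> personalized:", np.round(w_p, 2))
```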

Personalized federated learning survey paper:

| Methodology | Title | Affiliation | Slide & Code |
|---|---|---|---|
| Multi-Task Learning | Federated Multi-Task Learning | Stanford; USC; CMU | |
| | Variational Federated Multi-Task Learning | ETH Zurich | |
| Meta Learning | Personalized Federated Learning: A Meta-Learning Approach | MIT | |
| | Improving Federated Learning Personalization via Model Agnostic Meta Learning | University of Washington; Google | |
| | Adaptive Gradient-Based Meta-Learning Methods | CMU | |
| | Federated Meta-Learning with Fast Convergence and Efficient Communication | Huawei Noah’s Ark Lab | |
| Mixture of Global and Local Models | Federated Learning of a Mixture of Global and Local Models | KAUST | |
| | Federated User Representation Learning | University of Michigan; Facebook | |
| | Adaptive Personalized Federated Learning | The Pennsylvania State University | |
| Personalization Layers | Federated Learning with Personalization Layers | Adobe Research; Indian Institute of Technology | |
| | Think Locally, Act Globally: Federated Learning with Local and Global Representations | CMU; University of Tokyo; Columbia University | |
| Transfer Learning | Federated evaluation of on-device personalization | Google | |
| | Salvaging Federated Learning by Local Adaptation | Cornell University | |
| | Private Federated Learning with Domain Adaptation | Oracle Labs | |
| Clustering | Clustered Federated Learning: Model-Agnostic Distributed Multi-Task Optimization under Privacy Constraints | Fraunhofer Heinrich Hertz Institute | Code |
| | An Efficient Framework for Clustered Federated Learning | UC Berkeley; DeepMind | Code |
| | Robust Federated Learning in a Heterogeneous Environment | UC Berkeley | |

2. Recommender system

Recommender systems (RecSys) are widely used to alleviate information overload. In general, the more data a RecSys can use, the better the recommendation performance it can obtain.

Traditionally, a RecSys requires data that are distributed across multiple devices to be uploaded to a central database for model training. However, due to privacy and security concerns, such direct sharing of user data is no longer appropriate.

Combining federated learning with RecSys is a promising approach that can alleviate the risk of privacy leakage.
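
Most of the matrix-factorization papers below follow the same split: each user's ratings and user factor stay on the device, and only updates to the shared item factors are sent to the server for aggregation. Below is a toy, single-process simulation of that split in plain NumPy; the variable names and hyperparameters are illustrative, and a real system would additionally protect the uploaded gradients (e.g. with secure aggregation, differential privacy, or homomorphic encryption, as in the papers of Section 3.2).

```python
import numpy as np

rng = np.random.default_rng(2)
n_users, n_items, k = 8, 12, 3

# Synthetic ratings; each "client" is one user and keeps its row private.
U_true = rng.normal(size=(n_users, k))
V_true = rng.normal(size=(n_items, k))
ratings = U_true @ V_true.T + 0.1 * rng.normal(size=(n_users, n_items))

# Server state: shared item factors V. Client state: each user's own factor.
V = 0.1 * rng.normal(size=(n_items, k))
user_factors = [0.1 * rng.normal(size=k) for _ in range(n_users)]
lr = 0.05

for rnd in range(200):
    V_grad = np.zeros_like(V)
    for i in range(n_users):
        r = ratings[i]                          # private ratings, never uploaded
        u = user_factors[i]
        err = u @ V.T - r                       # prediction error on this client's items
        user_factors[i] = u - lr * err @ V      # user factor is updated locally
        V_grad += np.outer(err, u)              # only item-factor gradients leave the client
    V -= lr * V_grad / n_users                  # server aggregates and updates item factors

pred = np.array(user_factors) @ V.T
print("reconstruction RMSE:", np.sqrt(np.mean((pred - ratings) ** 2)))
```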

| Methodology | Title | Conference | Slide & Code |
|---|---|---|---|
| Matrix Factorization | Secure federated matrix factorization | IEEE Intelligent Systems | |
| | Federated Multi-view Matrix Factorization for Personalized Recommendations | ECML-PKDD 2020 | video |
| | Decentralized Recommendation Based on Matrix Factorization: A Comparison of Gossip and Federated Learning | ECML-PKDD 2019 | |
| | Towards Privacy-preserving Mobile Applications with Federated Learning: The Case of Matrix Factorization | MobiSys 2019 | |
| | Meta Matrix Factorization for Federated Rating Predictions | ACM SIGIR 2020 | code |
| | Federated Collaborative Filtering for Privacy-Preserving Personalized Recommendation System | Arxiv | |
| GNN | FedGNN: Federated Graph Neural Network for Privacy-Preserving Recommendation | Arxiv | |

 

3. Security

3.1. Attack

| Methodology | Title | Conference | Slide & Code |
|---|---|---|---|
| Backdoor Attack | How To Backdoor Federated Learning | AISTATS 2020 | code |
| | Can You Really Backdoor Federated Learning? | Arxiv | |
| | Attack of the Tails: Yes, You Really Can Backdoor Federated Learning | NeurIPS 2020 | code |
| | DBA: Distributed Backdoor Attacks against Federated Learning | ICLR 2020 | code |

 

3.2. Defense

| Methodology | Title | Conference | Slide & Code |
|---|---|---|---|
| Differential Privacy | Federated Learning With Differential Privacy: Algorithms and Performance Analysis | IEEE Transactions on Information Forensics and Security | |
| | Differentially Private Federated Learning: A Client Level Perspective | Arxiv | code |
| | Learning Differentially Private Recurrent Language Models | ICLR 2018 | |
| Homomorphic Encryption | Private federated learning on vertically partitioned data via entity resolution and additively homomorphic encryption | Arxiv | |
| | BatchCrypt: Efficient Homomorphic Encryption for Cross-Silo Federated Learning | USENIX 2020 | code |
| | A Little Is Enough: Circumventing Defenses For Distributed Learning | NeurIPS 2019 | |

 

4. Survey

| Category | Title |
|---|---|
| General | Federated machine learning: Concept and applications |
| | A Survey on Federated Learning Systems: Vision, Hype and Reality for Data Privacy and Protection |
| | Federated Learning in Mobile Edge Networks: A Comprehensive Survey |
| | Advances and Open Problems in Federated Learning |
| | Federated Learning: Challenges, Methods, and Future Directions |
| Security | A survey on security and privacy of federated learning |
| | Threats to Federated Learning: A Survey |
| | Vulnerabilities in Federated Learning |
| Personalization | Survey of Personalization Techniques for Federated Learning |
| Aggregation | Emerging Trends in Federated Learning: From Model Fusion to Federated X Learning |
| Incentive | A Comprehensive Survey of Incentive Mechanism for Federated Learning |
| | A Survey of Incentive Mechanism Design for Federated Learning |
| Applications | A Survey on Federated Learning and its Applications for Accelerating Industrial Internet of Things |

 

5. System Design

| System | Paper |
|---|---|
| TensorFlow Federated | Towards Federated Learning at Scale: System Design |
| PySyft | A generic framework for privacy preserving deep learning |
| FedML | FedML: A Research Library and Benchmark for Federated Machine Learning |
| OpenFL | OpenFL: An open-source framework for Federated Learning |

 

6. Communication-Efficient

 

7. Optimization

 

8. Fairness

 

9. Applications

| Application | Title | Company | Slide & Code |
|---|---|---|---|
| Computer Vision | FedVision: An Online Visual Object Detection Platform Powered by Federated Learning | WeBank (AAAI 2020) | code |
| Natural Language Processing | Federated learning for emoji prediction in a mobile keyboard | Google | |
| | Federated Learning for Mobile Keyboard Prediction | Google | |
| | Applied federated learning: Improving google keyboard query suggestions | Google | |
| | Federated Learning Of Out-Of-Vocabulary Words | Google | |
| Automatic Speech Recognition | A Federated Approach in Training Acoustic Models | Microsoft (INTERSPEECH 2020) | Video |
| | Privacy-Preserving Adversarial Representation Learning in ASR: Reality or Illusion? | INRIA (INTERSPEECH 2019) | |
| | Training Speech Recognition Models with Federated Learning: A Quality/Cost Framework | Google (ICASSP 2021) | Google Assistant Help |
| | Federated Evaluation and Tuning for On-Device Personalization: System Design & Applications | Apple | Report |
| Healthcare | Privacy-preserving Federated Brain Tumour Segmentation | NVIDIA (MICCAI MLMI 2019) | |
| | Advancing health research with Google Health Studies | Google | Blog |
| | Multi-institutional Deep Learning Modeling Without Sharing Patient Data: A Feasibility Study on Brain Tumor Segmentation | Intel | Blog |
| Blockchain | FedCoin: A Peer-to-Peer Payment System for Federated Learning | Arxiv | |
| | Blockchained On-Device Federated Learning | IEEE Communications Letters 2019 | |

 

Talks and Tutorials

 

Conferences and Workshops

 

Blogs

 

Open-Sources

⬆ Return to top
