Awesome Federated Machine Learning

Federated Learning (FL) is a new machine learning framework that enables multiple devices to collaboratively train a shared model without compromising data privacy and security.


This repository will continue to collect and update everything about federated learning, including research papers, conferences, blogs and beyond.

Table of Contents

Top Machine Learning conferences

In this section, we summarize Federated Learning papers accepted by top machine learning conferences, including NeurIPS, ICML, and ICLR.

ICML

Conferences Title Affiliation Slide & Code
ICML 2021 Gradient Disaggregation: Breaking Privacy in Federated Learning by Reconstructing the User Participant Matrix Harvard University code
FL-NTK: A Neural Tangent Kernel-based Framework for Federated Learning Analysis Peking University; Yale University
Personalized Federated Learning using Hypernetworks Bar-Ilan University; NVIDIA Research code
materials
Federated Composite Optimization Stanford University; Google code
Exploiting Shared Representations for Personalized Federated Learning University of Texas at Austin; University of Pennsylvania code
Data-Free Knowledge Distillation for Heterogeneous Federated Learning Michigan State University code
Federated Continual Learning with Weighted Inter-client Transfer KAIST code
Federated Deep AUC Maximization for Heterogeneous Data with a Constant Communication Complexity The University of Iowa
Bias-Variance Reduced Local SGD for Less Heterogeneous Federated Learning The University of Tokyo
Federated Learning of User Verification Models Without Sharing Embeddings Qualcomm AI Research
Clustered Sampling: Low-Variance and Improved Representativity for Clients Selection in Federated Learning Accenture code
Ditto: Fair and Robust Federated Learning Through Personalization CMU; Facebook AI code
video
Heterogeneity for the Win: One-Shot Federated Clustering CMU code
The Distributed Discrete Gaussian Mechanism for Federated Learning with Secure Aggregation Google
Debiasing Model Updates for Improving Personalized Federated Training Boston University; Arm Research
One for One, or All for All: Equilibria and Optimality of Collaboration in Federated Learning Toyota Technological Institute of Chicago; University of California, Berkeley; Cornell University
CRFL: Certifiably Robust Federated Learning against Backdoor Attacks University of Illinois at Urbana-Champaign; IBM Research; Zhejiang University code
Federated Learning under Arbitrary Communication Patterns Indiana University, Bloomington; Amazon
ICML 2020 FedBoost: A Communication-Efficient Algorithm for Federated Learning Google Research Video
FetchSGD: Communication-Efficient Federated Learning with Sketching UC Berkeley;
Johns Hopkins University;
Amazon
Video
Code
SCAFFOLD: Stochastic Controlled Averaging for Federated Learning EPFL;
Google Research
Video
Federated Learning with Only Positive Labels Google Research Video
From Local SGD to Local Fixed-Point Methods for Federated Learning Moscow Institute of Physics and Technology;
KAUST
Slide
Video
Acceleration for Compressed Gradient Descent in Distributed and Federated Optimization KAUST Slide
Video
ICML 2019 Bayesian Nonparametric Federated Learning of Neural Networks IBM Research Code
Analyzing Federated Learning through an Adversarial Lens Princeton University;
IBM Research
Code
Agnostic Federated Learning Google Research

ICLR

Conferences Title Affiliation Slide & Code
ICLR 2021 Federated Learning Based on Dynamic Regularization Boston University; ARM
Achieving Linear Speedup with Partial Worker Participation in Non-IID Federated Learning The Ohio State University
HeteroFL: Computation and Communication Efficient Federated Learning for Heterogeneous Clients Duke University code
FedMix: Approximation of Mixup under Mean Augmented Federated Learning KAIST
Federated Learning via Posterior Averaging: A New Perspective and Practical Algorithms CMU; Google code
Adaptive Federated Optimization Google code
Personalized Federated Learning with First Order Model Optimization Stanford University; NVIDIA
FedBN: Federated Learning on Non-IID Features via Local Batch Normalization Princeton University code
FedBE: Making Bayesian Model Ensemble Applicable to Federated Learning The Ohio State University
Federated Semi-Supervised Learning with Inter-Client Consistency & Disjoint Learning KAIST code
ICLR 2020 Federated Adversarial Domain Adaptation Boston University;
Columbia University;
Rutgers University
DBA: Distributed Backdoor Attacks against Federated Learning Zhejiang University;
IBM Research
Code
Fair Resource Allocation in Federated Learning CMU;
Facebook AI
Code
Federated Learning with Matched Averaging University of Wisconsin-Madison;
IBM Research
Code
Differentially Private Meta-Learning CMU
Generative Models for Effective ML on Private, Decentralized Datasets Google Code
On the Convergence of FedAvg on Non-IID Data Peking University Code

NeurIPS

Conferences Title Affiliation Slide & Code
NeurIPS 2021 Sageflow: Robust Federated Learning against Both Stragglers and Adversaries KAIST supplementary
Catastrophic Data Leakage in Vertical Federated Learning Rensselaer Polytechnic Institute;
IBM Research
code
supplementary
Fault-Tolerant Federated Reinforcement Learning with Theoretical Guarantee NUS code
supplementary
Optimality and Stability in Federated Learning: A Game-theoretic Approach Cornell University code
supplementary
QuPeD: Quantized Personalization via Distillation with Applications to Federated Learning UCLA supplementary
The Skellam Mechanism for Differentially Private Federated Learning Google Research;
CMU
supplementary
No Fear of Heterogeneity: Classifier Calibration for Federated Learning with Non-IID Data NUS;
Huawei Noah’s Ark Lab
supplementary
STEM: A Stochastic Two-Sided Momentum Algorithm Achieving Near-Optimal Sample and Communication Complexities for Federated Learning University of Minnesota supplementary
Subgraph Federated Learning with Missing Neighbor Generation Emory University;
University of British Columbia;
Lehigh University
supplementary
Evaluating Gradient Inversion Attacks and Defenses in Federated Learning Princeton University Code
supplementary
Personalized Federated Learning With Gaussian Processes Bar-Ilan University code
supplementary
Differentially Private Federated Bayesian Optimization with Distributed Exploration MIT;
NUS
code
supplementary
Parameterized Knowledge Transfer for Personalized Federated Learning Hong Kong Polytechnic University;
supplementary
Federated Reconstruction: Partially Local Federated Learning Google Research supplementary
Fast Federated Learning in the Presence of Arbitrary Device Unavailability Tsinghua University;
Princeton University;
MIT
code
supplementary
FL-WBC: Enhancing Robustness against Model Poisoning Attacks in Federated Learning from a Client Perspective Duke University;
Accenture Labs
code
supplementary
FjORD: Fair and Accurate Federated Learning under heterogeneous targets with Ordered Dropout KAUST;
Samsung AI Center
supplementary
Linear Convergence in Federated Learning: Tackling Client Heterogeneity and Sparse Gradients University of Pennsylvania supplementary
Federated Multi-Task Learning under a Mixture of Distributions INRIA;
Accenture Labs
code
supplementary
Federated Graph Classification over Non-IID Graphs Emory University supplementary
Federated Hyperparameter Tuning: Challenges, Baselines, and Connections to Weight-Sharing CMU;
Hewlett Packard Enterprise
code
supplementary
On Large-Cohort Training for Federated Learning Google;
CMU
code
supplementary
DeepReduce: A Sparse-tensor Communication Framework for Federated Deep Learning KAUST;
Columbia University;
University of Central Florida
code
supplementary
PartialFed: Cross-Domain Personalized Federated Learning via Partial Initialization Huawei supplementary
Federated Split Task-Agnostic Vision Transformer for COVID-19 CXR Diagnosis KAIST supplementary
Addressing Algorithmic Disparity and Performance Inconsistency in Federated Learning Tsinghua University;
Alibaba;
Weill Cornell Medicine
code
supplementary
Federated Linear Contextual Bandits The Pennsylvania State University;
Facebook;
University of Virginia
supplementary
Few-Round Learning for Federated Learning KAIST supplementary
Breaking the centralized barrier for cross-device federated learning EPFL;
Google Research
code
supplementary
Federated-EM with heterogeneity mitigation and variance reduction Ecole Polytechnique;
Google Research
supplementary
Delayed Gradient Averaging: Tolerate the Communication Latency for Federated Learning MIT;
Amazon;
Google
supplementary
FedDR – Randomized Douglas-Rachford Splitting Algorithms for Nonconvex Federated Composite Optimization University of North Carolina at Chapel Hill;
IBM Research
code
supplementary
Gradient Inversion with Generative Image Prior Pohang University of Science and Technology;
University of Wisconsin-Madison;
University of Washington
code
supplementary
NeurIPS 2020 Differentially-Private Federated Linear Bandits MIT code
Federated Principal Component Analysis University of Cambridge;
Quine Technologies
code
FedSplit: an algorithmic framework for fast federated optimization UC Berkeley
Federated Bayesian Optimization via Thompson Sampling NUS; MIT
Lower Bounds and Optimal Algorithms for Personalized Federated Learning KAUST
Robust Federated Learning: The Case of Affine Distribution Shifts UC Santa Barbara; MIT
An Efficient Framework for Clustered Federated Learning UC Berkeley; DeepMind Code
Distributionally Robust Federated Averaging Pennsylvania State University Code
Personalized Federated Learning with Moreau Envelopes The University of Sydney code
Personalized Federated Learning with Theoretical Guarantees: A Model-Agnostic Meta-Learning Approach MIT; UT Austin
Group Knowledge Transfer: Federated Learning of Large CNNs at the Edge University of Southern California code
Tackling the Objective Inconsistency Problem in Heterogeneous Federated Optimization CMU;
Princeton University
Attack of the Tails: Yes, You Really Can Backdoor Federated Learning University of Wisconsin-Madison
Federated Accelerated Stochastic Gradient Descent Stanford University code
Inverting Gradients - How easy is it to break privacy in federated learning? University of Siegen code
Ensemble Distillation for Robust Model Fusion in Federated Learning EPFL
Throughput-Optimal Topology Design for Cross-Silo Federated Learning INRIA code
NeurIPS 2017 Federated Multi-Task Learning Stanford;
USC;
CMU
code

Top Computer Vision conferences

In this section, we summarize Federated Learning papers accepted by top computer vision conferences, including CVPR, ICCV, and ECCV.

CVPR

Conferences Title Affiliation Slide & Code
CVPR 2021 Multi-Institutional Collaborations for Improving Deep Learning-Based Magnetic Resonance Image Reconstruction Using Federated Learning Johns Hopkins University code
Model-Contrastive Federated Learning National University of Singapore;
UC Berkeley
code
FedDG: Federated Domain Generalization on Medical Image Segmentation via Episodic Learning in Continuous Frequency Space The Chinese University of Hong Kong code
Soteria: Provable Defense Against Privacy Leakage in Federated Learning From Representation Perspective Duke University code

ECCV

Conferences Title Affiliation Slide & Code
ECCV 2020 Federated Visual Classification with Real-World Data Distribution MIT;
Google
Video

ICCV

Conferences Title Affiliation Slide & Code
ICCV 2021 Federated Learning for Non-IID Data via Unified Feature Learning and Optimization Objective Alignment Peking University
Ensemble Attention Distillation for Privacy-Preserving Federated Learning University at Buffalo

Top Artificial Intelligence and Data Mining conferences

In this section, we summarize Federated Learning papers accepted by top artificial intelligence and data mining conferences, including AAAI, AISTATS, and KDD.

AAAI

Conferences Title Affiliation Slide & Code
AAAI 2021 Secure Bilevel Asynchronous Vertical Federated Learning with Backward Updating Xidian University;
JD Tech
video
FedRec++: Lossless Federated Recommendation with Explicit Feedback Shenzhen University video
Federated Multi-Armed Bandits University of Virginia code
video
On the Convergence of Communication-Efficient Local SGD for Federated Learning Temple University;
University of Pittsburgh
video
FLAME: Differentially Private Federated Learning in the Shuffle Model Renmin University of China;
Kyoto University
video
code
Toward Understanding the Influence of Individual Clients in Federated Learning Shanghai Jiao Tong University;
The University of Texas at Dallas
video
Provably Secure Federated Learning against Malicious Clients Duke University video
slides
Personalized Cross-Silo Federated Learning on Non-IID Data Simon Fraser University;
McMaster University
video
Model-Sharing Games: Analyzing Federated Learning under Voluntary Participation Cornell University code
video
Curse or Redemption? How Data Heterogeneity Affects the Robustness of Federated Learning University of Nevada;
IBM Research
video
Game of Gradients: Mitigating Irrelevant Clients in Federated Learning IIT Bombay;
IBM Research
video
Supplementary
Federated Block Coordinate Descent Scheme for Learning Global and Personalized Models The Chinese University of Hong Kong;
Arizona State University
video
code
Addressing Class Imbalance in Federated Learning Northwestern University video
code
Defending against Backdoors in Federated Learning with Robust Learning Rate The University of Texas at Dallas video
code
AAAI 2020 Practical Federated Gradient Boosting Decision Trees National University of Singapore;
The University of Western Australia
code
Federated Learning for Vision-and-Language Grounding Problems Peking University;
Tencent
Federated Latent Dirichlet Allocation: A Local Differential Privacy Based Framework Beihang University
Federated Patient Hashing Cornell University
Robust Federated Learning via Collaborative Machine Teaching Symantec Research Labs;
KAUST

AISTATS

Conferences Title Affiliation Slide & Code
AISTATS 2021 Free-rider Attacks on Model Aggregation in Federated Learning Accenture Labs video
Supplementary
Federated f-differential privacy University of Pennsylvania code
video
Supplementary
Federated learning with compression: Unified analysis and sharp guarantees The Pennsylvania State University;
The University of Texas at Austin
code
video
Supplementary
Shuffled Model of Differential Privacy in Federated Learning UCLA;
Google
video
Supplementary
Convergence and Accuracy Trade-Offs in Federated Learning and Meta-Learning Google video
Supplementary
Federated Multi-armed Bandits with Personalization University of Virginia;
The Pennsylvania State University
code
video
Supplementary
Towards Flexible Device Participation in Federated Learning CMU;
Sun Yat-Sen University
video
Supplementary
AISTATS 2020 FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization UC Santa Barbara;
UT Austin
video
Supplementary
How To Backdoor Federated Learning Cornell Tech video
code
Supplementary
Federated Heavy Hitters Discovery with Differential Privacy RPI;
Google
video
Supplementary

KDD

Conferences Sessions Title Affiliation Slide & Code
KDD 2021 Research Track Fed2: Feature-Aligned Federated Learning George Mason University;
Microsoft;
University of Maryland
FedRS: Federated Learning with Restricted Softmax for Label Distribution Non-IID Data Nanjing University
Federated Adversarial Debiasing for Fair and Transferable Representations Michigan State University HomePage
Cross-Node Federated Graph Neural Network for Spatio-Temporal Data Modeling University of Southern California code
Application Track AsySQN: Faster Vertical Federated Learning Algorithms with Better Computation Resource Utilization
FLOP: Federated Learning on Medical Datasets using Partial Networks Duke University code
KDD 2020 Research Track FedFast: Going Beyond Average for Faster Training of Federated Recommender Systems University College Dublin video
Application Track Federated Doubly Stochastic Kernel Learning for Vertically Partitioned Data JD Tech video

Books

Papers

1. Model Aggregation

Model Aggregation (or Model Fusion) refers to how local models are combined into a shared global model.
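
As a concrete illustration, the simplest rule in the table below (FedAvg) forms the global model as a weighted average of client models, with weights proportional to local dataset sizes. The following is a minimal NumPy sketch; the function name and data layout are illustrative rather than taken from any listed codebase.

```python
import numpy as np

def fedavg_aggregate(client_weights, client_sizes):
    """FedAvg-style aggregation: average client parameters, weighted by local data size.

    client_weights: one list of layer arrays per client.
    client_sizes:   number of local training examples per client.
    """
    total = float(sum(client_sizes))
    aggregated = []
    for layer in range(len(client_weights[0])):
        layer_avg = sum((n / total) * w[layer]
                        for w, n in zip(client_weights, client_sizes))
        aggregated.append(layer_avg)
    return aggregated

# Toy example: three clients, one 2x2 layer each; larger clients dominate the average.
clients = [[np.full((2, 2), v)] for v in (1.0, 2.0, 3.0)]
print(fedavg_aggregate(clients, client_sizes=[10, 30, 60])[0])  # -> all entries 2.5
```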

Title Abbreviation Conferences Slide & Code
Communication-Efficient Learning of Deep Networks from Decentralized Data FedAvg AISTATS 2017
Bayesian Nonparametric Federated Learning of Neural Networks PFNM ICML 2019 code
Machine Learning with Adversaries: Byzantine Tolerant Gradient Descent Krum NeurIPS 2017
Byzantine-Robust Distributed Learning: Towards Optimal Statistical Rates median;
trimmed mean
ICML 2018
Distributed Training with Heterogeneous Data: Bridging Median- and Mean-Based Algorithms median;
mean
NeurIPS 2020
The Hidden Vulnerability of Distributed Learning in Byzantium Bulyan ICML 2018
Zeno: Distributed Stochastic Gradient Descent with Suspicion-based Fault-tolerance Zeno ICML 2019 code
Statistical Model Aggregation via Parameter Matching SPAHM NeurIPS 2019 code
Fed+: A Unified Approach to Robust Personalized Federated Learning Fed+
Federated Optimization in Heterogeneous Networks FedProx MLSys 2020 code
Separation of Powers in Federated Learning Truda

 

2. Personalization

Personalized federated learning refers to training a model for each client, based on the client’s own dataset and the datasets of other clients. There are two major motivations for personalized federated learning (a minimal fine-tuning sketch follows the list):

  • Due to statistical heterogeneity across clients, a single global model would not be a good choice for all clients. Sometimes, the local models trained solely on their private data perform better than the global shared model.
  • Different clients need models customized to their own environments. As an example of model heterogeneity, consider the sentence “I live in .....”: a next-word prediction model applied to this sentence needs to predict a different answer for each user. More generally, different clients may assign different labels to the same data.
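
As a baseline for both motivations above, the simplest form of personalization is to fine-tune a copy of the shared global model on each client’s private data; the transfer-learning and personalization-layer methods in the table below refine this idea. A minimal PyTorch-style sketch, with illustrative names and hyperparameters:

```python
import copy
import torch
from torch import nn

def personalize(global_model: nn.Module, local_loader, epochs: int = 1, lr: float = 0.01):
    """Fine-tune a copy of the shared global model on one client's private data."""
    model = copy.deepcopy(global_model)    # the shared global model stays untouched
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in local_loader:          # private data never leaves the client
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            optimizer.step()
    return model                           # personalized model, kept locally
```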

Personalized federated learning survey paper:

Methodology Title Conferences Slide & Code
Multi-Task Learning Federated Multi-Task Learning NeurIPS 2017
(Stanford; USC; CMU)
code
Decentralized Collaborative Learning of Personalized Models over Networks AISTATS 2017
(INRIA)
Variational Federated Multi-Task Learning ETH Zurich
Fully Decentralized Joint Learning of Personalized Models and Collaboration Graphs AISTATS 2020
(INRIA)
video
Personalized Cross-Silo Federated Learning on Non-IID Data AAAI 2021
(Simon Fraser University; McMaster University; Huawei Technologies Canada)
video
Ditto: Fair and Robust Federated Learning Through Personalization ICML 2021
(CMU; Facebook AI)
code
video
Federated Multi-Task Learning under a Mixture of Distributions NeurIPS 2021
(Inria; Accenture Labs)
code
Meta Learning Personalized Federated Learning: A Meta-Learning Approach MIT
Improving Federated Learning Personalization via Model Agnostic Meta Learning University of Washington;
Google
Adaptive Gradient-Based Meta-Learning Methods CMU
Federated Meta-Learning with Fast Convergence and Efficient Communication Huawei Noah’s Ark Lab
Mixture of Global and Local Models Federated Learning of a Mixture of Global and Local Models KAUST
Federated User Representation Learning University of Michigan
Facebook
Adaptive Personalized Federated Learning The Pennsylvania State University
Personalization Layers Federated Learning with Personalization Layers Adobe Research
Indian Institute of Technology
Think Locally, Act Globally: Federated Learning with Local and Global Representations CMU
University of Tokyo
Columbia University
Transfer Learning Federated evaluation of on-device personalization Google
Salvaging Federated Learning by Local Adaptation Cornell University
Private Federated Learning with Domain Adaptation Oracle Labs
Clustering Clustered Federated Learning: Model-Agnostic Distributed Multi-Task Optimization under Privacy Constraints Fraunhofer Heinrich Hertz Institute Code
An Efficient Framework for Clustered Federated Learning UC Berkeley
DeepMind
Code
Robust Federated Learning in a Heterogeneous Environment UC Berkeley

 

3. Recommender system

Recommender systems (RecSys) are widely used to address information overload. In general, the more data a RecSys uses, the better recommendation performance it can achieve.

Traditionally, RecSys requires the data that are distributed across multiple devices to be uploaded to a central database for model training. However, due to privacy and security concerns, such strategies of directly sharing user data are no longer appropriate.

Combining federated learning with RecSys is a promising approach that can alleviate the risk of privacy leakage.
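
To make the privacy argument concrete, the sketch below follows the federated matrix factorization pattern (the first methodology in the table): each client keeps its raw ratings and user embedding on the device and sends only item-embedding updates to the server. All names, shapes, and the learning rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
num_items, dim, lr = 5, 4, 0.1

# Server state: shared item factors. Each client privately holds its own user factor.
item_factors = rng.normal(scale=0.1, size=(num_items, dim))

def client_update(user_factor, local_ratings, item_factors):
    """One local step: update the private user vector, return only item-factor updates."""
    item_updates = np.zeros_like(item_factors)
    for item, rating in local_ratings:
        err = rating - user_factor @ item_factors[item]
        user_factor += lr * err * item_factors[item]    # stays on the device
        item_updates[item] += lr * err * user_factor    # the only message sent to the server
    return user_factor, item_updates

# One toy round with two clients; the server never sees ratings or user factors.
clients = [
    (rng.normal(scale=0.1, size=dim), [(0, 5.0), (2, 1.0)]),
    (rng.normal(scale=0.1, size=dim), [(1, 4.0), (2, 2.0)]),
]
for user_factor, local_ratings in clients:
    _, updates = client_update(user_factor, local_ratings, item_factors)
    item_factors += updates / len(clients)              # server aggregates the updates
```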

Methodology Title Conferences Slide & Code
Matrix Factorization Secure federated matrix factorization IEEE Intelligent Systems
Federated Multi-view Matrix Factorization for Personalized Recommendations ECML-PKDD 2020 video
Decentralized Recommendation Based on Matrix Factorization: A Comparison of Gossip and Federated Learning ECML-PKDD 2019
Towards Privacy-preserving Mobile Applications with Federated Learning: The Case of Matrix Factorization MobiSys 2019
Meta Matrix Factorization for Federated Rating Predictions ACM SIGIR 2020 code
Federated Collaborative Filtering for Privacy-Preserving Personalized Recommendation System Arxiv
GNN FedGNN: Federated Graph Neural Network for Privacy-Preserving Recommendation Arxiv

 

4. Security

4.1. Attack

Methodology Title Conferences Slide & Code
Backdoor Attack How To Backdoor Federated Learning AISTATS 2020 code
Can You Really Backdoor Federated Learning? Arxiv
Attack of the Tails: Yes, You Really Can Backdoor Federated Learning NeurIPS 2020 code
DBA: Distributed Backdoor Attacks against Federated Learning ICLR 2020 code

 

4.2. Defense

Methodology Title Conferences Slide & Code
FL+DP Federated Learning With Differential Privacy: Algorithms and Performance Analysis IEEE Transactions on Information Forensics and Security
Differentially Private Federated Learning: A Client Level Perspective Arxiv code
Learning Differentially Private Recurrent Language Models ICLR 2018
FL+HE Private federated learning on vertically partitioned data via entity resolution and additively homomorphic encryption Arxiv
BatchCrypt: Efficient Homomorphic Encryption for Cross-Silo Federated Learning USENIX 2020 code
FL+TEE PPFL: Privacy-preserving Federated Learning with Trusted Execution Environments ACM MobiSys 2021
DarkneTZ: Towards Model Privacy at the Edge Using Trusted Execution Environments ACM MobiSys 2020 code
video
A Little Is Enough: Circumventing Defenses For Distributed Learning NeurIPS 2019

 

5. Survey

Category Title
General Federated machine learning: Concept and applications
A Survey on Federated Learning Systems: Vision, Hype and Reality for Data Privacy and Protection
Federated Learning in Mobile Edge Networks: A Comprehensive Survey
Advances and Open Problems in Federated Learning
Federated Learning: Challenges, Methods, and Future Directions
Security A survey on security and privacy of federated learning
Threats to Federated Learning: A Survey
Vulnerabilities in Federated Learning
Personalization Survey of Personalization Techniques for Federated Learning
Threats to Federated Learning: A Survey
Aggregation Emerging Trends in Federated Learning: From Model Fusion to Federated X Learning
Incentive A Comprehensive Survey of Incentive Mechanism for Federated Learning
A Survey of Incentive Mechanism Design for Federated Learning
Applications A Survey on Federated Learning and its Applications for Accelerating Industrial Internet of Things

 

6. System Design

Project Paper Affiliation
Tensorflow-Federated Towards Federated Learning at Scale: System Design Google
PySyft A generic framework for privacy preserving deep learning OpenMined
FedML FedML: A Research Library and Benchmark for Federated Machine Learning fedml.ai
OpenFL OpenFL: An open-source framework for Federated Learning Intel
Clara NVIDIA
IBM Federated Learning IBM Federated Learning: an Enterprise Framework White Paper IBM
FATE FATE: An Industrial Grade Platform for Collaborative Learning With Data Protection WeBank
Fedlearner Bytedance
Flower Flower: A Friendly Federated Learning Research Framework flower.dev
PaddleFL Baidu
LEAF LEAF: A Benchmark for Federated Settings CMU
PyVertical PyVertical: A Vertical Federated Learning Framework for Multi-headed SplitNN OpenMined
Sherpa.ai Federated Learning Sherpa.ai

 

7. Communication-Efficient

 

8. Optimization

 

9. Fairness

 

10. Applications

Applications Title Company Slide & Code
Computer Vision FedVision: An Online Visual Object Detection Platform Powered by Federated Learning WeBank (AAAI 2020) code
Natural Language Processing Federated learning for emoji prediction in a mobile keyboard Google
Federated Learning for Mobile Keyboard Prediction Google
Applied federated learning: Improving google keyboard query suggestions Google
Federated Learning Of Out-Of-Vocabulary Words Google
Automatic Speech Recognition A Federated Approach in Training Acoustic Models Microsoft (INTERSPEECH 2020) Video
Privacy-Preserving Adversarial Representation Learning in ASR: Reality or Illusion? INRIA (INTERSPEECH 2019)
Training Speech Recognition Models with Federated Learning: A Quality/Cost Framework Google (ICASSP 2021) Google Assistant Help
Federated Evaluation and Tuning for On-Device Personalization: System Design & Applications Apple Report
Healthcare Privacy-preserving Federated Brain Tumour Segmentation NVIDIA (MICCAI MLMI 2019)
Advancing health research with Google Health Studies Google Blog
Multi-institutional Deep Learning Modeling Without Sharing Patient Data: A Feasibility Study on Brain Tumor Segmentation Intel Blog
Blockchain FedCoin: A Peer-to-Peer Payment System for Federated Learning Arxiv
Blockchained On-Device Federated Learning IEEE Communications Letters 2019

 

Talks and Tutorials

 

Conferences and Workshops

 

Blogs

 

Open-Sources

⬆ Return to top
