jessijzhao/FedCD

FedCD: Improving Performance in non-IID Federated Learning

Presented at the KDD'20 Workshop on Artificial Intelligence of Things (AIoT) on August 24, 2020. Link to paper here.

Note: Code is currently being updated.

Abstract

Federated learning has been widely applied to enable decentralized devices, each of which has its own local data, to learn a shared model. However, learning from real-world data can be challenging because it is rarely independent and identically distributed (IID) across edge devices, a key assumption for current high-performing and low-bandwidth algorithms. We present a novel approach, FedCD, which clones and deletes models to dynamically group devices with similar data. Experiments on the CIFAR-10 dataset show that FedCD achieves higher accuracy and faster convergence compared to a FedAvg baseline on non-IID data while incurring minimal computation, communication, and storage overheads.
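For context, the FedAvg baseline mentioned above aggregates client updates as a data-size-weighted average of model parameters. A minimal sketch of that aggregation step (function and variable names are illustrative, not taken from the FedCD codebase):

```python
def fedavg_aggregate(client_weights, client_sizes):
    """Weighted average of client model parameters (the FedAvg server step).

    client_weights: list of parameter vectors (lists of floats), one per client
    client_sizes:   number of local training examples held by each client
    """
    total = sum(client_sizes)
    dim = len(client_weights[0])
    global_weights = [0.0] * dim
    for weights, size in zip(client_weights, client_sizes):
        for i, w in enumerate(weights):
            # Each client's contribution is proportional to its share of the data.
            global_weights[i] += (size / total) * w
    return global_weights

# Two clients: the one with more local data pulls the average toward its weights.
print(fedavg_aggregate([[1.0, 0.0], [0.0, 1.0]], [3, 1]))  # [0.75, 0.25]
```

When client data are non-IID, this single global average can fit no group of devices well, which is the failure mode FedCD's clone-and-delete grouping targets.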
