# ConDistFL: Conditional Distillation for Federated Learning from Partially Annotated Data

You are looking at an old version. Please go [here](https://github.com/NVIDIA/NVFlare/tree/main/research/condist-fl) for the most current version of ConDistFL.

### ConDistFL: Conditional Distillation for Federated Learning from Partially Annotated Data ([arXiv:2308.04070](https://arxiv.org/abs/2308.04070))
Accepted to the 4th Workshop on Distributed, Collaborative, & Federated Learning ([DeCaF](https://decaf-workshop.github.io/decaf-2023/)), Vancouver, October 12th, 2023.

###### Abstract:
> Developing a generalized segmentation model capable of simultaneously delineating multiple organs and diseases is highly desirable. Federated learning (FL) is a key technology enabling the collaborative development of a model without exchanging training data. However, the limited access to fully annotated training data poses a major challenge to training generalizable models. We propose "ConDistFL", a framework to solve this problem by combining FL with knowledge distillation. Local models can extract the knowledge of unlabeled organs and tumors from partially annotated data from the global model with an adequately designed conditional probability representation. We validate our framework on four distinct partially annotated abdominal CT datasets from the MSD and KiTS19 challenges. The experimental results show that the proposed framework significantly outperforms FedAvg and FedOpt baselines. Moreover, the performance on an external test dataset demonstrates superior generalizability compared to models trained on each dataset separately. Our ablation study suggests that ConDistFL can perform well without frequent aggregation, reducing the communication cost of FL.
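To make the core idea above more concrete, the snippet below is a minimal, hypothetical PyTorch sketch of a conditional-distillation-style loss on the classes that are *not* annotated locally, assuming a frozen global (teacher) model and a local (student) model. The function name `condist_loss`, the renormalization over unlabeled classes, and the temperature parameter are illustrative assumptions and not the code released with the paper.

```python
# Minimal sketch of a conditional-distillation-style loss (assumption-based,
# NOT the authors' implementation). Unlabeled classes are supervised by the
# frozen global model; labeled classes use the usual segmentation loss.
import torch
import torch.nn.functional as F

def condist_loss(student_logits, teacher_logits, labeled_classes, temperature=1.0):
    """Distill only the classes that are not annotated in the local dataset.

    student_logits, teacher_logits: raw outputs of shape (N, C, ...).
    labeled_classes: indices of the locally annotated classes.
    """
    num_classes = student_logits.shape[1]
    unlabeled = [c for c in range(num_classes) if c not in labeled_classes]

    # Softmax over all classes, keep only the unlabeled ones, and renormalize:
    # a simple stand-in for a conditional probability over the classes the
    # local annotations cannot supervise.
    eps = 1e-8
    s = F.softmax(student_logits / temperature, dim=1)[:, unlabeled]
    t = F.softmax(teacher_logits / temperature, dim=1)[:, unlabeled]
    s = s / (s.sum(dim=1, keepdim=True) + eps)
    t = t / (t.sum(dim=1, keepdim=True) + eps)

    # KL divergence from the global-model distribution to the local-model
    # distribution on the unlabeled classes.
    return F.kl_div((s + eps).log(), t, reduction="batchmean")
```

In the framework described above, such a distillation term would typically be combined with the supervised segmentation loss on the locally annotated classes during each round of federated training.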

## License
TBD

## Citation

> Wang, Pochuan, et al. "ConDistFL: Conditional Distillation for Federated Learning from Partially Annotated Data." arXiv preprint arXiv:2308.04070 (2023).

BibTeX:
```
@article{wang2023condistfl,
  title={ConDistFL: Conditional Distillation for Federated Learning from Partially Annotated Data},
  author={Wang, Pochuan and Shen, Chen and Wang, Weichung and Oda, Masahiro and Fuh, Chiou-Shann and Mori, Kensaku and Roth, Holger R},
  journal={arXiv preprint arXiv:2308.04070},
  year={2023}
}
```
