
Could you support knowledge distillation-based FL algorithms like FedED, Fedmd, or FedDF #26

Open
Barry-Menglong-Yao opened this issue Jan 20, 2022 · 0 comments

Comments


Barry-Menglong-Yao commented Jan 20, 2022

Currently, your platform supports several parameter-averaging-based FL algorithms. Could you also support knowledge-distillation-based FL algorithms such as FedED, FedMD, or FedDF? (A rough sketch of what a FedDF-style aggregation step could look like is included after the references below.)

FedED: Federated Learning via Ensemble Distillation for Medical Relation Extraction. EMNLP, 2020.
FedMD: Heterogenous Federated Learning via Model Distillation. NeurIPS Workshop, 2019.
FedDF: Ensemble Distillation for Robust Model Fusion in Federated Learning. NeurIPS, 2020.
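
For reference, here is a minimal sketch of what a FedDF-style server-side aggregation round could look like in PyTorch. This is not your platform's API; the function name `feddf_aggregate`, the `distill_loader` (an unlabeled transfer set), and all hyperparameters are hypothetical, and it assumes the client and server models share the same output dimensionality so their logits can be ensembled:

```python
import copy
import torch
import torch.nn.functional as F

def feddf_aggregate(client_models, server_model, distill_loader,
                    temperature=1.0, lr=1e-3, steps=100, device="cpu"):
    """FedDF-style aggregation (sketch): start from the FedAvg parameter
    average, then distill the averaged client logits into the server model
    on an unlabeled transfer set."""
    # 1. Initialize the server model with the usual parameter average.
    avg_state = copy.deepcopy(client_models[0].state_dict())
    for key in avg_state:
        avg_state[key] = torch.stack(
            [m.state_dict()[key].float() for m in client_models]
        ).mean(dim=0)
    server_model.load_state_dict(avg_state)
    server_model.to(device).train()

    optimizer = torch.optim.Adam(server_model.parameters(), lr=lr)
    data_iter = iter(distill_loader)

    for _ in range(steps):
        try:
            x, _ = next(data_iter)          # labels on the transfer set are ignored
        except StopIteration:
            data_iter = iter(distill_loader)
            x, _ = next(data_iter)
        x = x.to(device)

        # 2. Ensemble teacher: average the client models' logits.
        with torch.no_grad():
            teacher_logits = torch.stack(
                [m.to(device).eval()(x) for m in client_models]
            ).mean(dim=0)

        # 3. Distillation loss: KL divergence between softened distributions.
        student_logits = server_model(x)
        loss = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=1),
            F.softmax(teacher_logits / temperature, dim=1),
            reduction="batchmean",
        ) * (temperature ** 2)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    return server_model
```

This only covers the FedDF-style server-side step; FedMD and FedED instead distill on the client side against aggregated predictions on a shared/public dataset, so they would need a different hook in the training loop.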
