FedTAD: Topology-aware Data-free Knowledge Distillation for Subgraph Federated Learning

Official code repository of the paper "FedTAD: Topology-aware Data-free Knowledge Distillation for Subgraph Federated Learning" in the proceedings of International Joint Conference on Artificial Intelligence (IJCAI) 2024.

arXiv: https://arxiv.org/abs/2404.14061

Requirements

Hardware environment: Intel(R) Xeon(R) Gold 6230R CPU @ 2.10GHz, NVIDIA GeForce RTX 3090 with 24GB memory.

Software environment: Ubuntu 18.04.6, Python 3.9, PyTorch 1.11.0 and CUDA 11.8.

Please refer to the official PyTorch and PyG installation guides to set up the environment.
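Before training, it can help to confirm that the installed package versions match the ones listed above. A minimal sketch, using only the standard library (the required version strings here are taken from this README; extend the dictionary with `torch-geometric` or other packages as needed):

```python
from importlib import metadata

# Package -> version listed in the README (illustrative subset).
REQUIRED = {"torch": "1.11.0"}

def check_env(required):
    """Return {package: (installed_version_or_None, matches_required)}."""
    report = {}
    for pkg, want in required.items():
        try:
            have = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            have = None  # package is not installed at all
        report[pkg] = (have, have == want)
    return report

for pkg, (have, ok) in check_env(REQUIRED).items():
    print(f"{pkg}: installed={have}, matches_required={ok}")
```

Note that a newer PyTorch/CUDA combination may also work, but the versions above are the ones the authors report testing.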

Training

Here we take the Cora dataset, partitioned by the Louvain algorithm across 10 clients, as an example:

python train_fedtad.py --dataset Cora --num_clients 10 --partition Louvain
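A minimal sketch of how the command-line flags in the example might be parsed. The flag names come from the command above; the defaults and the hard-coded argument list are illustrative assumptions, not the repository's actual definitions:

```python
import argparse

# Illustrative parser for the three flags used in the example command.
parser = argparse.ArgumentParser(description="FedTAD training (illustrative)")
parser.add_argument("--dataset", type=str, default="Cora",
                    help="name of the graph dataset, e.g. Cora")
parser.add_argument("--num_clients", type=int, default=10,
                    help="number of federated clients")
parser.add_argument("--partition", type=str, default="Louvain",
                    help="subgraph partitioning method, e.g. Louvain")

# Simulate the example invocation instead of reading sys.argv.
args = parser.parse_args(["--dataset", "Cora",
                          "--num_clients", "10",
                          "--partition", "Louvain"])
print(args.dataset, args.num_clients, args.partition)  # Cora 10 Louvain
```

Other dataset names or client counts can be substituted in the same way; see `train_fedtad.py` for the full set of supported options.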

Cite Us

Please cite our paper if you use this code in your research:

@misc{zhu2024fedtad,
      title={FedTAD: Topology-aware Data-free Knowledge Distillation for Subgraph Federated Learning}, 
      author={Yinlin Zhu and Xunkai Li and Zhengyu Wu and Di Wu and Miao Hu and Rong-Hua Li},
      year={2024},
      eprint={2404.14061},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}
