
Geometric Knowledge Distillation: Topology Compression for Graph Neural Networks

The official implementation of "Geometric Knowledge Distillation: Topology Compression for Graph Neural Networks", accepted at NeurIPS 2022.

Abstract: We study a new paradigm of knowledge transfer that aims at encoding graph topological information into graph neural networks (GNNs) by distilling knowledge from a teacher GNN model trained on a complete graph to a student GNN model operating on a smaller or sparser graph. To this end, we revisit the connection between thermodynamics and the behavior of GNNs, based on which we propose the Neural Heat Kernel (NHK) to encapsulate the geometric properties of the underlying manifold with respect to the architecture of GNNs. A fundamental and principled solution is derived by aligning NHKs on teacher and student models, dubbed Geometric Knowledge Distillation. We develop non-parametric and parametric instantiations and demonstrate their efficacy in various experimental settings for knowledge distillation regarding different types of privileged topological information and teacher-student schemes.

Related materials: paper
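To make the alignment idea above concrete, here is a minimal kernel-alignment sketch in PyTorch. It is illustrative only, not the repository's implementation: sigmoid_kernel, gkd_loss, and the MSE discrepancy are assumptions chosen to mirror the --kernel sigmoid option below; see the paper for the exact NHK formulation.

```python
# Illustrative sketch: align pairwise kernel matrices computed from
# teacher and student node embeddings. Function names and the MSE
# discrepancy are assumptions, not the repository's exact code.
import torch
import torch.nn.functional as F

def sigmoid_kernel(h: torch.Tensor) -> torch.Tensor:
    """Pairwise sigmoid kernel over node embeddings h of shape [N, d]."""
    return torch.sigmoid(h @ h.t())

def gkd_loss(h_student: torch.Tensor, h_teacher: torch.Tensor) -> torch.Tensor:
    """Discrepancy between student and teacher kernel matrices.

    The teacher is trained on the complete graph and kept frozen, so its
    kernel is detached; gradients only update the student.
    """
    k_student = sigmoid_kernel(h_student)
    k_teacher = sigmoid_kernel(h_teacher).detach()
    return F.mse_loss(k_student, k_teacher)
```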

Use the Code

  • Install the required packages listed in requirements.txt.
  • Specify your own data path in parse.py and download the datasets.
  • Pretrain teacher models, which will be saved in the folder /saved_models, e.g.,
python main.py --dataset cora --rand_split --use_bn --base_model gcn --mode pretrain --dist_mode no --save_model
  • Train student models (a sketch of the combined training step follows this list), e.g.,
python main.py --dataset cora --rand_split --use_bn --base_model gcn --mode train --priv_type edge --dist_mode gkd --kernel sigmoid
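For orientation, the sketch below shows a hypothetical training step combining the supervised task loss with the distillation term from the earlier sketch. The names student, teacher, data, full_edge_index, and lambda_dist are placeholders, not the actual API of main.py.

```python
# Hypothetical training step (placeholder names; not the API of main.py).
# Reuses gkd_loss from the sketch above.
import torch
import torch.nn.functional as F

def train_step(student, teacher, data, optimizer, lambda_dist=1.0):
    student.train()
    optimizer.zero_grad()
    # The student sees the smaller/sparser graph; assume the model returns
    # logits plus the intermediate embeddings used for distillation.
    logits, h_student = student(data.x, data.edge_index)
    with torch.no_grad():
        # The frozen teacher sees the complete graph (privileged topology).
        _, h_teacher = teacher(data.x, data.full_edge_index)
    loss = F.cross_entropy(logits[data.train_mask], data.y[data.train_mask])
    loss = loss + lambda_dist * gkd_loss(h_student, h_teacher)
    loss.backward()
    optimizer.step()
    return loss.item()
```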

Acknowledgements

The pipeline for training and preprocessing is developed on the basis of the Non-Homophilous Benchmark project.
