This repo provides the implementation of the Efficient Graph Convolution (EGC) layer using PyTorch Geometric. We include hyperparameters and pretrained models, including for the baselines we trained ourselves.
- `hyperparameters.md` contains hyperparameters and experiment details.
- `experiments/layers.py` contains the layer definitions for EGC, and is likely what you are most interested in. There is also an optimized implementation that was upstreamed to PyTorch Geometric in `experiments/optimized_layers.py`.
- The `experiments` directory contains subdirectories for each experiment dataset.
- The code is structured using my experiment tuning library (exptune), which enables fast hyperparameter search, training of final models, plotting, and so on. This is bundled under the `third_party` directory as a git submodule. Beyond these dependencies, we use PyTorch Geometric and OGB.
- To retrain the main table and ablation studies, use `train_main_table.sh` and `train_ablation.sh`.
- To run the pre-trained models, use `run_pretrained.sh`.
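At a high level, the EGC layer aggregates neighbour features with symmetric normalisation and combines a small set of shared basis weight matrices using per-node combination weights. The sketch below is a simplified NumPy illustration of that idea (single aggregator, no heads); `egc_layer`, `bases`, and `phi` are illustrative names, not this repo's API — see `experiments/layers.py` for the real implementation:

```python
import numpy as np

def egc_layer(x, edge_index, bases, phi):
    """Simplified EGC-style layer (single symnorm aggregator, no heads).

    x:          (N, F_in) node features
    edge_index: (2, E) integer array of (src, dst) edges
    bases:      (B, F_in, F_out) shared basis weight matrices
    phi:        (F_in, B) projection producing per-node combination weights
    """
    n = x.shape[0]
    src, dst = edge_index
    # Symmetric normalisation 1 / sqrt(deg_i * deg_j) per edge
    deg = np.bincount(dst, minlength=n).astype(float)
    deg[deg == 0] = 1.0
    norm = 1.0 / np.sqrt(deg[src] * deg[dst])
    # Transform features with each basis: (B, N, F_out)
    transformed = np.einsum('nf,bfo->bno', x, bases)
    # Scatter-add normalised messages to destination nodes, per basis
    agg = np.zeros_like(transformed)
    for b in range(transformed.shape[0]):
        np.add.at(agg[b], dst, norm[:, None] * transformed[b][src])
    # Per-node combination weights over the B bases: (N, B)
    w = x @ phi
    # Weighted combination of the per-basis aggregations
    return np.einsum('nb,bno->no', w, agg)
```

With a single identity basis and unit combination weights this reduces to a plain symmetric-normalised aggregation, which makes the sketch easy to sanity-check on a toy graph.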
Clone the repo with its submodules (`--recurse-submodules`) and build the Docker image:
```sh
# Clone
git clone https://github.com/shyam196/egc.git --recurse-submodules
# Build
docker build . -t camlsys/egc
# Run
docker run -it --gpus all camlsys/egc bash
```

The script `run_pretrained.sh` provides all the commands you need to download the models and perform a run over the test set.
We provide two scripts to generate the results for the main table and for the ablation studies. This is discussed in more depth in `hyperparameters.md`.
```bibtex
@inproceedings{
tailor2022egc,
title={Do We Need Anisotropic Graph Neural Networks?},
author={Shyam A. Tailor and Felix Opolka and Pietro Lio and Nicholas Donald Lane},
booktitle={International Conference on Learning Representations},
year={2022},
url={https://openreview.net/forum?id=hl9ePdHO4_s}
}
```
