
Graph Neural Networks with Learnable and Optimal Polynomial Bases

[Under construction].

[Paper]. This work was accepted at ICML'23!

This repository includes the implementations of FavardGNN and OptBasisGNN, two spectral graph neural networks that adapt their polynomial bases for filtering.
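Both models filter node signals with a polynomial of a propagation matrix whose basis is learned rather than fixed. The sketch below only illustrates this idea and is not the repository's implementation (all names are ours): a three-term recurrence with learnable coefficients defines the basis, in the spirit of FavardGNN.

# Minimal sketch (NOT the repository's implementation) of filtering with a
# learnable polynomial basis defined by a three-term recurrence.
import torch
import torch.nn as nn

class LearnableBasisFilter(nn.Module):
    def __init__(self, K):
        super().__init__()
        self.K = K
        self.alpha = nn.Parameter(torch.ones(K + 1))   # filter coefficients
        self.gamma = nn.Parameter(torch.zeros(K + 1))  # recurrence terms
        self.beta = nn.Parameter(torch.ones(K + 1))    # recurrence terms

    def forward(self, P, x):
        # P: propagation matrix (e.g. a normalized adjacency); x: node signals
        z_prev, z = torch.zeros_like(x), x
        out = self.alpha[0] * z
        for k in range(1, self.K + 1):
            # the three-term recurrence defines the (learned) polynomial basis
            z_next = P @ z - self.gamma[k] * z - self.beta[k] * z_prev
            z_prev, z = z, z_next
            out = out + self.alpha[k] * z
        return out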

If you have any questions about our methodology or this repository, please contact me or open an issue.

Table of contents

  1. Requirements
  2. Reproducing Classification Results
    1. On Geom-GCN datasets and Citation datasets
    2. On LINKX datasets
  3. Reproducing Regression Results
    1. Preparations
    2. Run experiments
  4. Hyperparameter Tuning Scripts

Requirements

[Under Construction]

Reproducing Classification Results

Folder structure

Before running the experiments, the folder structure should be as below (a small helper sketch to create it follows the tree):

.
├── cache
│   └── ckpts 
├── data
│   └── linkx # code from the LINKX repo
├── datasets
│   ├── geom_data
│   ├── linkx
│   └── Planetoid
├── layers
├── models
├── runs 
│   └── placeholder.txt
└── utils
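If the data and cache folders are missing on a fresh clone, a one-off helper such as the following (hypothetical, not shipped with the repo) can create them:

# Hypothetical helper (not part of the repo) to create the data/cache folders.
import os

for d in ["cache/ckpts", "datasets/geom_data", "datasets/linkx",
          "datasets/Planetoid", "runs"]:
    os.makedirs(d, exist_ok=True)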

Reproducing Results on Geom-GCN Datasets (Table 1)

Run the following scripts from the repository root (./):

sh> sh scripts/reproduce_favardgnn.sh
sh> sh scripts/reproduce_optbasis.sh

[Table 1 of the paper]

Reproducing Results on LINKX Datasets (Table 2)

Run the following script from the repository root (./):

sh> sh scripts/reproduce_linkx.sh

[Table 2 of the paper]

Reproducing Regression Results

Change the working directory to Regression/:

sh> cd Regression

Preparations

Step 1: Prepare images

sh> unzip -d BernNetImages BernNet-LearningFilters-image.zip

Step 2: Pre-compute $U h(\Lambda) U^T$

Pre-compute the matrix polynomials $M = U h(\Lambda) U^T = h(L)$, where $L$ is the Laplacian matrix of the 100x100 grid graph, $U \Lambda U^T$ is its eigendecomposition, and $h$ corresponds to one of

  • High-pass filter;
  • Low-pass filter;
  • Band-pass filter;
  • Band-reject filter.
sh> python preprocess_matrix_polynomials.py
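Conceptually, the script does something like the following (an illustrative sketch, not the actual code; the filter function here is only an example):

# Illustrative sketch of the pre-computation; see
# preprocess_matrix_polynomials.py for the actual script.
import numpy as np
import networkx as nx

g = nx.grid_2d_graph(100, 100)                 # the 100x100 grid graph
L = nx.laplacian_matrix(g).toarray().astype(np.float64)
eigvals, U = np.linalg.eigh(L)                 # L = U diag(eigvals) U^T

def h(lam):                                    # example: a low-pass filter
    return np.exp(-10.0 * (lam / lam.max()) ** 2)

M = U @ np.diag(h(eigvals)) @ U.T              # M = U h(Lambda) U^T = h(L)

The dense eigendecomposition of the 10000 x 10000 Laplacian is what makes this step slow.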

The results of this step are saved in the save/ folder. Since this step can take several hours, you can instead download our pre-computed matrices from this Google Drive URL and unzip them directly:

sh> mkdir save
sh> # Download cachedMatrices.zip and put it under ./Regression/save/
sh> unzip save/cachedMatrices.zip -d ./save/
sh> rm ./save/cachedMatrices.zip

The resulting files are:

save/
├── bandpass_Np=100.pkl
├── bandreject_Np=100.pkl
├── highpass_Np=100.pkl
└── lowpass_Np=100.pkl

Step 3: Make the dataset

sh> python make_dataset.py

The result of this step is a pickle file MultiChannelFilterDataset.pkl.
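In essence, the dataset pairs each image (as signals on the grid) with its filtered versions as regression targets. The sketch below illustrates the idea; the pickles' exact storage format is an assumption here, and make_dataset.py is the real construction.

# Illustrative sketch of the dataset construction (see make_dataset.py).
import pickle
import numpy as np

# Load the cached h(L) matrices (assuming they unpickle as dense arrays).
filters = {}
for name in ["lowpass", "highpass", "bandpass", "bandreject"]:
    with open(f"save/{name}_Np=100.pkl", "rb") as f:
        filters[name] = pickle.load(f)

x = np.random.rand(100 * 100)          # stand-in for one image channel
targets = {name: M @ x for name, M in filters.items()}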

Run experiments

Now we can run the regression task. At this point, the folder structure (ignoring Python files) is:

./Regression/
├── BernNet-LearningFilters-image.zip
├── MultiChannelFilterDataset.pkl
└── save
    ├── bandpass_Np=100.pkl
    ├── bandreject_Np=100.pkl
    ├── highpass_Np=100.pkl
    └── lowpass_Np=100.pkl

To reproduce Table 5, use the script below to run over all the samples:

sh> python main_all.py

[Table 5 of the paper]

To reproduce the convergence curves as in Figure 2 of the paper, use the following script to run one or several samples and record the losses:

sh> python main_sample.py

Hyperparameter Tuning Scripts Using Optuna

[Under Construction]

If you want to test FavardGNN or OptBasisGNN on other datasets, you may need our Optuna scripts for hyperparameter tuning. Contact me at guoyuhe[at]ruc[dot]edu[dot]cn. A minimal example of such a tuning loop is sketched below.
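In the meantime, a minimal Optuna loop looks roughly like this (an assumed workflow, not our actual script; train_and_eval is a hypothetical function standing in for a training run that returns validation accuracy):

import optuna

def objective(trial):
    # Search spaces here are illustrative, not the ones we used.
    lr = trial.suggest_float("lr", 1e-4, 1e-1, log=True)
    wd = trial.suggest_float("weight_decay", 1e-8, 1e-2, log=True)
    dropout = trial.suggest_float("dropout", 0.0, 0.9)
    # train_and_eval is hypothetical: train the model and return val accuracy.
    return train_and_eval(lr=lr, weight_decay=wd, dropout=dropout)

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=100)
print(study.best_params)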

Related Repos

[Under Construction]

Cite our work

Please cite us if our work or this repository inspires you.

@inproceedings{DBLP:conf/icml/GuoW23,
  author       = {Yuhe Guo and
                  Zhewei Wei},
  editor       = {Andreas Krause and
                  Emma Brunskill and
                  Kyunghyun Cho and
                  Barbara Engelhardt and
                  Sivan Sabato and
                  Jonathan Scarlett},
  title        = {Graph Neural Networks with Learnable and Optimal Polynomial Bases},
  booktitle    = {International Conference on Machine Learning, {ICML} 2023, 23-29 July
                  2023, Honolulu, Hawaii, {USA}},
  series       = {Proceedings of Machine Learning Research},
  volume       = {202},
  pages        = {12077--12097},
  publisher    = {{PMLR}},
  year         = {2023},
  url          = {https://proceedings.mlr.press/v202/guo23i.html},
  timestamp    = {Wed, 16 Aug 2023 17:14:15 +0200},
  biburl       = {https://dblp.org/rec/conf/icml/GuoW23.bib},
  bibsource    = {dblp computer science bibliography, https://dblp.org}
}
