Code for **Aggregation Buffer: Revisiting DropEdge with a New Parameter Block** (ICML 2025) by Dooho Lee, Myeong Kong, Sagad Hamid, Cheonwoo Lee, and Jaemin Yoo.
A Dockerfile is provided for containerized execution:

```bash
docker build -t gnn_buffer .
docker run --gpus all -v $(pwd):/app -it gnn_buffer bash
```

These commands build the image and launch a container with the necessary dependencies. You can run the training and evaluation scripts inside this container.
Most datasets are downloaded automatically. However, the filtered versions of the Chameleon and Squirrel datasets must be acquired manually:

1. Download `filtered.npz` for each dataset from this repository.
2. Place the downloaded files in `dataset/` at the project root.
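As a sanity check before training, a small helper can verify that the manually downloaded files are in place. This is a sketch, not part of the repository; the file names `chameleon_filtered.npz` and `squirrel_filtered.npz` are assumptions, so adjust them to match the files you actually downloaded.

```python
from pathlib import Path

def check_filtered_datasets(root="dataset",
                            names=("chameleon_filtered.npz", "squirrel_filtered.npz")):
    """Return the expected .npz files that are missing under `root`.

    The file names are assumptions for illustration; rename them to
    match the filtered Chameleon/Squirrel files you downloaded.
    """
    root = Path(root)
    return [name for name in names if not (root / name).exists()]
```

An empty return value means both files were found and training can proceed.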
We propose a two-step training scheme in our paper:

1. Train a GNN without DropEdge using `train/train_gnn.py`.
2. Integrate AGGB into the pre-trained GNN layers and train only the AGGB parameters with DropEdge, keeping all other parameters frozen. This is done using `train/train_buffer.py`.
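The two steps above boil down to two script invocations per dataset. The helper below is purely illustrative (it is not in the repository); it only assumes the `--dataset` argument documented later in this README, and each returned command would be run with, e.g., `subprocess.run(cmd, check=True)`.

```python
def two_step_commands(dataset="cora"):
    """Return the two training commands of the scheme, in order.

    Illustrative only: step 1 trains the base GNN without DropEdge,
    step 2 trains only the AGGB parameters with DropEdge.
    """
    return [
        ["python", "train/train_gnn.py", "--dataset", dataset],
        ["python", "train/train_buffer.py", "--dataset", dataset],
    ]
```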
After executing `train/train_gnn.py`, the trained GNN weights are saved in the `results/` directory. The script `train/train_buffer.py` then loads these weights to integrate and train the AGGB module.
You can specify the dataset using the `--dataset` argument. A detailed description of all configurable arguments is available in `utils/arg_parser.py`. Hyperparameter configurations are primarily loaded from the corresponding JSON files in the `configs/` directory, based on the specified dataset. These configuration files take precedence over command-line arguments, so please ensure they are set correctly.
For reproducibility, example scripts used in our experiments are provided in the `scripts/` directory:

```bash
./scripts/cora.sh
```
If you find our work useful, please cite the following:

```bibtex
@inproceedings{lee2025aggregation,
  title={Aggregation Buffer: Revisiting DropEdge with a New Parameter Block},
  author={Dooho Lee and Myeong Kong and Sagad Hamid and Cheonwoo Lee and Jaemin Yoo},
  booktitle={Forty-second International Conference on Machine Learning},
  year={2025},
  url={https://openreview.net/forum?id=sUVOXjOglX}
}
```
For questions or feedback, please open an issue in this repository or contact dooho@kaist.ac.kr.