NAC Framework

This repo is the official implementation of "Do Not Train It: A Linear Neural Architecture Search of Graph Neural Networks" (Xu et al., ICML 2023)

Introduction

The organization of this repo is as follows:

|-- README.md # short introduction of the code
|-- nac # the Python implementation of this project, including the NAS searching phase and the finetuning phase
        |-- __init__.py
        |-- controller # NAS updating modules
        |-- lr_scheduler
        |-- model
        |-- optimizer
        |-- solver
        `-- utils
|-- configs  # typical usage configurations
`-- examples # the configurations for running experiments

Environment

First, set up the required environment.

conda create -n nac python=3.7
conda activate nac

# please change the CUDA/device version as needed
pip install -r requirements.txt
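
If the wheels pinned in requirements.txt do not match your CUDA toolkit, you may need to install PyTorch and the PyTorch Geometric extensions for your CUDA version explicitly. The versions below are only an illustration, assuming CUDA 11.3; they are not taken from this repo, so substitute the combination that matches your machine and requirements.txt.

# example only: PyTorch built for CUDA 11.3 (adjust the version/CUDA tag as needed)
pip install torch==1.12.1+cu113 --extra-index-url https://download.pytorch.org/whl/cu113

# example only: PyTorch Geometric and its compiled extensions for the same torch/CUDA build
pip install torch-scatter torch-sparse -f https://data.pyg.org/whl/torch-1.12.1+cu113.html
pip install torch-geometric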

Usage

Go to a workspace directory under examples, then follow the steps below (a consolidated sketch of the full workflow follows the steps).

  1. First, change the ROOT path in the scripts to the current path of the repo:
ROOT=/mnt/home/pxu22/codes/NAC   ->   change to the path of this repo
export PYTHONPATH=$ROOT:$PYTHONPATH
  2. Then, run the train script to get the searched result:
bash train.sh
  3. Finally, go to the Finetune directory to get the finetuned result of the searched architecture:
cd Finetune;
bash finetune.sh;
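
For reference, the full workflow looks roughly like the following. The <experiment> subdirectory name is a placeholder rather than an actual directory in this repo; use whichever example you want to run, and make sure the ROOT variable inside the scripts has already been edited as in step 1.

# placeholders: <experiment> stands for one of the example subdirectories;
# ROOT inside train.sh / finetune.sh is assumed to already point at this repo
cd examples/<experiment>

bash train.sh        # NAS searching phase: produces the searched architecture
cd Finetune
bash finetune.sh     # finetuning phase: trains the searched architecture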

Citation

@inproceedings{xu2023do,
  title={Do Not Train It: A Linear Neural Architecture Search of Graph Neural Networks},
  author={Peng Xu and Lin Zhang and Xuanzhou Liu and Jiaqi Sun and Yue Zhao and Haiqin Yang and Bei Yu},
  booktitle={International Conference on Machine Learning},
  year={2023}
}
