This repository contains the code for the paper "NAS-Bench-301 and the Case for Surrogate Benchmarks for Neural Architecture Search".
The surrogate models can be downloaded from figshare. This includes the models for v0.9 and v1.0 as well as the dataset used to train them. We also provide the full training logs for all architectures, including learning curves on the train, validation, and test sets. These can be downloaded automatically; see nasbench301/example.py.
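A minimal sketch of the download-and-query flow from example.py might look as follows. The `download_models`, `load_ensemble`, and `predict` names follow the nasbench301 package API, but the model-directory layout used here is an assumption; consult example.py for the exact paths.

```python
import os

def query_nb301(genotype_config, model_dir="nb_models"):
    """Download the v1.0 surrogates (first call only) and predict the
    validation accuracy of a DARTS genotype.

    Names follow the nasbench301 package API; the on-disk layout of the
    downloaded models is an assumption, not the repository's exact one.
    """
    import nasbench301 as nb  # pip install nasbench301

    # One-time download of the v1.0 surrogate models from figshare.
    nb.download_models(version="1.0", delete_zip=True, download_dir=model_dir)

    # Load the performance-prediction ensemble (here: the XGBoost surrogate).
    performance_model = nb.load_ensemble(
        os.path.join(model_dir, "nb_models_1.0", "xgb_v1.0"))

    # with_noise=True samples from the predictive distribution instead of
    # returning the mean prediction.
    return performance_model.predict(config=genotype_config,
                                     representation="genotype",
                                     with_noise=True)
```

Here `genotype_config` would be a DARTS Genotype object, as constructed in example.py.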
To install all requirements (this may take a few minutes), run
$ cat requirements.txt | xargs -n 1 pip install
$ pip install nasbench301
To install directly from GitHub instead:
$ git clone https://github.com/automl/nasbench301
$ cd nasbench301
$ cat requirements.txt | xargs -n 1 pip install
$ pip install .
To run the example
$ python3 nasbench301/example.py
To fit a surrogate model, run
$ python3 fit_model.py --model gnn_gin --nasbench_data PATH_TO_NB_301_DATA_ROOT --data_config_path configs/data_configs/nb_301.json --log_dir LOG_DIR
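For intuition, a surrogate benchmark replaces actually training each architecture with a regression model that maps an architecture encoding to its accuracy. The toy stand-in below (a 1-nearest-neighbour model over one-hot operation encodings, with fabricated accuracies) only illustrates that idea; the repository fits far stronger models such as the GNN selected by `--model gnn_gin` on the real NAS-Bench-301 data.

```python
# Toy illustration of the surrogate idea: predict an architecture's accuracy
# from its encoding instead of training it. Purely didactic; nothing like the
# repository's GNN/ensemble surrogates.

def encode(ops, vocab=("sep_conv_3x3", "dil_conv_5x5",
                       "max_pool_3x3", "skip_connect")):
    """One-hot-encode a list of operation names into a flat float vector."""
    vec = []
    for op in ops:
        vec.extend(1.0 if op == v else 0.0 for v in vocab)
    return vec

def fit_1nn(archs, accs):
    """'Fitting' a 1-NN surrogate = memorising (encoding, accuracy) pairs."""
    return [(encode(a), acc) for a, acc in zip(archs, accs)]

def predict(model, arch):
    """Return the accuracy stored for the nearest encoding (squared Euclidean)."""
    x = encode(arch)
    dist = lambda v: sum((a - b) ** 2 for a, b in zip(x, v))
    return min(model, key=lambda m: dist(m[0]))[1]

# Fabricated training data, for illustration only.
train_archs = [["sep_conv_3x3", "skip_connect"],
               ["max_pool_3x3", "max_pool_3x3"]]
train_accs = [94.1, 91.3]
model = fit_1nn(train_archs, train_accs)

# The query is closest to the first training architecture.
print(predict(model, ["sep_conv_3x3", "sep_conv_3x3"]))  # 94.1
```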
To create the dataset used for the benchmark, we trained the architectures using a modified version of Auto-PyTorch, which can be found here: https://github.com/automl/Auto-PyTorch/tree/nb301.
To train a model whose hyperparameters and architecture are described by a ConfigSpace representation (a .json file), first download the CIFAR-10 data as used by Auto-PyTorch and extract it to nasbench301/data_generation/datasets. Then run the following to start the training:
$ cd nasbench301/data_generation
$ python3 run_proxy.py --run_id 1 --config configs/config_0.json
The configuration file can be replaced by any other custom-generated one. See nasbench301/representations.py for converting an object in the DARTS Genotype representation to a ConfigSpace object.
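The sketch below shows the shape of that conversion: flattening a DARTS Genotype (lists of (operation, input-node) pairs per cell) into a dict of hyperparameter-style key/value pairs, which is what a ConfigSpace Configuration carries. The parameter naming scheme (`normal_edge_0_op` etc.) and the helper name are illustrative assumptions, not the exact ones in representations.py.

```python
from collections import namedtuple

# The DARTS Genotype structure: (op_name, input_node) pairs per cell type.
Genotype = namedtuple("Genotype", "normal normal_concat reduce reduce_concat")

def genotype_to_config_dict(genotype):
    """Flatten a Genotype into ConfigSpace-style key/value pairs.
    Illustrative naming scheme, not the repository's exact one."""
    config = {}
    for cell in ("normal", "reduce"):
        for i, (op, src) in enumerate(getattr(genotype, cell)):
            config[f"{cell}_edge_{i}_op"] = op       # which operation sits on the edge
            config[f"{cell}_edge_{i}_input"] = src   # which node feeds the edge
    return config

# A small hand-made genotype, for illustration only.
geno = Genotype(
    normal=[("sep_conv_3x3", 0), ("sep_conv_3x3", 1)],
    normal_concat=[2, 3, 4, 5],
    reduce=[("max_pool_3x3", 0), ("skip_connect", 2)],
    reduce_concat=[2, 3, 4, 5],
)
print(genotype_to_config_dict(geno)["normal_edge_0_op"])  # sep_conv_3x3
```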