
Supernet-shifting

Introduction

This repository is the official implementation of Supernet Shifting.

Boosting Order-Preserving and Transferability for Neural Architecture Search: a Joint Architecture Refined Search and Fine-tuning Approach (accepted by CVPR 2024)
Beichen Zhang, Xiaoxing Wang, Xiaohan Qin, Junchi Yan

Train From Scratch

1. Set Up Dataset and Prepare Flops Table

Download the ImageNet dataset and move the images into labeled folders. Download the Flops table used for the Flops calculation; it was proposed by SPOS and can be found at Link. The dataset structure should be:

data
|--- train                 ImageNet Training Dataset
|--- val                   ImageNet Validation Dataset
|--- op_flops_dict.pkl     Flops Table
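
As a quick sanity check before training, you can verify this layout and load the Flops table; the sketch below is illustrative and not part of this repository (DATA_ROOT is a placeholder, and the exact key format of the dict is defined by SPOS):

import os
import pickle

DATA_ROOT = "data"  # placeholder; point at the directory shown above

# Verify the train/val splits exist before launching training.
for split in ("train", "val"):
    path = os.path.join(DATA_ROOT, split)
    assert os.path.isdir(path), f"missing ImageNet split: {path}"

# The Flops table released by SPOS is a pickled dict mapping operator
# configurations to their Flops.
with open(os.path.join(DATA_ROOT, "op_flops_dict.pkl"), "rb") as f:
    op_flops_dict = pickle.load(f)
print(f"loaded Flops table with {len(op_flops_dict)} entries")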

2. Train Supernet

Train the supernet with the following command:

cd supernet
python3 train.py --train-dir $YOUR_TRAINDATASET_PATH --val-dir $YOUR_VALDATASET_PATH
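
For example, with the data layout from step 1:

python3 train.py --train-dir data/train --val-dir data/val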

3. Supernet Shifting and Architecture Searching

First, change the data root in imagenet_dataset.py (an illustrative sketch of this edit follows the command below). Then apply supernet shifting and architecture searching with the following command:

cd search
python3 search.py
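
The data-root edit mentioned above is typically a single line; this snippet is purely illustrative, as the actual variable name in imagenet_dataset.py may differ:

# in imagenet_dataset.py (illustrative; check the file for the real definition)
data_root = "/path/to/data"  # the directory prepared in step 1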

To transfer the supernet weights to a new dataset, first change the data root and the dataloader in imagenet_dataset.py, then run the following command:

cd search
python3 search.py --new_dataset True --n_class $new_dataset_classes 
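
If the new dataset follows the same class-per-folder layout as ImageNet, a standard torchvision ImageFolder loader is usually enough as the replacement dataloader; a minimal sketch, with illustrative paths and transforms (mirror whatever preprocessing imagenet_dataset.py applies for ImageNet):

import torch
from torchvision import datasets, transforms

# Illustrative replacement dataloader for a class-per-folder dataset.
transform = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
])
dataset = datasets.ImageFolder("/path/to/new_dataset/train", transform=transform)
loader = torch.utils.data.DataLoader(dataset, batch_size=128,
                                     shuffle=True, num_workers=8)
print(f"{len(dataset.classes)} classes")  # pass this number as --n_class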

4. Get Searched Architecture

Get the searched architecture with the following command:

cd evaluation
python3 eval.py

5. Train from Scratch

Finally, train and evaluate the searched architecture with the following command.

cd evaluation/data/$YOUR_ARCHITECTURE
python3 train.py --train-dir $YOUR_TRAINDATASET_PATH --val-dir $YOUR_VALDATASET_PATH

Citation

If you use these models in your research, please cite:

@article{zhang2024boosting,
        title={Boosting Order-Preserving and Transferability for Neural Architecture Search: a Joint Architecture Refined Search and Fine-tuning Approach},
        author={Beichen Zhang and Xiaoxing Wang and Xiaohan Qin and Junchi Yan},
        journal={arXiv preprint arXiv:2403.11380},
        year={2024}
}
