sunsmarterjie/DAAS

Discretization-Aware Architecture Search

DAAS has been accepted by PR (2021); the arXiv version is here.

Weight-sharing methods determine sub-networks by discretization, i.e., by pruning off weak candidates, and this discretization process incurs significant inaccuracy. We propose discretization-aware architecture search to alleviate this issue. The main idea is to introduce an additional term into the loss function, so that the architectural parameters of the super-network are gradually pushed towards the desired configuration during the search process.
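As a rough illustration of the idea (not the paper's exact formulation), one way to push architectural parameters towards a one-hot configuration is to penalize the entropy of their softmax. The sketch below is a minimal, framework-free version of such a penalty:

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of floats."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def entropy(probs):
    """Shannon entropy; zero iff the distribution is one-hot."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def discretization_penalty(alphas):
    """Sum of per-edge entropies of the operation weights.

    Added to the task loss (with a suitably scheduled coefficient),
    this term is minimized when every edge's weights are one-hot,
    i.e. when pruning the super-network discards almost nothing.
    """
    return sum(entropy(softmax(a)) for a in alphas)

undecided = [0.0, 0.0, 0.0, 0.0]   # uniform weights: large penalty
decided   = [8.0, 0.0, 0.0, 0.0]   # near one-hot: small penalty
print(discretization_penalty([undecided]))  # log(4) ≈ 1.386
print(discretization_penalty([decided]))    # ≈ 0.009
```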

Figure 1: Pipeline of DA2S

The algorithm is based on continuous relaxation and gradient descent in the architecture space. Only a single GPU is required.
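Continuous relaxation (as in DARTS, on which this code builds) replaces the categorical choice of operation on each edge with a softmax-weighted mixture, so the architecture parameters can be optimized by gradient descent. A toy sketch with made-up scalar "operations":

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of floats."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def mixed_op(x, alpha, ops):
    """Softmax-weighted mixture of candidate operations on one edge."""
    weights = softmax(alpha)
    return sum(w * op(x) for w, op in zip(weights, ops))

# Toy candidate set: identity, doubling, and the 'zero' operation.
ops = [lambda x: x, lambda x: 2 * x, lambda x: 0.0]

# With uniform alpha the edge is an even blend of all candidates ...
print(mixed_op(3.0, [0.0, 0.0, 0.0], ops))   # (3 + 6 + 0) / 3 ≈ 3.0
# ... and as one alpha dominates, the edge approaches a single operation.
print(mixed_op(3.0, [0.0, 10.0, 0.0], ops))  # ≈ 6.0
```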

Requirements

Python == 3.6, PyTorch == 0.4

Datasets

CIFAR-10 and ImageNet.

Architecture search

To carry out architecture search using 2nd-order approximation, run

cd cnn && python train_search.py  

DA2S: change of the softmax of operation weights α during the search procedure in a normal cell on CIFAR-10.

DA2S: change of the softmax of edge weights β of nodes 3/4/5 during the search procedure in a normal cell searched on CIFAR-10.

DARTS: change of the softmax of operation weights α during the search procedure in a normal cell on CIFAR-10.

Architecture evaluation (using full-sized models)

To evaluate our best cells by training from scratch, run

cd cnn && python train.py --auxiliary --cutout            # CIFAR-10
cd cnn && python train_imagenet.py --auxiliary            # ImageNet

Customized architectures are supported through the --arch flag once they are specified in genotypes.py.
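Assuming genotypes.py follows the DARTS Genotype convention (a namedtuple describing the normal and reduction cells; the name MY_ARCH and the operation choices below are hypothetical, for illustration only), a custom architecture could be registered like this:

```python
from collections import namedtuple

Genotype = namedtuple('Genotype', 'normal normal_concat reduce reduce_concat')

# Hypothetical entry for genotypes.py: each pair is
# (operation name, index of the input node it acts on).
MY_ARCH = Genotype(
    normal=[('sep_conv_3x3', 0), ('sep_conv_3x3', 1),
            ('sep_conv_3x3', 0), ('sep_conv_3x3', 1),
            ('skip_connect', 0), ('sep_conv_3x3', 1),
            ('skip_connect', 0), ('dil_conv_3x3', 2)],
    normal_concat=range(2, 6),
    reduce=[('max_pool_3x3', 0), ('max_pool_3x3', 1),
            ('skip_connect', 2), ('max_pool_3x3', 1),
            ('max_pool_3x3', 0), ('skip_connect', 2),
            ('skip_connect', 2), ('max_pool_3x3', 1)],
    reduce_concat=range(2, 6),
)
```

It could then be evaluated with, e.g., `python train.py --arch MY_ARCH --auxiliary --cutout`.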
