Add code examples for running different searchers on CIFAR10 dataset (#…
chengchengeasy authored and haifeng-jin committed Feb 5, 2019
1 parent c1e1bca commit 85d972a
Showing 4 changed files with 72 additions and 6 deletions.
5 changes: 2 additions & 3 deletions autokeras/search.py
@@ -8,15 +8,15 @@
import torch.multiprocessing as mp


-from abc import abstractmethod
+from abc import ABC, abstractmethod
from datetime import datetime
from autokeras.bayesian import BayesianOptimizer
from autokeras.constant import Constant
from autokeras.nn.model_trainer import ModelTrainer
from autokeras.utils import pickle_to_file, pickle_from_file, verbose_print, get_system


-class Searcher:
+class Searcher(ABC):
    """The base class to search for neural architectures.
    This class generates new architectures, calls the trainer to train them, and updates the optimizer.
@@ -234,7 +234,6 @@ def generate(self, multiprocessing_queue):
        """
        pass

-    @abstractmethod
    def update(self, *args):
        pass

60 changes: 60 additions & 0 deletions examples/nas/cifar10_tutorial.py
@@ -0,0 +1,60 @@
"""
Run NAS baseline methods
========================
We provide 4 NAS baseline methods now, the default one is bayesian optimization.
Here is a tutorial about running NAS baseline methods.
Generally, to run a non-default NAS methods, we will do the following steps in order:
1. Prepare the dataset in the form of torch.utils.data.DataLoader.
2. Initialize the CnnModule/MlpModule with the class name of the NAS Searcher.
3. Start search by running fit function.
Refer the cifar10 example below for more details.
"""
import numpy as np
import torch
import torchvision
import torchvision.transforms as transforms
from torch.nn.functional import cross_entropy

from autokeras import CnnModule
from autokeras.nn.metric import Accuracy
from nas.greedy import GreedySearcher

if __name__ == '__main__':
    print('==> Preparing data..')
    transform_train = transforms.Compose([
        transforms.RandomCrop(32, padding=4),
        transforms.RandomHorizontalFlip(),
        transforms.ToTensor(),
        transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
    ])

    transform_test = transforms.Compose([
        transforms.ToTensor(),
        transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
    ])

    trainset = torchvision.datasets.CIFAR10(root='./data', train=True,
                                            download=True, transform=transform_train)
    trainloader = torch.utils.data.DataLoader(trainset, batch_size=4,
                                              shuffle=True, num_workers=2)

    testset = torchvision.datasets.CIFAR10(root='./data', train=False,
                                           download=True, transform=transform_test)
    testloader = torch.utils.data.DataLoader(testset, batch_size=4,
                                             shuffle=False, num_workers=2)

    (image, target) = trainset[0]
    image = np.array(image).transpose((1, 2, 0))
    # Add a batch dimension to get the input shape expected by CnnModule.
    input_shape = np.expand_dims(image, axis=0).shape
    num_classes = 10

    # Take GreedySearcher as an example; you can implement your own searcher
    # and pass the class to CnnModule via search_type=YOUR_SEARCHER.
    cnnModule = CnnModule(loss=cross_entropy, metric=Accuracy,
                          searcher_args={}, verbose=True,
                          search_type=GreedySearcher)

    cnnModule.fit(n_output_node=num_classes, input_shape=input_shape,
                  train_data=trainloader,
                  test_data=testloader)
13 changes: 10 additions & 3 deletions nas/README.md
@@ -26,15 +26,22 @@ which may conflict with each other.

## Baseline methods implemented

-Description of each baseline method.
+We have implemented four NAS baseline methods:
+1. random search: we explore the search space by morphing the network architectures randomly, so the actual performance of a generated neural architecture has no effect on the later search.
+2. grid search: we search over a manually specified subset of the hyperparameter space, i.e., the number of layers and the width of the layers are predefined.
+3. greedy search: we explore the search space in a greedy way. The base architecture for the next iteration of search is chosen from those generated in the current iteration; in our implementation it is the one with the best performance on the training/validation set.
+4. Bayesian optimization: currently the default search strategy of Auto-Keras. Refer to [arXiv:1806.10282](https://arxiv.org/abs/1806.10282) for more details.

## How to run the baseline methods?

-Code example containing two parts: data preparation, search.
-Should call CnnModule.
+Refer to examples/nas/cifar10_tutorial.py for more details.
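
Condensed from that tutorial, a minimal end-to-end run looks roughly like this (the data pipeline is simplified; the tutorial adds augmentation and normalization transforms, and computes input_shape from a sample instead of hard-coding it):

```python
import torch
import torchvision
import torchvision.transforms as transforms
from torch.nn.functional import cross_entropy

from autokeras import CnnModule
from autokeras.nn.metric import Accuracy
from nas.greedy import GreedySearcher

if __name__ == '__main__':
    # 1. Prepare the dataset as torch.utils.data.DataLoader objects.
    transform = transforms.ToTensor()
    trainset = torchvision.datasets.CIFAR10(root='./data', train=True,
                                            download=True, transform=transform)
    testset = torchvision.datasets.CIFAR10(root='./data', train=False,
                                           download=True, transform=transform)
    trainloader = torch.utils.data.DataLoader(trainset, batch_size=4, shuffle=True)
    testloader = torch.utils.data.DataLoader(testset, batch_size=4, shuffle=False)

    # 2. Initialize CnnModule with the searcher class to use.
    cnn_module = CnnModule(loss=cross_entropy, metric=Accuracy,
                           searcher_args={}, verbose=True,
                           search_type=GreedySearcher)

    # 3. Start the search. CIFAR10 images are 32x32 RGB, so the
    #    batch-shaped input is (1, 32, 32, 3).
    cnn_module.fit(n_output_node=10, input_shape=(1, 32, 32, 3),
                   train_data=trainloader, test_data=testloader)
```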


## How to implement your own search?
To implement your own NAS searcher, write a searcher class, YOUR_SEARCHER, derived
from the base [Searcher](https://github.com/jhfjhfj1/autokeras/blob/master/autokeras/search.py) class. Your
class must provide an implementation of the abstract method [generate(self, multiprocessing_queue)](https://github.com/jhfjhfj1/autokeras/blob/d6bea7369186df842dfb8ea3ed779cbd1b8f7c40/autokeras/search.py#L223),
which is invoked to generate the next neural architecture. A minimal sketch is shown below.
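
The sketch below assumes only generate must be overridden and that, like the built-in searchers, it returns a list of (graph, other_info) pairs; check the base Searcher class in your version for the exact contract:

```python
from autokeras.search import Searcher


class MySearcher(Searcher):
    """A sketch of a custom NAS searcher derived from the base Searcher class."""

    def generate(self, multiprocessing_queue):
        # Produce the next candidate architecture(s) to train. Assumption:
        # like the built-in searchers, this returns a list of
        # (graph, other_info) pairs describing the new candidates.
        raise NotImplementedError

    def update(self, *args):
        # Optionally record the results of trained candidates to guide
        # the next round of generation; the base class provides a default.
        pass
```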

You are welcome to implement your own method for NAS in our framework.
If it works well, we are happy to merge it into our repo.
Empty file added nas/__init__.py
