Commit

fix bugs and update README
tanglang96 committed May 21, 2019
1 parent fbffa45 commit 7beccb8
Showing 13 changed files with 33 additions and 505 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -1,4 +1,5 @@
.idea
.git
__pycache__
run_all.sh
get_time_all.sh
8 changes: 5 additions & 3 deletions README.md
@@ -1,10 +1,12 @@
# Multinomial Distribution Learning for Effective Neural Architecture Search

Here we propose a method to extremely accelerate NAS, **without reinforcement learning or gradient**, just by sampling architectures from a distribution and comparing these architectures, Iteratively updating parameters of distribution while training
Here we propose a method to **extremely accelerate** NAS, **without reinforcement learning or gradient**: we simply sample architectures from a distribution and compare them, estimating their **relative performance** rather than their absolute performance, and iteratively update the parameters of the distribution while training.

![](figs/1.png)
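
To make the idea concrete, below is a minimal, self-contained sketch of this sampling-and-comparison loop for a single edge with a handful of candidate operations. The pool size, the `evaluate` stub, and the probability-update heuristic are illustrative assumptions only; they are not the authors' search code (which is to be released separately, see below).

```python
import numpy as np

rng = np.random.default_rng(0)
num_ops = 8                                # candidate operations for one edge
probs = np.full(num_ops, 1.0 / num_ops)    # parameters of the multinomial distribution

def evaluate(op_index):
    """Stand-in for a cheap proxy evaluation (e.g. accuracy on a few mini-batches)."""
    return rng.random()

for step in range(1000):
    # sample two candidates from the current distribution, without replacement
    a, b = rng.choice(num_ops, size=2, replace=False, p=probs)
    # only their relative performance matters, not the absolute accuracy values
    winner, loser = (a, b) if evaluate(a) >= evaluate(b) else (b, a)
    probs[winner] += 0.01                  # illustrative update rule, not the paper's
    probs[loser] = max(probs[loser] - 0.01, 1e-3)
    probs /= probs.sum()                   # renormalize to keep a valid distribution

print("most likely op:", int(np.argmax(probs)))
```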

Here we provide our test codes and pretrained model, our code is based on [DARTS](<https://github.com/khanrc/pt.darts>) and [ProxylessNAS](<https://github.com/mit-han-lab/ProxylessNAS>), pretrained models can be downloaded [here](https://drive.google.com/open?id=1W0UqwAnm37uibTuPDrH5Mt8PKNvFdD3v)
Here we provide our test code and pretrained models. Our code is based on [DARTS](<https://github.com/khanrc/pt.darts>) and [ProxylessNAS](<https://github.com/mit-han-lab/ProxylessNAS>); the pretrained models can be downloaded [here](https://drive.google.com/open?id=1W0UqwAnm37uibTuPDrH5Mt8PKNvFdD3v).

The **search code** will be released by [Sherwood](https://github.com/zhengxiawu) later!

## Requirements

@@ -13,7 +15,7 @@ Here we provide our test codes and pretrained model, our code is based on [DARTS

## Evaluate

You need to modified your path to dataset in ``` data_providers/cifar10.py``` and ```data_providers/imagenet.py```
You need to modify your dataset path in ```data_providers/cifar10.py``` and ```data_providers/imagenet.py```. ```config.sh``` is used to prepare your environment; you should **write this file by yourself**, and here we use it to prepare the dataset and packages.
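
As a rough illustration of the kind of edit meant here (the actual variable names inside ```data_providers/cifar10.py``` may differ), the change usually amounts to pointing a dataset root at your local copy, e.g. with torchvision:

```python
import torchvision

# hypothetical path constant; adapt the corresponding value in data_providers/cifar10.py
CIFAR10_ROOT = '/path/to/your/cifar10'

test_set = torchvision.datasets.CIFAR10(root=CIFAR10_ROOT, train=False, download=True)
```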

To evaluate the model in the **DARTS setting**, just run

2 changes: 1 addition & 1 deletion models/darts_nets_cifar/augment_cells.py
@@ -1,8 +1,8 @@
""" CNN cell for network augmentation """
import torch
import torch.nn as nn
from models.darts_nets_cifar import ops
from utils import *
from models.darts_nets_cifar import ops

def to_dag(C_in, gene, reduction):
""" generate discrete ops from gene """
54 changes: 0 additions & 54 deletions models/darts_nets_cifar/search_cells.py

This file was deleted.

131 changes: 0 additions & 131 deletions models/darts_nets_cifar/search_cnn.py

This file was deleted.

21 changes: 20 additions & 1 deletion models/darts_nets_imagenet/augment_cells.py
@@ -1,9 +1,28 @@
""" CNN cell for network augmentation """
import torch
import torch.nn as nn
from models.darts_nets_imagenet import ops
from utils import *
from models.darts_nets_imagenet import ops

def to_dag(C_in, gene, reduction):
    """ generate discrete ops from gene """
    dag = nn.ModuleList()
    for edges in gene:
        row = nn.ModuleList()
        for op_name, s_idx in edges:
            # reduction cell & from input nodes => stride = 2
            stride = 2 if reduction and s_idx < 2 else 1
            op = ops.OPS[op_name](C_in, stride, True)
            if not isinstance(op, ops.Identity):  # Identity does not use drop path
                op = nn.Sequential(
                    op,
                    ops.DropPath_()
                )
            op.s_idx = s_idx
            row.append(op)
        dag.append(row)

    return dag
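
# Illustrative usage sketch (not part of the repository file): the gene format and
# op names below are assumptions based on DARTS-style genotypes, where each edge
# is a list of (op_name, source_index) pairs, e.g.
#
#   gene = [
#       [('sep_conv_3x3', 0), ('sep_conv_3x3', 1)],
#       [('skip_connect', 0), ('sep_conv_3x3', 2)],
#   ]
#   dag = to_dag(C_in=48, gene=gene, reduction=False)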

class AugmentCell(nn.Module):
""" Cell for augmentation
20 changes: 0 additions & 20 deletions models/darts_nets_imagenet/ops.py
@@ -186,24 +186,4 @@ def forward(self, x):
        return out


class MixedOp(nn.Module):
    """ Mixed operation """

    def __init__(self, C, stride):
        super().__init__()
        self._ops = nn.ModuleList()
        for primitive in PRIMITIVES:
            # a dict lets us look up the op constructors by index: the keys are the primitive names and the values are lambda expressions
            op = OPS[primitive](C, stride, affine=False)
            self._ops.append(op)

    def forward(self, x, weights):
        """
        Args:
            x: input
            weights: weight for each operation
        """
        return sum(w * op(x) for w, op in zip(weights, self._ops))
        # index = torch.multinomial(weights, 1)
        # sum = self._ops[index](x)
        # return sum
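
The commented-out lines above hint at the sampling alternative that this repository builds on: instead of a weighted sum over all candidate operations, draw a single operation from the weight distribution. A minimal sketch of that variant (the `SampledOp` name and wiring are assumptions for illustration, not code from this commit):

```python
import torch
import torch.nn as nn

class SampledOp(nn.Module):
    """Apply one candidate op sampled from `weights` instead of the weighted sum."""

    def __init__(self, ops_list):
        super().__init__()
        self._ops = nn.ModuleList(ops_list)

    def forward(self, x, weights):
        # weights: 1-D tensor of non-negative per-op weights
        index = torch.multinomial(weights, 1).item()
        return self._ops[index](x)
```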
54 changes: 0 additions & 54 deletions models/darts_nets_imagenet/search_cells.py

This file was deleted.
