Commit

Merge pull request #114 from Microsoft/master

merge master

SparkSnail committed Jan 14, 2019
2 parents 3784355 + efa479b commit d91c980
Showing 35 changed files with 564 additions and 437 deletions.
2 changes: 2 additions & 0 deletions .gitignore
@@ -67,3 +67,5 @@ typings/
__pycache__
build
*.egg-info

.vscode
1 change: 0 additions & 1 deletion README.md
@@ -134,4 +134,3 @@ We are in construction of the instruction for [How to Debug](docs/HowToDebug.md)

## **License**
The entire codebase is under [MIT license](https://github.com/Microsoft/nni/blob/master/LICENSE)

49 changes: 31 additions & 18 deletions docs/GetStarted.md
@@ -2,27 +2,35 @@
===

## **Installation**

* __Dependencies__

```bash
python >= 3.5
git
wget
```

python pip should also be correctly installed. You could use `python3 -m pip -V` to check in Linux.

* Note: we don't support virtual environment in current releases.

* __Install NNI through pip__

```bash
python3 -m pip install --user --upgrade nni
```

* __Install NNI through source code__

```bash
git clone -b v0.4.1 https://github.com/Microsoft/nni.git
cd nni
source install.sh
```
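As a quick sanity check for the dependency requirements listed above, the following can be run from a shell (a convenience sketch, not part of the official installation steps; the tool names come from the Dependencies list):

```shell
# Verify the prerequisites from the Dependencies list.
python3 -c 'import sys; assert sys.version_info >= (3, 5), "need python >= 3.5"'
for tool in git wget; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: missing"
  fi
done
```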

## **Quick start: run a customized experiment**

An experiment is to run multiple trial jobs, each trial job tries a configuration which includes a specific neural architecture (or model) and hyper-parameter values. To run an experiment through NNI, you should:

* Provide a runnable trial
@@ -32,22 +40,26 @@ An experiment is to run multiple trial jobs, each trial job tries a configuratio

**Prepare trial**: Let's use a simple trial example, e.g. mnist, provided by NNI. After you install NNI, the examples are placed in ~/nni/examples; run `ls ~/nni/examples/trials` to see all the trial examples. You can simply execute the following command to run the NNI mnist example:

```bash
python3 ~/nni/examples/trials/mnist-annotation/mnist.py
```

This command will be filled into the YAML configuration file below. Please refer to [here](howto_1_WriteTrial.md) for how to write your own trial.

**Prepare tuner**: NNI supports several popular AutoML algorithms, including Random Search, Tree of Parzen Estimators (TPE), Evolution algorithm, etc. Users can write their own tuner (refer to [here](howto_2_CustomizedTuner.md)), but for simplicity, here we choose a tuner provided by NNI as below:

```yaml
tuner:
builtinTunerName: TPE
classArgs:
optimize_mode: maximize
```

*builtinTunerName* is used to specify a tuner in NNI, *classArgs* are the arguments passed to the tuner, and *optimize_mode* indicates whether you want to maximize or minimize your trial's result.
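If your trial reports a loss instead of an accuracy, the same tuner section would set the mode to minimize (a sketch varying only the field discussed above; the values remain illustrative):

```yaml
tuner:
  builtinTunerName: TPE
  classArgs:
    optimize_mode: minimize   # minimize the reported result (e.g. a loss)
```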

**Prepare configure file**: Now that you know which trial code you are going to run and which tuner you are going to use, it is time to prepare the YAML configuration file. NNI provides a demo configuration file for each trial example; run `cat ~/nni/examples/trials/mnist-annotation/config.yml` to see it. Its content is basically shown below:

```yaml
authorName: your_name
experimentName: auto_mnist

Expand All @@ -73,7 +85,7 @@ trial:
command: python mnist.py
codeDir: ~/nni/examples/trials/mnist-annotation
gpuNum: 0
```

Here *useAnnotation* is true because this trial example uses our python annotation (refer to [here](../tools/annotation/README.md) for details). For the trial, we should provide the *command* that runs the trial and the *codeDir* where the trial code is; the command will be executed in this directory. We should also provide how many GPUs a trial requires via *gpuNum*.
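The annotation style itself looks like the sketch below (the variable name and values are illustrative; see the annotation README linked above for the authoritative syntax):

```python
# NNI annotations are plain string literals, so this file also runs unmodified
# outside NNI; with useAnnotation: true, NNI rewrites the annotated lines to
# inject values sampled by the tuner at run time.
"""@nni.variable(nni.choice(0.01, 0.1), name=learning_rate)"""
learning_rate = 0.01  # default used when running without NNI

accuracy = 0.9  # stand-in for a real evaluation result
"""@nni.report_final_result(accuracy)"""
print(learning_rate, accuracy)
```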

@@ -87,6 +99,7 @@ You can refer to [here](NNICTLDOC.md) for more usage guide of *nnictl* command l
The experiment is now running. NNI provides a WebUI for you to view experiment progress, control your experiment, and use some other appealing features. The WebUI is opened by default by `nnictl create`.
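For reference, launching the mnist example with *nnictl* looks like the following (guarded with a check so the sketch also runs on machines where NNI is not yet installed):

```shell
CONFIG="$HOME/nni/examples/trials/mnist-annotation/config.yml"
if command -v nnictl >/dev/null 2>&1; then
  nnictl create --config "$CONFIG"   # starts the experiment and serves the WebUI
else
  echo "nnictl not found - install NNI first (see Installation)"
fi
```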

## Read more

* [Tuners supported in the latest NNI release](./HowToChooseTuner.md)
* [Overview](Overview.md)
* [Installation](Installation.md)
2 changes: 1 addition & 1 deletion docs/HowToDebug.md
@@ -1,4 +1,4 @@
**How to Debug in NNI**
===

*Coming soon*
28 changes: 17 additions & 11 deletions docs/Installation.md
@@ -6,25 +6,31 @@ Currently we only support installation on Linux & Mac.
## **Installation**
* __Dependencies__

```bash
python >= 3.5
git
wget
```

python pip should also be correctly installed. You could use `python3 -m pip -V` to check the pip version.

* __Install NNI through pip__

```bash
python3 -m pip install --user --upgrade nni
```

* __Install NNI through source code__

```bash
git clone -b v0.4.1 https://github.com/Microsoft/nni.git
cd nni
source install.sh
```

* __Install NNI in docker image__

You can also install NNI in a docker image. Please follow the instructions [here](../deployment/docker/README.md) to build the NNI docker image. The NNI docker image can also be retrieved from Docker Hub through the command `docker pull msranni/nni:latest`.

## **System requirements**

@@ -52,8 +58,8 @@ Below are the minimum system requirements for NNI on macOS. Due to potential pro
|**Internet**|Broadband internet connection|
|**Resolution**|1024 x 768 minimum display resolution|


## Further reading

* [Overview](Overview.md)
* [Use command line tool nnictl](NNICTLDOC.md)
* [Use NNIBoard](WebUI.md)
3 changes: 1 addition & 2 deletions docs/KubeflowMode.md
@@ -43,7 +43,7 @@ kubeflowConfig:
```
Users who want to use tf-operator can set `ps` and `worker` in the trial config; users who want to use pytorch-operator can set `master` and `worker` instead.
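As an illustrative sketch only (the replica field names below are assumptions based on the description above; consult the full kubeflow-mode examples in this document for the exact schema), a tf-operator trial section could look like:

```yaml
trial:
  codeDir: .
  worker:           # replica role consumed by tf-operator
    replicas: 2
    command: python3 mnist.py
    gpuNum: 1
  ps:               # parameter-server role; use master/worker for pytorch-operator
    replicas: 1
    command: python3 mnist.py
    gpuNum: 0
```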

## Supported storage type
NNI supports NFS and Azure Storage to store the code and output files; users can set the storage type in the config file along with the corresponding settings.
The settings for NFS storage are as follows:
```
@@ -197,4 +197,3 @@ Notice: In kubeflow mode, NNIManager will start a rest server and listen on a po
Once a trial job is completed, you can go to NNI WebUI's overview page (like http://localhost:8080/oview) to check the trial's information.

For any problems when using NNI in kubeflow mode, please create issues on the [NNI github repo](https://github.com/Microsoft/nni), or send mail to nni@microsoft.com
