This repository has been archived by the owner on Sep 18, 2024. It is now read-only.

Commit 7b75539: Add doc about frameworks and libraries (#1407)
rabbit008 authored and suiguoxin committed Aug 14, 2019
1 parent 8ca3dd1
Showing 2 changed files with 50 additions and 2 deletions.
4 changes: 2 additions & 2 deletions docs/en_US/CommunitySharings/HpoComparision.md

```diff
@@ -1,7 +1,7 @@
-# Hyperparameter Optimization Comparison
+# Hyper Parameter Optimization Comparison
 *Posted by Anonymous Author*
 
-Comparison of Hyperparameter Optimization algorithms on several problems.
+Comparison of Hyperparameter Optimization (HPO) algorithms on several problems.
 
 Hyperparameter Optimization algorithms are list below:
 
```
48 changes: 48 additions & 0 deletions docs/en_US/SupportedFramework_Library.md
# Framework and Library Supports
With its built-in Python API, NNI naturally supports hyperparameter tuning and neural architecture search for any AI framework or library that supports Python models (`version >= 3.5`). NNI also provides a set of examples and tutorials for popular scenarios to make getting started easier.
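The Python API mentioned above follows a simple trial pattern: fetch a hyperparameter set from the tuner, train, and report the result back. The sketch below is a minimal illustration, assuming the `nni` package's `get_next_parameter()` and `report_final_result()` trial calls; it falls back to hard-coded defaults so it also runs standalone without NNI.

```python
"""Minimal sketch of an NNI trial script (illustration, not a full example)."""
try:
    import nni  # NNI trial API, if installed
except ImportError:
    nni = None

def get_params():
    # Ask the NNI tuner for the next hyperparameter set,
    # or fall back to defaults for a standalone dry run.
    if nni is not None:
        return nni.get_next_parameter()
    return {"lr": 0.01, "batch_size": 32}

def train_and_eval(params):
    # Placeholder for real training; returns a toy "accuracy"
    # that peaks when lr is near 0.01.
    return 1.0 / (1.0 + abs(params["lr"] - 0.01))

if __name__ == "__main__":
    params = get_params()
    accuracy = train_and_eval(params)
    if nni is not None:
        nni.report_final_result(accuracy)  # feed the metric back to the tuner
    print(accuracy)
```

In a real experiment, `train_and_eval` would be replaced by actual model training in any of the frameworks listed below; NNI only needs the parameter dict in and the metric out.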

## Supported AI Frameworks

* <b>[PyTorch]</b> https://github.com/pytorch/pytorch
<ul>
<li><a href="../../examples/trials/mnist-distributed-pytorch">MNIST-pytorch</a><br/></li>
<li><a href="TrialExample/Cifar10Examples.md">CIFAR-10</a><br/></li>
<li><a href="../../examples/trials/kaggle-tgs-salt/README.md">TGS salt identification challenge</a><br/></li>
<li><a href="../../examples/trials/network_morphism/README.md">Network_morphism</a><br/></li>
</ul>
* <b>[TensorFlow]</b> https://github.com/tensorflow/tensorflow
<ul>
<li><a href="../../examples/trials/mnist-distributed">MNIST-tensorflow</a><br/></li>
<li><a href="../../examples/trials/ga_squad/README.md">SQuAD</a><br/></li>
</ul>
* <b>[Keras]</b> https://github.com/keras-team/keras
<ul>
<li><a href="../../examples/trials/mnist-keras">MNIST-keras</a><br/></li>
<li><a href="../../examples/trials/network_morphism/README.md">Network_morphism</a><br/></li>
</ul>
* <b>[MXNet]</b> https://github.com/apache/incubator-mxnet
* <b>[Caffe2]</b> https://github.com/BVLC/caffe
* <b>[CNTK (Python language)]</b> https://github.com/microsoft/CNTK
* <b>[Spark MLlib]</b> http://spark.apache.org/mllib/
* <b>[Chainer]</b> https://chainer.org/
* <b>[Theano]</b> https://pypi.org/project/Theano/ <br/>

You are encouraged to [contribute more examples](Tutorial/Contributing.md) for other NNI users.

## Supported Libraries
NNI also supports all libraries written in Python. Listed here are some common libraries, including several GBDT-based ones: XGBoost, CatBoost, and LightGBM.
* <b>[Scikit-learn]</b> https://scikit-learn.org/stable/
<ul>
<li><a href="TrialExample/SklearnExamples.md">Scikit-learn</a><br/></li>
</ul>
* <b>[XGBoost]</b> https://xgboost.readthedocs.io/en/latest/
* <b>[CatBoost]</b> https://catboost.ai/
* <b>[LightGBM]</b> https://lightgbm.readthedocs.io/en/latest/
<ul>
<li><a href="TrialExample/GbdtExample.md">Auto-gbdt</a><br/></li>
</ul>
This is only a small sample of the libraries supported by NNI. If you are interested in NNI, you can refer to the [tutorial](TrialExample/Trials.md) to build your own experiments.
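Whatever library the trial uses, the tuner draws hyperparameters from a search space. The dict below mirrors NNI's `search_space.json` format (`_type`/`_value` entries such as `choice`, `loguniform`, and `randint`); the sampler is a toy stand-in for an NNI tuner, written only to illustrate what each entry means, and is not NNI code.

```python
import math
import random

# Search space in NNI's JSON format: each entry names a sampling
# _type and its candidate _value(s). This one mimics tuning a GBDT.
search_space = {
    "n_estimators": {"_type": "choice", "_value": [50, 100, 200]},
    "learning_rate": {"_type": "loguniform", "_value": [0.001, 0.1]},
    "max_depth": {"_type": "randint", "_value": [3, 10]},
}

def sample(space, rng=random):
    """Toy sampler standing in for an NNI tuner (illustration only)."""
    params = {}
    for name, spec in space.items():
        kind, values = spec["_type"], spec["_value"]
        if kind == "choice":
            # pick one of the listed candidates
            params[name] = rng.choice(values)
        elif kind == "loguniform":
            # sample uniformly in log space between the two bounds
            lo, hi = values
            params[name] = math.exp(rng.uniform(math.log(lo), math.log(hi)))
        elif kind == "randint":
            # integer in [lo, hi)
            lo, hi = values
            params[name] = rng.randrange(lo, hi)
    return params
```

A sampled dict can then be passed straight into, say, a LightGBM or XGBoost model constructor as keyword arguments.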



In addition to the examples above, we welcome more users to apply NNI to their own work; if you have any questions, please refer to [Write a Trial Run on NNI](TrialExample/Trials.md). In particular, if you want to become a contributor to NNI, whether by sharing examples, writing Tuners, or otherwise, we look forward to your participation. For more information, please refer to [here](Tutorial/Contributing.md).
