Extend documentation #23

Merged
merged 2 commits on Dec 1, 2021
3 changes: 2 additions & 1 deletion _doc/sphinxdoc/source/onnxmd/index.rst
@@ -8,6 +8,7 @@ The documentation can be searched.

.. toctree::
:maxdepth: 1

index_onnx
index_onnxruntime
onnxruntime_python/index
18 changes: 8 additions & 10 deletions _doc/sphinxdoc/source/onnxmd/index_onnx.rst
@@ -1,17 +1,16 @@


ONNX documentation rendered with Sphinx
=======================================

.. contents::
:local:

Overview
++++++++

.. toctree::
:maxdepth: 1

onnx_docs/Overview.md
onnx_docs/IR.md
onnx_docs/PythonAPIOverview.md
@@ -22,7 +21,7 @@ Overview
onnx_docs/Hub.md
onnx_metadata
onnx_docs/ShapeInference.md

Syntax
++++++

@@ -36,7 +35,7 @@ Versions

.. toctree::
:maxdepth: 1

onnx_docs/Versioning.md
onnx_docs/VersionConverter.md
onnx_docs/Relicensing.md
@@ -47,20 +46,20 @@ Operators

.. toctree::
:maxdepth: 1

onnx_operators
onnx_operators_ml
onnx_changelog
onnx_changelog_ml
onnx_test_coverage
onnx_test_coverage_ml

Contribute
++++++++++

.. toctree::
:maxdepth: 1

onnx_contributing
onnx_add_new_op
onnx_docs/ImplementingAnOnnxBackend.md
@@ -76,6 +75,5 @@ Training

.. toctree::
:maxdepth: 1

onnx_docs/DefineDifferentiability.md

23 changes: 11 additions & 12 deletions _doc/sphinxdoc/source/onnxmd/index_onnxruntime.rst
@@ -1,5 +1,4 @@


onnxruntime markdown documentation rendered with Sphinx
=======================================================

@@ -11,59 +10,59 @@ The following pages render the `markdown documentation

.. contents::
:local:

Overview
++++++++

.. toctree::
:maxdepth: 1

onnxruntime_docs/Roadmap.md
onnxruntime_docs/Privacy.md
onnxruntime_docs/Server.md
onnxruntime_docs/ONNX_Runtime_Server_Usage.md
onnxruntime_docs/FAQ.md
onnxruntime_docs/OperatorKernels.md

Versions
++++++++

.. toctree::
:maxdepth: 1

onnxruntime_docs/Versioning.md

Contributing
++++++++++++

.. toctree::
:maxdepth: 1

onnxruntime_docs/Coding_Conventions_and_Standards.md
onnxruntime_docs/ABI_Dev_Notes.md
onnxruntime_docs/PR_Guidelines.md
onnxruntime_docs/Model_Test.md
onnxruntime_docs/NotesOnThreading.md
onnxruntime_docs/Python_Dev_Notes.md

C API
+++++

.. toctree::
:maxdepth: 1

onnxruntime_docs/How_To_Update_ONNX_Dev_Notes.md
onnxruntime_docs/C_API_Guidelines.md
onnxruntime_docs/cmake_guideline.md
onnxruntime_docs/onnxruntime_extensions.md
onnxruntime_docs/ContribOperators.md

Others
++++++

.. toctree::
:maxdepth: 1

onnxruntime_docs/Android_testing.md
onnxruntime_docs/ORTMobilePackageOperatorTypeSupport.md
onnxruntime_docs/WinML_principles.md
3 changes: 1 addition & 2 deletions _doc/sphinxdoc/source/onnxmd/onnx_add_new_op.rst
@@ -1,9 +1,8 @@


Adding a new operator
=====================

.. toctree::
:maxdepth: 1

onnx_docs/AddNewOp.md
3 changes: 1 addition & 2 deletions _doc/sphinxdoc/source/onnxmd/onnx_changelog.rst
@@ -1,9 +1,8 @@


Change Logs
===========

.. toctree::
:maxdepth: 1

onnx_docs/Changelog.md
3 changes: 1 addition & 2 deletions _doc/sphinxdoc/source/onnxmd/onnx_changelog_ml.rst
@@ -1,9 +1,8 @@


ML Change Logs
==============

.. toctree::
:maxdepth: 1

onnx_docs/Changelog-ml.md
3 changes: 1 addition & 2 deletions _doc/sphinxdoc/source/onnxmd/onnx_contributing.rst
@@ -1,9 +1,8 @@


Contributing
============

.. toctree::
:maxdepth: 1

onnx_docs/CONTRIBUTING.md
1 change: 0 additions & 1 deletion _doc/sphinxdoc/source/onnxmd/onnx_docs/Broadcasting.md
@@ -8,7 +8,6 @@ ONNX supports two types of broadcasting: multidirectional broadcasting and
unidirectional broadcasting. We will introduce these two types of broadcasting
respectively in the following sections.


## Multidirectional Broadcasting

In ONNX, a set of tensors are multidirectional broadcastable to the same shape
1 change: 0 additions & 1 deletion _doc/sphinxdoc/source/onnxmd/onnx_docs/CONTRIBUTING.md
@@ -90,7 +90,6 @@ After having installed mypy, you can run the type checks:
python setup.py typecheck
```


# Other developer documentation

* [How to implement ONNX backend (ONNX to something converter)](ImplementingAnOnnxBackend.md)
2 changes: 0 additions & 2 deletions _doc/sphinxdoc/source/onnxmd/onnx_docs/Changelog-ml.md
@@ -186,7 +186,6 @@ This version of the operator has been available since version 1 of the 'ai.onnx.
Any keys not present in the input dictionary will be zero in the output array.<br>
For example: if the ``string_vocabulary`` parameter is set to ``["a", "c", "b", "z"]``,
then an input of ``{"a": 4, "c": 8}`` will produce an output of ``[4, 8, 0, 0]``.
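As a plain-Python sketch of this mapping (illustration only; this is not the operator's implementation):

```python
# The described lookup: every vocabulary entry maps to its value in the
# input dictionary, and missing keys map to zero.
string_vocabulary = ["a", "c", "b", "z"]
inp = {"a": 4, "c": 8}
out = [inp.get(key, 0) for key in string_vocabulary]
assert out == [4, 8, 0, 0]
```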


#### Version

@@ -968,4 +967,3 @@ This version of the operator has been available since version 2 of the 'ai.onnx.
<dt><tt>T2</tt> : tensor(string), tensor(int64), tensor(float)</dt>
<dd>Output type is determined by the specified 'values_*' attribute.</dd>
</dl>

51 changes: 24 additions & 27 deletions _doc/sphinxdoc/source/onnxmd/onnx_docs/Hub.md
@@ -2,21 +2,21 @@

# ONNX Model Hub

The ONNX Model Hub is a simple and fast way to get started with state-of-the-art pre-trained
ONNX models from the [ONNX Model Zoo](https://github.com/onnx/models). It also gives researchers and model
developers the opportunity to share their pre-trained models with the broader community.

## Install
The ONNX Model Hub will be included in the `onnx` package from version 1.11 onwards.
To use the hub before the 1.11 release, please install from the weekly build:

```shell script
pip install -i https://test.pypi.org/simple/ onnx-weekly
```

## Basic usage
The ONNX Model Hub is capable of downloading, listing, and querying trained models from any git repository,
and defaults to the official [ONNX Model Zoo](https://github.com/onnx/models). In this section we demonstrate some of the basic functionality.

First please import the hub using:
```python
@@ -26,14 +26,13 @@ from onnx import hub
#### Downloading a model by name:

The `load` function will default to searching the model zoo for the latest model with a matching name,
downloading this model to a local cache, and loading the model into a `ModelProto`
object for use with the ONNX runtime.

```python
model = hub.load("resnet50")
```


#### Downloading from custom repositories:

Any repository with the proper structure can be an ONNX model hub. To download from other hubs,
@@ -47,7 +46,7 @@ model = hub.load("resnet50", repo='onnx/models:771185265efbdc049fb223bd68ab1aeb1

The model hub provides APIs for querying the model zoo to learn more about available models.
This does not download the models, but rather just returns information about models matching the given arguments.

```python
# List all models in the onnx/models:master repo
all_models = hub.list_models()
@@ -83,10 +82,9 @@ ModelInfo(
)
```


## Local Caching

The ONNX Model Hub locally caches downloaded models in a configurable location
so that subsequent calls to `hub.load` do not require a network connection.
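As a minimal sketch of this behavior (re-using the default model name from the examples above):

```python
from onnx import hub

# The first call downloads the model and stores it in the local cache.
model = hub.load("resnet50")

# A later call with the same name is resolved from the cache,
# so no network connection is required.
model_again = hub.load("resnet50")
```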

#### Default cache location
@@ -111,32 +109,32 @@ print(hub.get_dir())

#### Additional cache details

To clear the model cache one can simply delete the cache directory using a Python utility like `shutil` or `os`.
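For example, a minimal sketch of clearing the cache, assuming the cache directory reported by `hub.get_dir()`:

```python
import shutil

from onnx import hub

# Delete the whole model cache directory; the next hub.load will re-download.
shutil.rmtree(hub.get_dir(), ignore_errors=True)
```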
Furthermore, one can choose to override the cached model using the `force_reload` option:

```python
model = hub.load("resnet50", force_reload=True)
```

We include this flag for completeness, but note that models in the cache are disambiguated with SHA256 hashes, so
the `force_reload` flag is not necessary for normal use.
Finally, we note that the model cache directory structure will mirror the directory structure
specified by the `model_path` field of the manifest, but with file names disambiguated with model SHA256 hashes.

This way, the model cache is human readable, can disambiguate between multiple versions of models,
and can re-use cached models across different hubs if they have the same name and hash.

## Architecture

![ONNX Hub Architecture](images/onnx_hub_arch.svg)

The ONNX Hub consists of two main components, the client and the server.
The client code is currently included in the `onnx` package and can be pointed at a
server in the form of a hosted `ONNX_HUB_MANIFEST.json` within a GitHub repository,
such as [the one in the ONNX Model Zoo](https://github.com/onnx/models/blob/master/ONNX_HUB_MANIFEST.json).
This manifest file is a JSON document which lists all models and their metadata
and is designed to be programming-language agnostic. An example of a well-formed model manifest entry is as follows:

```json
{
"model": "BERT-Squad",
@@ -197,7 +195,7 @@ The ONNX Hub consists of two main components, the client and the server.
These important fields are:

- `model`: The name of the model used for querying
- `model_path`: The relative path of the model stored in Git LFS.
- `onnx_version`: The ONNX version of the model
- `opset_version`: The version of the opset. The client downloads the latest opset if left unspecified.
- `metadata/model_sha`: Optional model sha specification for increased download security
@@ -207,15 +205,14 @@ All other fields in the `metadata` field are optional for the client but provide

## Adding to the ONNX Model Hub


#### Contributing an official model

The simplest way to add a model to the official `onnx/models` model hub is to follow
[these guidelines](https://github.com/onnx/models/blob/master/contribute.md) to contribute your model. Once contributed,
ensure that your model has a markdown table in its `README.md`
([Example](https://github.com/onnx/models/tree/master/vision/classification/mobilenet)). The model hub
manifest generator will pull information from these markdown tables. To run the generator:

```shell script
git clone https://github.com/onnx/models.git
git lfs pull --include="*" --exclude=""
@@ -228,8 +225,8 @@ Once a new manifest is generated, submit it in a pull request to ``onnx/mode
#### Hosting your own ONNX Model Hub

To host your own model hub, add an `ONNX_HUB_MANIFEST.json` to the top level of your GitHub repository
([Example](https://github.com/onnx/models/blob/master/ONNX_HUB_MANIFEST.json)). At a minimum, your
manifest entries should include the fields mentioned in
the [Architecture Section](Hub.md#Architecture) of this document.
Once committed, check that you can download models
using the "Downloading from custom repositories" section of this doc.