Open Neural Network Exchange

Open Neural Network Exchange (ONNX) is the first step toward an open ecosystem that empowers AI developers to choose the right tools as their project evolves. ONNX provides an open source format for AI models. It defines an extensible computation graph model, as well as definitions of built-in operators and standard data types. Initially we focus on the capabilities needed for inferencing (evaluation).

Caffe2, PyTorch, Microsoft Cognitive Toolkit, Apache MXNet, and other tools are developing ONNX support. Enabling interoperability between frameworks and streamlining the path from research to production will increase the speed of innovation in the AI community. We are at an early stage and invite the community to submit feedback and help us evolve ONNX further.

Use ONNX

Start experimenting today.

Learn about ONNX spec

Check out ONNX design choices and internals.

Tools

Programming utilities for working with ONNX graphs.

Contribute

ONNX is a community project. We encourage you to join the effort and contribute feedback, ideas, and code. You can join one of the working groups and help shape the future of ONNX.

Check out our contribution guide and call for contributions to get started.

Discuss

We encourage you to open Issues, or use Gitter for real-time discussion: https://gitter.im/onnx/Lobby

Follow Us

Stay up to date with the latest ONNX news. [Facebook] [Twitter]

Installation

Binaries

A binary build of ONNX is available from conda-forge:

conda install -c conda-forge onnx

Source

You will need protobuf and numpy installed to build ONNX. One easy way to get these dependencies is via Anaconda:

# Use conda-forge protobuf, as default doesn't come with protoc
conda install -c conda-forge protobuf numpy

You can then install ONNX from PyPI (note: set the environment variable ONNX_ML=1 for onnx-ml):

pip install onnx

You can also build and install ONNX locally from source code:

git clone https://github.com/onnx/onnx.git
cd onnx
git submodule update --init --recursive
python setup.py install

Note: When installing in a non-Anaconda environment, make sure to install the Protobuf compiler before running the pip installation of onnx. For example, on Ubuntu:

sudo apt-get install protobuf-compiler libprotoc-dev
pip install onnx

After installation, run

python -c "import onnx"

to verify it works. Note that this command does not work from a source checkout directory; in this case you'll see:

ModuleNotFoundError: No module named 'onnx.onnx_cpp2py_export'

Change into another directory to fix this error.

Testing

ONNX uses pytest as its test driver. To run the tests, you first need to install pytest along with the test dependencies:

pip install pytest-cov nbval

After installing pytest, do

pytest

to run tests.

Development

Check out the contributor guide for instructions.

License

MIT License

Code of Conduct

ONNX Open Source Code of Conduct