Open standard for machine learning interoperability

Open Neural Network Exchange (ONNX) is an open ecosystem that empowers AI developers to choose the right tools as their project evolves. ONNX provides an open source format for AI models, both deep learning and traditional ML. It defines an extensible computation graph model, as well as definitions of built-in operators and standard data types. Currently we focus on the capabilities needed for inferencing (scoring).
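
To make the graph model concrete, here is a minimal sketch (assuming the Python onnx package is installed, see Installation below) that builds and validates a one-node model with the onnx.helper utilities; the operator, tensor names, and shapes are purely illustrative:

import onnx
from onnx import helper, TensorProto

# Declare graph inputs and outputs: name, element type, and shape.
X = helper.make_tensor_value_info("X", TensorProto.FLOAT, [1, 4])
Y = helper.make_tensor_value_info("Y", TensorProto.FLOAT, [1, 4])

# A node applies a built-in operator ("Relu") to named inputs and outputs.
relu = helper.make_node("Relu", inputs=["X"], outputs=["Y"])

# Nodes, inputs, and outputs compose a graph; the graph is wrapped in a model.
graph = helper.make_graph([relu], "single-relu", [X], [Y])
model = helper.make_model(graph, producer_name="readme-example")

onnx.checker.check_model(model)             # validate against the ONNX spec
print(helper.printable_graph(model.graph))  # human-readable graph dump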

ONNX is widely supported and can be found in many frameworks, tools, and hardware. Enabling interoperability between different frameworks and streamlining the path from research to production helps increase the speed of innovation in the AI community. We invite the community to join us and further evolve ONNX.

Use ONNX

Learn about the ONNX spec

Programming utilities for working with ONNX Graphs
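
As a rough illustration of those utilities (a sketch only; model.onnx is a placeholder path for any serialized ONNX model):

import onnx
from onnx import shape_inference

# Load a serialized model, validate it, and infer intermediate tensor shapes.
model = onnx.load("model.onnx")
onnx.checker.check_model(model)
inferred = shape_inference.infer_shapes(model)
onnx.save(inferred, "model-inferred.onnx")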

Contribute

ONNX is a community project. We encourage you to join the effort and contribute feedback, ideas, and code. You can participate in the SIGs and Working Groups to shape the future of ONNX.

Check out our contribution guide to get started.

If you think some operator should be added to ONNX specification, please read this document.

Discuss

We encourage you to open Issues, or join the Gitter chat for more real-time discussion: https://gitter.im/onnx/Lobby

Follow Us

Stay up to date with the latest ONNX news. [Facebook] [Twitter]

Installation

Binaries

A binary build of ONNX is available from Conda, in conda-forge:

conda install -c conda-forge onnx

Source

Linux and MacOS

You will need Protobuf and NumPy installed in order to build ONNX. One easy way to get these dependencies is via Anaconda:

# Use conda-forge protobuf, as default doesn't come with protoc
conda install -c conda-forge protobuf numpy

You can then install ONNX from PyPI (Note: Set the environment variable ONNX_ML=1 for onnx-ml):

pip install onnx

Alternatively, you can also build and install ONNX locally from source code:

git clone https://github.com/onnx/onnx.git
cd onnx
git submodule update --init --recursive
python setup.py install

Note: When installing in a non-Anaconda environment, make sure to install the Protobuf compiler before running the pip installation of onnx. For example, on Ubuntu:

sudo apt-get install protobuf-compiler libprotoc-dev
pip install onnx

Windows

If you are building ONNX from source on Windows, it is recommended that you also build Protobuf locally as a static library. The version distributed with conda-forge is a DLL, which conflicts with ONNX's expectation of a static library.

Build Protobuf and ONNX on Windows

Step 1: Build Protobuf locally

git clone https://github.com/protocolbuffers/protobuf.git
cd protobuf
git checkout 3.9.x
cd cmake
# Explicitly set -Dprotobuf_MSVC_STATIC_RUNTIME=OFF to make sure protobuf does not statically link to runtime library
cmake -G "Visual Studio 15 2017 Win64" -Dprotobuf_MSVC_STATIC_RUNTIME=OFF -Dprotobuf_BUILD_TESTS=OFF -Dprotobuf_BUILD_EXAMPLES=OFF -DCMAKE_INSTALL_PREFIX=<protobuf_install_dir>
msbuild protobuf.sln /m /p:Configuration=Release
msbuild INSTALL.vcxproj /p:Configuration=Release

Step 2: Build ONNX

# Get ONNX
git clone https://github.com/onnx/onnx.git
cd onnx
git submodule update --init --recursive

# Set environment variables to find protobuf and turn off static linking of ONNX to runtime library.
# A better option is to add it to the user/system PATH so this step only needs to be performed once.
# For more details check https://docs.microsoft.com/en-us/cpp/build/reference/md-mt-ld-use-run-time-library?view=vs-2017
set PATH=<protobuf_install_dir>\bin;%PATH%
set USE_MSVC_STATIC_RUNTIME=0

# Optional: Set environment variable `ONNX_ML=1` for onnx-ml

# Build ONNX
python setup.py install

If you would prefer to use Protobuf from conda-forge instead of building Protobuf from source, you can use the following instructions.

Build ONNX on Windows with Anaconda

# Use conda-forge protobuf
conda install -c conda-forge numpy libprotobuf=3.11.3 protobuf

# Get ONNX
git clone https://github.com/onnx/onnx.git
cd onnx
git submodule update --init --recursive

# Set environment variable for ONNX to use protobuf shared lib
set USE_MSVC_STATIC_RUNTIME=0
set CMAKE_ARGS="-DONNX_USE_PROTOBUF_SHARED_LIBS=ON -DProtobuf_USE_STATIC_LIBS=OFF -DONNX_USE_LITE_PROTO=ON"

# Build ONNX
# Optional: Set environment variable `ONNX_ML=1` for onnx-ml

python setup.py install

Build ONNX on ARM 64

If you are building ONNX on an ARM 64 device, make sure the dependencies below are installed first.

pip install cython protobuf numpy
sudo apt-get install libprotobuf-dev protobuf-compiler
pip install onnx

Verify Installation

After installation, run

python -c "import onnx"

to verify it works.
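
To also confirm which version was installed (an optional extra check, not part of the official instructions), you can print the package version:

python -c "import onnx; print(onnx.__version__)"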

Common Errors

Environment variables: USE_MSVC_STATIC_RUNTIME (should be 1 or 0, not ON or OFF)

CMake variables: ONNX_USE_PROTOBUF_SHARED_LIBS, Protobuf_USE_STATIC_LIBS

If ONNX_USE_PROTOBUF_SHARED_LIBS is ON then Protobuf_USE_STATIC_LIBS must be OFF and USE_MSVC_STATIC_RUNTIME must be 0.
If ONNX_USE_PROTOBUF_SHARED_LIBS is OFF then Protobuf_USE_STATIC_LIBS must be ON and USE_MSVC_STATIC_RUNTIME can be 1 or 0.

Note that the import onnx command does not work from the source checkout directory; in this case you'll see ModuleNotFoundError: No module named 'onnx.onnx_cpp2py_export'. Change into another directory to fix this error.

Building ONNX on Ubuntu works well, but on CentOS/RHEL and other ManyLinux systems, you might need to open the CMakeLists file and replace all instances of /lib with /lib64.

If you want to build ONNX in Debug mode, remember to set the environment variable DEBUG=1. For debug versions of the dependencies, you need to open the CMakeLists file and append the letter d to the end of the package name lines. For example, NAMES protobuf-lite would become NAMES protobuf-lited.

You can also use the onnx-dev docker image for a Linux-based installation without having to worry about dependency versioning.

Testing

ONNX uses pytest as its test driver. In order to run the tests, you will first need to install pytest:

pip install pytest nbval

After installing pytest, use the following command to run tests.

pytest
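
If you only want to run a subset of the tests, the standard pytest selection options apply; for example (an illustrative keyword, not an official target):

pytest -k "shape_inference"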

Development

Check out the contributor guide for instructions.

License

MIT License

Code of Conduct

ONNX Open Source Code of Conduct
