Open Neural Network Exchange (ONNX) is the first step toward an open ecosystem that empowers AI developers to choose the right tools as their project evolves. ONNX provides an open source format for AI models. It defines an extensible computation graph model, as well as definitions of built-in operators and standard data types. Initially we focus on the capabilities needed for inferencing (evaluation).
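To make the graph model concrete, here is a minimal sketch (not taken from the spec; the tensor and graph names are made up for illustration) that uses the onnx Python helpers to build and validate a one-node graph:

```python
import onnx
from onnx import helper, TensorProto

# Declare the graph input and output as float tensors of shape [1, 3].
X = helper.make_tensor_value_info("X", TensorProto.FLOAT, [1, 3])
Y = helper.make_tensor_value_info("Y", TensorProto.FLOAT, [1, 3])

# A single node applying the built-in Relu operator.
node = helper.make_node("Relu", inputs=["X"], outputs=["Y"])

# Assemble the graph, wrap it in a model, and validate it against the spec.
graph = helper.make_graph([node], "single_relu", [X], [Y])
model = helper.make_model(graph, producer_name="onnx-example")
onnx.checker.check_model(model)
print(helper.printable_graph(model.graph))
```

The checker verifies that the model conforms to the ONNX intermediate representation and that every node uses a registered operator.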
Caffe2, PyTorch, Microsoft Cognitive Toolkit, Apache MXNet, and other tools are developing ONNX support. Enabling interoperability between different frameworks and streamlining the path from research to production will increase the speed of innovation in the AI community. We are at an early stage and invite the community to submit feedback and help us further evolve ONNX.
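As one illustrative path from research to production (a rough sketch assuming a PyTorch build with ONNX export support; the model and file name are placeholders), a PyTorch module can be traced and exported to ONNX, then loaded by any ONNX-aware tool:

```python
import torch
import torch.nn as nn
import onnx

# A tiny placeholder model; any traceable PyTorch module works the same way.
model = nn.Linear(3, 2)
dummy_input = torch.randn(1, 3)

# Export to the ONNX format by tracing the module with the dummy input.
torch.onnx.export(model, dummy_input, "linear.onnx")

# The exported file can now be consumed by other ONNX-aware frameworks.
onnx_model = onnx.load("linear.onnx")
onnx.checker.check_model(onnx_model)
```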
Start experimenting today:
Check out ONNX design choices and internals:
- Overview
- ONNX intermediate representation spec
- Versioning principles of the spec
- Operators documentation
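The built-in operators can also be inspected programmatically; for example (a small sketch, assuming the onnx Python package is installed), the registered schema for an operator can be queried through onnx.defs:

```python
import onnx.defs

# Look up the schema registered for the Relu operator in the default domain.
schema = onnx.defs.get_schema("Relu")
print(schema.name)           # "Relu"
print(schema.since_version)  # opset version in which this schema was last changed
print(schema.doc)            # the operator's documentation string
```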
ONNX is a community project. We encourage you to join the effort and contribute feedback, ideas, and code. Check out our contribution guide to get started.
Stay up to date with the latest ONNX news. [Facebook] [Twitter]
A binary build of ONNX is available from Conda, in conda-forge:
conda install -c conda-forge onnx
Docker images (CPU-only and GPU versions) with ONNX, PyTorch, and Caffe2 are available for quickly trying tutorials that use ONNX. To quickly try the CPU-only version, simply run:
docker run -it --rm onnx/onnx-docker:cpu /bin/bash
To run the version with GPU support, you will need nvidia-docker. Execute:
nvidia-docker run -it --rm onnx/onnx-docker:gpu /bin/bash
You will need protobuf and numpy installed to build ONNX. One easy way to get these dependencies is via Anaconda:
# Use conda-forge protobuf, as defaults doesn't come with protoc
conda install -c conda-forge protobuf numpy
You can then install ONNX from PyPI (note: set the environment variable ONNX_ML=1 for onnx-ml):
pip install onnx
Note: When installing in a non-Anaconda environment, make sure to install the Protobuf compiler before running the pip installation of onnx. For example, on Ubuntu:
sudo apt-get install protobuf-compiler libprotoc-dev
pip install onnx
After installation, run
python -c "import onnx"
to verify it works. Note that this command does not work from a source checkout directory; in this case you'll see:
ModuleNotFoundError: No module named 'onnx.onnx_cpp2py_export'
Change into another directory to fix this error.
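As an additional sanity check (a minimal sketch; the values printed depend on the release you installed), you can query the installed version and the highest opset it knows about:

```python
import onnx
import onnx.defs

print(onnx.__version__)                # installed onnx release
print(onnx.defs.onnx_opset_version())  # highest ONNX opset version supported by this build
```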
ONNX uses pytest as its test driver. To run the tests, first install pytest and the plugins the test suite uses:
pip install pytest-cov nbval
After installing pytest, do
pytest
to run tests.
Check out the contributor guide for instructions.