Demonstrates how to use the ONNX importer API in the Intel OpenVINO toolkit. This API allows users to load an ONNX model and run inference with the OpenVINO Inference Engine.
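A minimal sketch of what this looks like in practice, assuming the 2020-era nGraph ONNX importer (`ngraph::onnx_import::import_onnx_model`) and the classic Inference Engine API; header locations and exact signatures vary between OpenVINO releases, and the model path and device name below are placeholders, not taken from this repository.

```cpp
// Sketch: import an ONNX model with the nGraph ONNX importer and run it
// through the OpenVINO Inference Engine (2020.x-era API, paths may differ).
#include <fstream>
#include <iostream>
#include <memory>

#include <inference_engine.hpp>
#include <ngraph/frontend/onnx_import/onnx.hpp>  // header location differs across releases

int main(int argc, char* argv[]) {
    if (argc < 2) {
        std::cerr << "Usage: " << argv[0] << " <model.onnx>\n";
        return 1;
    }

    // 1. Import the ONNX model into an ngraph::Function.
    std::ifstream model_stream(argv[1], std::ios::binary);
    std::shared_ptr<ngraph::Function> function =
        ngraph::onnx_import::import_onnx_model(model_stream);

    // 2. Wrap the function in a CNNNetwork and load it onto a device.
    InferenceEngine::Core core;
    InferenceEngine::CNNNetwork network(function);
    InferenceEngine::ExecutableNetwork exec_network =
        core.LoadNetwork(network, "CPU");

    // 3. Create an inference request, fill the input blob, and run inference.
    InferenceEngine::InferRequest request = exec_network.CreateInferRequest();
    const std::string input_name = network.getInputsInfo().begin()->first;
    InferenceEngine::Blob::Ptr input = request.GetBlob(input_name);
    // ... fill `input` with preprocessed data here ...
    request.Infer();

    const std::string output_name = network.getOutputsInfo().begin()->first;
    InferenceEngine::Blob::Ptr output = request.GetBlob(output_name);
    std::cout << "Inference completed; output blob has "
              << output->size() << " elements.\n";
    return 0;
}
```

Note that newer OpenVINO releases can also read ONNX files directly through `Core::ReadNetwork`, but the point of the importer API shown above is that the ONNX graph is converted into an `ngraph::Function` first, so it can be inspected or modified before being loaded onto a device.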