I had to run the nvidia-tritonserver:21.10-py3 Docker image, which has onnxruntime==1.9 built in. When I started the server with the exported ONNX model, I got an error like:
```
...
Opset 3 is under development and support for this is limited. The operator schemas and or other functionality may change before next ONNX release and in this case ONNX Runtime will not guarantee backward compatibility. Current official support for domain ai.onnx.ml is till opset 2.
```
Then I checked the docs mentioned in the question and found the conflict.
So I reinstalled onnxruntime via pip in my model-conversion env and got the following results:
| onnx version for conversion | onnxruntime/onnxruntime-gpu version for conversion | tritonserver behavior |
|---|---|---|
| 1.11 | 1.11 | ...Opset 3 is under development and support for this is limited. ... |
| 1.10 | 1.11 | ...Opset 3 is under development and support for this is limited. ... |
| 1.10 | 1.10 | started normally |
| 1.11 | 1.10 | started normally |
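One way to read the table: every failing row was converted in an environment with onnxruntime==1.11, and every passing row with onnxruntime==1.10, regardless of the onnx version. A minimal sketch of that hypothesis (the version-to-opset mappings below are inferred from the table above, not an official compatibility list):

```python
# Illustrative sketch only: the onnxruntime version installed in the
# conversion environment (rather than the onnx version) appears to
# determine which ai.onnx.ml opset the exported model declares.
# Both mappings below are assumptions inferred from the table above.

# ai.onnx.ml opset the exported model declares, by conversion-env onnxruntime
DECLARED_ML_OPSET = {"1.10": 2, "1.11": 3}

# highest ai.onnx.ml opset the server's built-in onnxruntime==1.9 accepts
SERVER_MAX_ML_OPSET = 2

def server_starts(conversion_ort_version: str) -> bool:
    """True if tritonserver (onnxruntime==1.9) should load the model."""
    return DECLARED_ML_OPSET[conversion_ort_version] <= SERVER_MAX_ML_OPSET

for version in ("1.10", "1.11"):
    outcome = "started normally" if server_starts(version) else "Opset 3 error"
    print(version, "->", outcome)  # 1.10 -> started normally, 1.11 -> Opset 3 error
```

This reproduces all four rows: only the conversion-env onnxruntime version changes the outcome.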
xiaoFine changed the title from "A mismatch between Onnx Version & ML Opset Version" to "A conflict doc abount compatibility between Onnx Version & ML Opset Version" on May 9, 2024
This simply means ONNX Runtime takes that ONNX version as a dependency but has not implemented full support for the up-to-date opsets. Is running an image with a newer ONNX Runtime installed possible?
For now I have to run this version due to some hardware/driver problems.
I updated my test table and am just more confused about why onnxruntime==1.10 works with both onnx==1.10 and onnx==1.11.
Ask a Question

Question

There is a conflict in the official docs about the ML opset version supported by onnx==1.11:

- in ONNX Runtime compatibility - ONNX opset support: onnx==1.11 supports ml-opset=2
- in ONNX Versioning: onnx==1.11 supports ml-opset=3