test_tpat.py error #5
Comments
Maybe you can try to run
This really helps to fix the problem, but when the program runs, another error occurs: trt cross_check output False
run
Thank you. I will try that later.
Traceback (most recent call last):
File "test_tpat.py", line 3860, in <module>
test_abs()
File "test_tpat.py", line 360, in test_abs
op_expect(node, inputs=[x], outputs=[y], op_type=op_type, op_name=op_name)
File "test_tpat.py", line 346, in op_expect
verify_with_ort_with_trt(model, inputs, op_name, np_result=np_result)
File "test_tpat.py", line 251, in verify_with_ort_with_trt
ort_result = get_onnxruntime_output(model, inputs)
File "test_tpat.py", line 225, in get_onnxruntime_output
rep = onnxruntime.backend.prepare(model, "CPU")
File "/usr/local/lib/python3.6/dist-packages/onnxruntime/backend/backend.py", line 138, in prepare
return cls.prepare(bin, device, **kwargs)
File "/usr/local/lib/python3.6/dist-packages/onnxruntime/backend/backend.py", line 114, in prepare
inf = InferenceSession(model, sess_options=options, providers=providers)
File "/usr/local/lib/python3.6/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 335, in __init__
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "/usr/local/lib/python3.6/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 370, in _create_inference_session
sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Failed to load model with error: /onnxruntime_src/onnxruntime/core/graph/model_load_utils.h:47 void onnxruntime::model_load_utils::ValidateOpsetForDomain(const std::unordered_map<std::basic_string, int>&, const onnxruntime::logging::Logger&, bool, const string&, int) ONNX Runtime only guarantees support for models stamped with official released onnx opset versions. Opset 16 is under development and support for this is limited. The operator schemas and or other functionality may change before next ONNX release and in this case ONNX Runtime will not guarantee backward compatibility. Current official support for domain ai.onnx is till opset 15.