
test_tpat.py error #5

Closed
GeneralJing opened this issue Apr 24, 2022 · 4 comments

@GeneralJing

Traceback (most recent call last):
File "test_tpat.py", line 3860, in <module>
test_abs()
File "test_tpat.py", line 360, in test_abs
op_expect(node, inputs=[x], outputs=[y], op_type=op_type, op_name=op_name)
File "test_tpat.py", line 346, in op_expect
verify_with_ort_with_trt(model, inputs, op_name, np_result=np_result)
File "test_tpat.py", line 251, in verify_with_ort_with_trt
ort_result = get_onnxruntime_output(model, inputs)
File "test_tpat.py", line 225, in get_onnxruntime_output
rep = onnxruntime.backend.prepare(model, "CPU")
File "/usr/local/lib/python3.6/dist-packages/onnxruntime/backend/backend.py", line 138, in prepare
return cls.prepare(bin, device, **kwargs)
File "/usr/local/lib/python3.6/dist-packages/onnxruntime/backend/backend.py", line 114, in prepare
inf = InferenceSession(model, sess_options=options, providers=providers)
File "/usr/local/lib/python3.6/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 335, in __init__
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "/usr/local/lib/python3.6/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 370, in _create_inference_session
sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Failed to load model with error: /onnxruntime_src/onnxruntime/core/graph/model_load_utils.h:47 void onnxruntime::model_load_utils::ValidateOpsetForDomain(const std::unordered_map<std::basic_string, int>&, const onnxruntime::logging::Logger&, bool, const string&, int) ONNX Runtime only guarantees support for models stamped with official released onnx opset versions. Opset 16 is under development and support for this is limited. The operator schemas and or other functionality may change before next ONNX release and in this case ONNX Runtime will not guarantee backward compatibility. Current official support for domain ai.onnx is till opset 15.

@panlinchao

Maybe you can try running pip install onnx==1.10; it seems this can help fix the problem.

@GeneralJing
Author

> Maybe you can try running pip install onnx==1.10; it seems this can help fix the problem.

That really helped fix the problem, but when the program runs, another error occurs:

trt cross_check output False
Traceback (most recent call last):
File "test_tpat.py", line 3860, in <module>
test_abs()
File "test_tpat.py", line 360, in test_abs
op_expect(node, inputs=[x], outputs=[y], op_type=op_type, op_name=op_name)
File "test_tpat.py", line 346, in op_expect
verify_with_ort_with_trt(model, inputs, op_name, np_result=np_result)
File "test_tpat.py", line 300, in verify_with_ort_with_trt
assert ret, "result check False"
AssertionError: result check False

@buptqq
Collaborator

buptqq commented Apr 27, 2022


Run pip install onnxruntime==1.9.0 and pip install onnx==1.10.0; we have updated the Dockerfile accordingly. And you can refer to https://github.com/Tencent/TPAT/tree/main/examples if you use TensorFlow.
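For copy-paste convenience, the version pins from this comment can be captured in a requirements.txt (versions as stated above; check the repository's Dockerfile in case they have since moved on):

```
onnx==1.10.0
onnxruntime==1.9.0
```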

@GeneralJing
Author

Thank you. I will try that later.

@buptqq closed this as completed Aug 26, 2022