
ONNX Inference failed. Non-zero status code returned while running Reshape node. #12

Open
inSight-mk1 opened this issue Dec 17, 2021 · 2 comments

Comments

@inSight-mk1

I exported yolov5l-xs-1.pt to ONNX format using export.py in this repo, then ran detection. The ONNX model was loaded successfully, but it failed when running this code:
pred = torch.tensor(self.session.run([self.session.get_outputs()[0].name], {self.session.get_inputs()[0].name: img}))
It threw:
onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Non-zero status code returned while running Reshape node. Name:'Reshape_222' Status Message: D:\a\_work\1\s\onnxruntime\core/providers/cpu/tensor/reshape_helper.h:42 onnxruntime::ReshapeHelper::ReshapeHelper gsl::narrow_cast<int64_t>(input_shape.Size()) == size was false. The input tensor cannot be reshaped to the requested shape. Input shape:{756,1,512}, requested shape:{756,12096,32}

The same code runs yolov5's ONNX model successfully.
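For reference, here is a minimal, self-contained sketch of the inference path that hits the error; the model filename, the 640x640 input size, and the dummy zero image are illustrative placeholders, not the actual detect code.

```python
# Minimal sketch of the ONNX inference call above (placeholders, not the repo's detect.py).
import numpy as np
import onnxruntime
import torch

# "yolov5l-xs-1.onnx" is an assumed path to the exported model
session = onnxruntime.InferenceSession("yolov5l-xs-1.onnx", providers=["CPUExecutionProvider"])

# Dummy preprocessed input: float32, NCHW, values in [0, 1] (real code would letterbox/normalize an image)
img = np.zeros((1, 3, 640, 640), dtype=np.float32)

# The RUNTIME_EXCEPTION above is raised inside session.run(), at node Reshape_222
outputs = session.run(
    [session.get_outputs()[0].name],
    {session.get_inputs()[0].name: img},
)
pred = torch.tensor(outputs[0])
```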

Any plan to add ONNX export support for yolov5-TPH? Thank you!

@inSight-mk1
Author

It also failed when converting to pb format. It threw:
TensorFlow saved_model: export failure: name 'C3TR' is not defined

@cv516Buaa
Owner

Sorry, we have not used ONNX to export the weights of yolov5. I will list it as one of the future works.
