SparseInst Model exportation to ONNX #21
@ayoolaolafenwa Hi, thank you for your interest. Theoretically it should be possible to export to ONNX. What error messages did you get when exporting SparseInst?
I exported it and got this error message:
@ayoolaolafenwa Please try it again; it should be supported now.
Thank you! I will test it!
Hi @jinfagang, thank you for your excellent work. When I try to export the ONNX file, I still get "IndexError: too many indices for tensor of dimension 3", with the command exactly as in the readme. Any advice?
@wangyidong3 If the output says the onnx was saved to file, then ignore the last error. That is normal, since I didn't fully handle the logic verification step.
Did you have any luck running inference on the exported onnx model? =)
@wangyidong3 @ayoolaolafenwa Could anyone solve this issue? I am getting the same error at: `images = [x["image"].to(self.device) for x in batched_inputs]`
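For context, the traceback line above is the detectron2-style input handling that yolov7-d2 appears to follow, where the forward pass receives a list of dicts, each carrying a CHW `image` tensor. The sketch below shows what a well-formed input looks like; the image size and the extra `height`/`width` keys are illustrative assumptions, not confirmed by this thread.

```python
import torch

# Sketch (assumption): a detectron2-style batched input is a list of dicts,
# each with an "image" tensor in CHW layout. Passing a bare tensor or an
# HWC array instead is a common way to trigger indexing errors in this line.
image = torch.randint(0, 256, (3, 640, 640), dtype=torch.uint8).float()  # CHW
batched_inputs = [{"image": image, "height": 640, "width": 640}]

device = torch.device("cpu")
# The line quoted in the error above, run against a well-formed input:
images = [x["image"].to(device) for x in batched_inputs]
```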
Hi, the ONNX exported from yolov7-d2 should already include preprocessing, which means you should not need to permute HWC to CHW, and you do not need to do normalization; it is inside the model already.
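Based on the maintainer's note that preprocessing is baked into the exported graph, inference can be sketched roughly as below with onnxruntime: feed the raw HWC image with only a batch dimension added. The file name `sparseinst.onnx`, the float32 cast, and the batching are assumptions, not confirmed by this thread.

```python
import numpy as np

def run_exported_model(onnx_path, image_hwc):
    """Sketch: run a yolov7-d2 exported ONNX model with onnxruntime.

    Assumes preprocessing (HWC->CHW permute, normalization) is inside the
    exported graph, so the raw image array is fed as-is.
    """
    import onnxruntime as ort  # imported here so the sketch stays optional
    sess = ort.InferenceSession(onnx_path, providers=["CPUExecutionProvider"])
    input_name = sess.get_inputs()[0].name
    # Only add a batch dimension; no transpose, no mean/std subtraction.
    batch = image_hwc[None, ...].astype(np.float32)
    return sess.run(None, {input_name: batch})

# A dummy 640x640 RGB frame in HWC layout, as cv2.imread would return it.
dummy = np.random.randint(0, 256, (640, 640, 3), dtype=np.uint8)
# outputs = run_exported_model("sparseinst.onnx", dummy)  # needs the exported file
```

If the exported graph instead expects a fixed input name or dtype, `sess.get_inputs()[0]` also exposes `.shape` and `.type` for checking before running.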
Thank you so much for your work @jinfagang. I have tested yolov7 and realized that SparseInst models cannot be converted to ONNX. Is the ONNX export code compatible with exporting SparseInst models?