onnx result and torch result don't match #6
Comments
I think the simplification doesn't affect the results.
Yeah, it's a pleasure to share my onnx models, including pfe and rpn. But how can I send them to you?
Your email address?
You can upload your models to Google Drive.
Hi, Carkusl:
Thanks a lot for your work.
I used the same tools and the same config to convert the torch model to ONNX, producing pfe_onnx and rpn_onnx. Before simplifying the ONNX models, I ran them with onnxruntime to check whether their results are consistent with the torch model, but they are not.
For both pfe and rpn, the ONNX inference results differ significantly from the torch results.
I have not run the simplification step yet. Could the simplification account for such a large difference in inference results?
Do you have any idea about the problem?
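For reference, the parity check described above can be sketched as follows: run the torch model and the ONNX model on the same random input and compare the outputs element-wise with a tolerance. This is a minimal sketch, not the author's actual script; the model filename `pfe.onnx` and the variable `pfe_model` are assumptions, and the onnxruntime/torch calls are left as comments since they need the model files to run. Only the tolerance-check helper is exercised here, on synthetic data.

```python
import numpy as np

def outputs_match(torch_out, onnx_out, rtol=1e-3, atol=1e-5):
    """Compare two model outputs; return (ok, max_abs_diff)."""
    torch_out = np.asarray(torch_out, dtype=np.float32)
    onnx_out = np.asarray(onnx_out, dtype=np.float32)
    max_abs_diff = float(np.max(np.abs(torch_out - onnx_out)))
    ok = bool(np.allclose(torch_out, onnx_out, rtol=rtol, atol=atol))
    return ok, max_abs_diff

# With the exported model (not runnable here without the files), the check
# would look roughly like this -- "pfe.onnx" and pfe_model are hypothetical:
#
#   import torch, onnxruntime as ort
#   sess = ort.InferenceSession("pfe.onnx")
#   x = np.random.randn(*input_shape).astype(np.float32)
#   onnx_out = sess.run(None, {sess.get_inputs()[0].name: x})[0]
#   with torch.no_grad():
#       torch_out = pfe_model.eval()(torch.from_numpy(x)).numpy()
#   print(outputs_match(torch_out, onnx_out))

# Demo of the comparison helper on synthetic data:
a = np.linspace(0.0, 1.0, 10, dtype=np.float32)
b = a + 1e-6          # tiny numerical noise -> should match
c = a + 1.0           # genuinely different output -> should not match
print(outputs_match(a, b))
print(outputs_match(a, c))
```

Note the `model.eval()` call in the commented-out section: comparing a torch model left in training mode (dropout/batchnorm active) against an ONNX export is a common source of mismatches like the one reported here.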