Script to convert RAFT models #387
Comments
I can't understand your intent, because there is no explanation at all of what would be wrong with the GridSample op, which is available with opset >= 16.
Maybe I can rephrase the question this way: I have observed that your models are much faster than my own converted ones. Did you do any model optimization?
All of the models I have committed to this zoo have been specially optimized, over all five years. So for RAFT, special optimization work was required to run on the old runtime environment, which is now more than two years out of date. Essentially, I believe that as of 2024 the models would run at high speed even without that special optimization work. However, I am not really interested in optimizing an architecture that is several years old, since RAFT is designed to be quite heavy operationally in the architecture itself.
Issue Type
Documentation Feature Request
OS
Other
OS architecture
Other
Programming Language
Other
Framework
PyTorch
Model name and Weights/Checkpoints URL
252_RAFT
Description
Hello again :)
I have been trying to reproduce your ONNX files myself; however, I get different results.
When I export my model with `torch.onnx.export`, I have to use opset version 16, and this adds `GridSample` operations to the ONNX graph. In your ONNX files, however, no such operations are included; I see that they have been replaced by `GatherElements`.
So my question is: how can I create models similar to yours?
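For intuition on that substitution, here is a minimal NumPy sketch (all names are illustrative, and it implements nearest-neighbor sampling only) of how a `GridSample` lookup can be rewritten as a plain gather over flattened indices, which is the same idea behind replacing it with `GatherElements`. A real bilinear `GridSample` would gather the four neighboring pixels and blend them by their fractional weights:

```python
import numpy as np

def grid_sample_nearest(img, grid):
    """Nearest-neighbor grid sampling expressed as a gather.

    img:  (H, W) array
    grid: (Hout, Wout, 2) array of normalized (x, y) coords in [-1, 1]
    """
    H, W = img.shape
    # Map normalized coordinates to integer pixel indices
    # (align_corners=True convention), clipped to the image border.
    x = np.clip(((grid[..., 0] + 1) * (W - 1) / 2).round().astype(np.int64), 0, W - 1)
    y = np.clip(((grid[..., 1] + 1) * (H - 1) / 2).round().astype(np.int64), 0, H - 1)
    # The sampling itself is just a gather over flattened indices --
    # the step that a GatherElements node can perform in the ONNX graph.
    return np.take(img.reshape(-1), y * W + x)

# Usage: an identity grid returns the image unchanged.
img = np.arange(4.0).reshape(2, 2)
xs = np.linspace(-1, 1, 2)
grid = np.stack(np.meshgrid(xs, xs), axis=-1)
assert np.array_equal(grid_sample_nearest(img, grid), img)
```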
Relevant Log Output
No response
URL or source code for simple inference testing code
No response