# How to convert DINOv2 to ONNX? #216
Changing this bit:

```python
patch_pos_embed = nn.functional.interpolate(
    patch_pos_embed.reshape(1, int(math.sqrt(N)), int(math.sqrt(N)), dim).permute(0, 3, 1, 2),
    scale_factor=(float(w0 / math.sqrt(N)), float(h0 / math.sqrt(N))),
    mode="bicubic",
)
```
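Assuming the standard DINOv2 patch size of 14 and its 518×518 pretraining resolution, the scale factors passed to `interpolate` above can be sketched in plain Python. The helper name `pos_embed_scale_factors` is mine, not from the repository, and the computation is simplified (some versions of the repository also nudge `w0`/`h0` slightly to avoid rounding issues):

```python
import math

# DINOv2 uses 14x14 patches; at the 518x518 pretraining resolution this
# gives a 37x37 grid, so N = 1369 pretrained patch positions (the class
# token is handled separately and is not interpolated).
PATCH_SIZE = 14
N = (518 // PATCH_SIZE) ** 2  # 1369

def pos_embed_scale_factors(height, width, patch_size=PATCH_SIZE, n_pretrained=N):
    """Scale factors as used in the interpolate() call above (simplified)."""
    w0 = width // patch_size          # target patch-grid width
    h0 = height // patch_size         # target patch-grid height
    side = math.sqrt(n_pretrained)    # pretrained grid side, 37.0
    return float(w0 / side), float(h0 / side)
```

At the pretraining resolution both factors are exactly 1.0, which is why the workaround behaves identically to the original code there.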
Thanks @seddonm1 for the workaround, it works for me with batch size 1. However, I'm trying to use a dynamic batch size: I'm able to convert the model to ONNX with a dynamic batch size, but when I load it I get an error. Has anyone managed this?
I've exported class token + patch tokens: #167 (comment)
I've tried to export to ONNX using dynamic input and output shapes. The model exports and seems fine, but the ONNX model throws an exception during inference whenever the input shape differs from the sample input fed during export.
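For reference, a dynamic export like the one described above typically passes a `dynamic_axes` mapping to `torch.onnx.export`. A minimal sketch of what that mapping might look like follows; the tensor names `input`/`output` are assumptions and must match the `input_names`/`output_names` passed to the export call:

```python
# Hypothetical axis-name mapping for a torch.onnx.export(..., dynamic_axes=...)
# call with dynamic batch, height and width. torch records these names in the
# graph, but (as this thread shows) interpolation nodes may still bake static
# sizes into the exported model.
dynamic_axes = {
    "input": {0: "batch", 2: "height", 3: "width"},  # NCHW image tensor
    "output": {0: "batch"},
}
```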
@barbolo, hello, I got the same error. Have you figured out how to solve this problem?
@WulongGuo no, I haven't, and I'm not sure there is a solution. I've seen other ViT-like repositories with downloadable ONNX/OpenVINO models, and all of them have fixed input shapes. For my use case I'm interested in reducing inference time, so I've exported the model for each of the input shapes I'm using and I'm loading them all in memory. This approach uses more memory, but inference time is optimized.
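The per-shape approach described above can be sketched as a small shape-keyed cache. `load_session` here is a hypothetical stand-in for whatever loads one fixed-shape exported model (e.g. an `onnxruntime.InferenceSession` constructor):

```python
class PerShapeSessions:
    """Keep one fixed-shape ONNX session per input shape, loading lazily.

    Trades memory for inference speed: each (height, width) gets its own
    statically-shaped model, loaded once and reused afterwards.
    """

    def __init__(self, load_session):
        self._load = load_session
        self._cache = {}  # (height, width) -> loaded session

    def get(self, height, width):
        key = (height, width)
        if key not in self._cache:
            self._cache[key] = self._load(height, width)
        return self._cache[key]
```

A caller would do `sessions.get(h, w).run(...)` with whichever shapes it uses; memory grows with the number of distinct shapes, as noted above.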
@barbolo ok, thanks for your reply. I'll just use the fixed-input version.
I also met this problem. Before export the model can run inference on different input shapes in Python, but the exported ONNX model only works on inputs with the same width and height as the export sample; it can't run on other shapes.
I'd appreciate it if you could solve this problem.
I'm facing the same issue. It seems to be something with the nodes in the model: although we mark the image shape dimensions as dynamic during export, they somehow remain static in the ONNX model, so inference throws the above-mentioned error. This is also reflected in the warnings emitted while exporting. It seems we can only use static shapes (even though the model can be exported with dynamic axes)! The downside is that we now have to downscale or upscale images to a static shape 😢.
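If images have to be rescaled to a static shape anyway, one detail worth handling is that DINOv2 only accepts spatial sizes divisible by its 14-pixel patch size. A minimal sketch of snapping a target size onto that grid (the helper name is hypothetical):

```python
def snap_to_patch_grid(height, width, patch_size=14):
    """Round a target size down to the nearest multiple of the patch size,
    so the statically-shaped model sees a whole number of patches."""
    def snap(x):
        return max(patch_size, (x // patch_size) * patch_size)
    return snap(height), snap(width)
```

The snapped size would then be used both as the export sample's shape and as the resize target at inference time.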
Have you solved this problem?
@Zalways Nope. |
@100rab-S Hello, I've encountered a similar issue myself. I'm curious to know, would there be any adverse effects if I resize the images to match the static input size? |
Hi. Thanks for your great work.
I want to convert DINOv2 to ONNX, but failed.
I tried to follow issue #19.
I applied #19 (comment); after that, the error in #19 (comment) occurred.
So I tried applying #19 (comment) as well, but the error still occurs.
Are there any guidelines for converting to ONNX?
I need to use this model for semantic segmentation tasks soon.
Thanks.