Failed to export model to ONNX #9
Comments
Same question.
@ArielleBF @ximitiejiang can you share more information on the export? I will try to export one and share it.
Hello, I tried to export to ONNX with the code below, which is similar to SAM's export, and I got the error below:
@ArielleBF @ximitiejiang thanks for sharing the information. We will be trying to export one.
Same question.
Same problem here trying to export 'efficientsam_ti_gpu.jit' to ONNX using PyTorch 2.0.1 and opset_version set to 18.
Same problem.
The problem originates from torch.tile here:
This could probably be solved by replacing torch.tile() with Tensor.repeat(), or by registering an ONNX symbolic function to patch it.
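The first suggestion can be sketched as follows: when a repetition count is supplied for every dimension, Tensor.repeat() produces the same result as torch.tile(), and repeat() maps cleanly onto the ONNX Tile operator during export. This is a minimal equivalence check, not the actual patch applied to EfficientSAM.

```python
# Sketch of the suggested workaround: x.repeat(...) as a drop-in
# replacement for torch.tile(x, ...) when reps covers all dimensions.
import torch

x = torch.arange(6).reshape(2, 3)

tiled = torch.tile(x, (2, 3))  # the op that breaks the ONNX export
repeated = x.repeat(2, 3)      # equivalent, ONNX-friendly replacement

assert torch.equal(tiled, repeated)
print(repeated.shape)  # each dimension is multiplied by its rep count
```

Swapping the call inside the model's forward pass then lets the tracer emit a Tile node instead of failing on torch.tile.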
@ArielleBF, @ximitiejiang, @chenin-wang, @alanzhai219, @fPecc, @kaka-lin, @lxfater, @mchaniotakis, EfficientSAM onnx files are available at Hugging Face Space. The export script and running example are provided. Feel free to give it a try.
Thanks yformer <3 (referencing EfficientSAM/export_to_onnx.py, lines 63 to 78 at commit c9408a7)
Thanks yformer ~ I also created a TensorFlow 2.x version and converted it. Thanks!!!
I have tried to export the encoder of the model to ONNX, but the export failed. Can anyone who has done related work offer some advice?