ONNX conversion: Only tensor input is valid Argument #1560
Comments
Thanks, @AdrianEddy, for filing this issue. This helps us prioritize the ONNX ops. I have checked the ONNX file: it contains a lot of nodes (> 10K), but only a few node types:
Currently unsupported in the ONNX import:
The full list of supported ops: https://github.com/tracel-ai/burn/blob/main/crates/burn-import/SUPPORTED-ONNX-OPS.md
These missing ops also came up recently in #1544. Tagging @nathanielsimard so he is aware of the missing ONNX ops.
If we handle #1544 fully, then this ticket will be resolved too.
I fixed the unsqueeze-scalar issue (#1690), but it is blocked on converting the Gemm node to a Linear module.
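For context, the mapping in question is mathematically simple: an ONNX Gemm node computes Y = alpha * A * B + beta * C, and with alpha = beta = 1, B as the weight matrix, and C as a bias broadcast over rows, that is exactly what a Linear module computes. The plain-Rust sketch below is illustrative only (it is not burn-import code, and it assumes a row-major layout with an untransposed weight, whereas PyTorch-exported Gemm nodes typically set transB = 1):

```rust
/// y[m][n] = sum_k x[m][k] * w[k][n] + b[n]
/// i.e. Gemm with alpha = beta = 1, which is what a Linear layer computes.
fn gemm_as_linear(
    x: &[f32], // input,  shape [m, k], row-major
    w: &[f32], // weight, shape [k, n], row-major (transB = 0 assumed)
    b: &[f32], // bias,   shape [n], broadcast over rows (the C operand)
    m: usize,
    k: usize,
    n: usize,
) -> Vec<f32> {
    let mut y = vec![0.0f32; m * n];
    for row in 0..m {
        for col in 0..n {
            let mut acc = b[col]; // beta * C with beta = 1
            for inner in 0..k {
                acc += x[row * k + inner] * w[inner * n + col]; // alpha * A * B with alpha = 1
            }
            y[row * n + col] = acc;
        }
    }
    y
}

fn main() {
    // 1x2 input, 2x3 weight, 3-element bias -> 1x3 output,
    // the same result a Linear layer with these parameters would give.
    let y = gemm_as_linear(
        &[1.0, 2.0],
        &[1.0, 0.0, 1.0, 0.0, 1.0, 1.0],
        &[0.5, 0.5, 0.5],
        1,
        2,
        3,
    );
    println!("{y:?}"); // [1.5, 2.5, 3.5]
}
```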
Describe the bug
When trying to convert an ONNX model to Burn, I'm encountering this panic:
The `node_input` on that line is `{ name: "cast2_out1", ty: Scalar(Int64), value: None, passed: true }`.
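For illustration only (this is a hypothetical sketch, not the actual burn-import source), the panic message suggests a conversion step that accepts only tensor-typed arguments, so a scalar argument such as `cast2_out1: Scalar(Int64)` trips the guard:

```rust
// Hypothetical types mirroring the shape of the reported failure.
#[derive(Debug)]
enum ElementType {
    Int64,
}

#[derive(Debug)]
enum ArgType {
    Tensor { rank: usize },
    Scalar(ElementType),
}

#[derive(Debug)]
struct Argument {
    name: String,
    ty: ArgType,
}

/// A guard of this shape rejects anything that is not a tensor.
fn expect_tensor(arg: &Argument) -> &Argument {
    match arg.ty {
        ArgType::Tensor { .. } => arg,
        _ => panic!("Only tensor input is valid, got {:?}", arg),
    }
}

fn main() {
    let tensor_arg = Argument {
        name: "input1".to_string(),
        ty: ArgType::Tensor { rank: 4 },
    };
    expect_tensor(&tensor_arg); // tensors pass the guard

    let scalar_arg = Argument {
        name: "cast2_out1".to_string(),
        ty: ArgType::Scalar(ElementType::Int64),
    };
    expect_tensor(&scalar_arg); // panics, like the reported conversion
}
```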
The model I'm trying to convert is gmflow-scale2-regrefine6-mixdata-train320x576-4e7b215d.onnx
I also have an alternative ONNX model exported by someone else, which fails with a different message and might also be worth looking into: gmflow-scale1-mixdata-train320x576-4c3a6e9a_1x3x480x640_sim.onnx
To Reproduce
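The reproduction steps are not included above. For reference, a burn-import conversion is typically driven from a build script along these lines (the model path and output directory are placeholders, not the reporter's actual setup):

```rust
// build.rs: a minimal sketch of running the ONNX-to-Burn conversion at build time.
use burn_import::onnx::ModelGen;

fn main() {
    ModelGen::new()
        .input("src/model/gmflow.onnx") // placeholder path to the ONNX file
        .out_dir("model/")              // generated Rust code is written here
        .run_from_script();
}
```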
Expected behavior
I expected the conversion to finish successfully.
Desktop (please complete the following information):
Additional context
Model exported using torch.onnx.export from this repo.