ONNX shape inference does not infer shapes #2903
Comments
Hi @dtch1997,
Good workaround. Adding a line
Hi @deepak0896,
Hi @jcwchen, name: "377"
Thank you for the details. I found that the shapes of nodes after every Split node are missing... There might be a shape inference bug for Split-10... Perhaps #2549 can help.
I have confirmed that PR can help and you can try that first. I will push that PR forward. Thanks.
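For anyone checking their own model, a minimal diagnostic sketch (the file path below is a placeholder, not from this thread) that lists node outputs for which shape inference produced no type information:

```python
import onnx
from onnx import shape_inference

# Placeholder path; substitute your own exported model.
model = onnx.load("model.onnx")
inferred = shape_inference.infer_shapes(model)

# Names for which type/shape information is available after inference.
known = {vi.name for vi in list(inferred.graph.input)
         + list(inferred.graph.value_info)
         + list(inferred.graph.output)}

# Any node output not covered is a value whose shape was not inferred
# (this is how the missing post-Split shapes show up).
for node in inferred.graph.node:
    for out in node.output:
        if out and out not in known:
            print(f"{node.op_type} output {out}: shape not inferred")
```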
I have tried all of the solutions mentioned and I'm still facing the same issue "IndexError: Input hidden_x1.0.weight is undefined!" with the ONNX model here. I'm on ONNX 1.8, do you know what I might be doing wrong?
Actually I can
Still have this problem in onnx==1.10.1. A lot of shapes are missing when analyzing a gpt model. |
Hi @nicklhy,
Thanks for the quick reply ~
Thanks for providing the link -- it seems that I cannot open it somehow. I will try it later.
just execute
Sorry, with wget I cannot get the model either. Perhaps it cannot be accessed in the United States? Do you have another way to provide this model? Thank you!
Try this Google Drive link: https://drive.google.com/file/d/1xWMrAozQEvrPAwXKJO69vOOsor66dIr7/view?usp=sharing
Thank you for providing the models! I roughly checked the produced graph.value_info after onnx.shape_inference and the result looks normal to me. Please note that ONNX shape inference is not guaranteed to be complete. In particular, some dynamic behaviors block the flow of shape inference, for example a Reshape to a dynamically-provided shape. As you can see, many outputs from Reshape nodes are missing shapes due to this kind of dynamic behavior. Recently ONNX has improved shape inference for more dynamic scenarios (e.g., Reshape) through data propagation, but the supported ops are limited for now. Taking your model as an example, Gemm is not supported yet, which is why enabling data propagation cannot help the following Reshape infer its shape either. However, if static shape inference with a registered shape inference function fails (just like the Split op bug in this thread), please do let me know and let's try to resolve it. Thanks! More reference: https://github.com/onnx/onnx/blob/master/docs/ShapeInference.md, https://github.com/onnx/onnx/blob/master/docs/proposals/SymbolicShapeInfProposal.md
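To illustrate the data-propagation point above, a minimal sketch (the file path is a placeholder; the data_prop flag is available in recent onnx releases):

```python
import onnx
from onnx import shape_inference

# Placeholder path to the model discussed above.
model = onnx.load("gpt_model.onnx")

# Plain static inference vs. inference with data propagation enabled.
static = shape_inference.infer_shapes(model)
propagated = shape_inference.infer_shapes(model, data_prop=True)

print(len(static.graph.value_info), "value_info entries without data propagation")
print(len(propagated.graph.value_info), "value_info entries with data propagation")
```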
@jcwchen Quick question, is there any convenient one-liner to remove the shape inference information from an ONNX model? Thank you.
@leimao perhaps try something like
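(The exact snippet suggested here is not preserved in this thread; a minimal sketch of the idea, assuming the goal is simply to empty graph.value_info, which is a standard protobuf repeated field rather than an ONNX-specific API:)

```python
import onnx

model = onnx.load("model.onnx")  # placeholder path

# Clear the repeated field through the protobuf API...
model.graph.ClearField("value_info")

# ...or, equivalently, with Python slice deletion (reads like list syntax).
del model.graph.value_info[:]
```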
That does work. Thank you. Perhaps you guys can consider adding this interface to the ONNX Python API.
which is more similar to Python list syntax, given
Actually the utilities here come from protobuf since it's a model proto. Perhaps you can raise this concern there. Thank you for the suggestion.
The shape inference problem still persists, as a lot of shapes are missing when trying to parse the gpt2 ONNX model from https://github.com/onnx/models/blob/main/text/machine_comprehension/gpt-2/README.md. I am using onnx version 1.13.1. Perhaps the problem is about dynamic shape inference. Can anyone suggest a way to tackle dynamic shape inference?
@mananta did you find any solution regarding this? |
Bug Report
Describe the bug
onnx.shape_inference.infer_shapes does not correctly infer the shape of each layer.
System information
Reproduction instructions
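The original snippet and its printed output are not preserved here; a minimal reproduction sketch, assuming the model comes from the attached models.zip (the path below is a placeholder):

```python
import onnx
from onnx import shape_inference

# Placeholder path; the actual model is in the attached models.zip.
model = onnx.load("models/model.onnx")
inferred = shape_inference.infer_shapes(model)

# Print whatever shape information inference produced for each value.
for vi in inferred.graph.value_info:
    print(vi.name, vi.type.tensor_type.shape)
```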
output:
Model file: models.zip
Expected behavior
Expected each entry in model.graph.value_info to have a tensor shape field that tells me the shape of that layer.
Notes
Model was exported from PyTorch using torch.onnx.export.
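For context, a typical export call looks roughly like this (a sketch with a placeholder module and input, not the author's actual invocation):

```python
import torch

# Placeholder module and input; the issue's real model is in models.zip.
model = torch.nn.Linear(16, 8)
dummy_input = torch.randn(1, 16)

torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input"],
    output_names=["output"],
    opset_version=11,
)
```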