Upsample exports to a very big subgraph in dynamo #1533
Comments
I also tried to get more info on which part of dynamo/onnxscript might be responsible for this.
I get this:
If I run
I get an error:
Thanks for catching this. Very intriguing. Will take a look!
@yuanyao-nv could you obtain the graph module from
@justinchuby Is this what you mean?
which gives
Yes, thank you.
That's very strange. If you run
@justinchuby Is this the procedure you're suggesting?
The exported UpSample module looks about the same as before, still a very big graph.
I don't see any Resize ops, which is puzzling. Could you share the ONNX model? You may remove the weights if it is too big.
I was expecting to see this function:
@justinchuby I uploaded the two versions of the model here: https://drive.google.com/drive/folders/1s1lhKRuG6fOZmD4IjZvN_zlWfIxPB_8w?usp=sharing
It's possible that the upsample op was somehow decomposed by PyTorch. I will look deeper. |
I have found that, in the general case, one has to run `exported_program.run_decompositions()` before applying `dynamo_export()`.
Thanks. We will be creating a series of changes to the exporter to support ExportedPrograms properly, including handling of the weights. |
@borisfom I tried running
There are two issues here:
Reference: https://github.com/microsoft/onnxscript/pull/1255/files. Partially fixes #1533. According to https://github.com/pytorch/pytorch/blob/78a6b0c4793d93d0a9105d9c92e7b88794016e66/aten/src/ATen/native/native_functions.yaml#L12500, the `.vec` overload is different from the default overload of `upsample_trilinear`.
Hi @yuanyao-nv, this one should be fixed when you call
cc @gramalingam @justinchuby @xadupre
Forcing decomposition like this may require us to rewrite these ops as patterns. It will come back to us once we rely on torch.export.export.
@titaiwangms Thanks for the update. What's a good way to test it out?
Do you know why fake tensor is being used in the latest torch version? I also tried exporting just a
The exported graph looks reasonable. Is this what you'd expect?
Filed a separate issue to track the above fake-tensor broadcast error: pytorch/pytorch#129534
I'm looking at some models in MONAI which involve `torch.nn.Upsample`. I notice that torchscript exports the Upsample module to a `Resize` node, but dynamo exports it to a very big graph, which has a perf impact. An example is the SegResNet model.
torchscript:
dynamo:
expanding the dynamo subgraph:
Here's the export script:
Relevant versions:
onnx==1.16.0
onnxscript==0.1.0.dev20240513
torch==2.4.0.dev20240513+cu121
monai 1.3.0