
[ONNX] Unsqueeze -> Torch -> Linalg issue #3101

Open
zjgarvey opened this issue Apr 3, 2024 · 1 comment

zjgarvey (Collaborator) commented Apr 3, 2024

The ElementwisePreluModule_basic test fails in e2e testing with the onnx config. Removing the xfail for this test and running:

projects/pt1/tools/e2e_test.sh -f ElementwisePreluModule_basic -v -c onnx

gives the error message:

****** Failed tests - 1 tests
    FAIL - "ElementwisePreluModule_basic"
        Compilation error: Traceback (most recent call last):
          File "/home/zjgar/code/torch-mlir/build/tools/torch-mlir/python_packages/torch_mlir/torch_mlir_e2e_test/framework.py", line 298, in compile_and_run_test
            compiled = config.compile(test.program_factory())
          File "/home/zjgar/code/torch-mlir/build/tools/torch-mlir/python_packages/torch_mlir/torch_mlir_e2e_test/configs/onnx_backend.py", line 88, in compile
            compiled_module = self.backend.compile(onnx_module)
          File "/home/zjgar/code/torch-mlir/build/tools/torch-mlir/python_packages/torch_mlir/torch_mlir_e2e_test/onnx_backends/linalg_on_tensors.py", line 55, in compile
            run_pipeline_with_repro_report(
          File "/home/zjgar/code/torch-mlir/build/tools/torch-mlir/python_packages/torch_mlir/torch_mlir/compiler_utils.py", line 73, in run_pipeline_with_repro_report
            raise TorchMlirCompilerError(trimmed_message) from None
        torch_mlir.compiler_utils.TorchMlirCompilerError: Lowering TorchFX IR -> Torch Backend IR failed with the following diagnostics:


        python exception: Failure while executing pass pipeline:
        error: "/Unsqueeze": unsupported by backend contract: tensor with unknown rank
        note: "/Unsqueeze": see current operation: %12 = "torch.aten.unsqueeze"(%arg1, %11) : (!torch.vtensor<[?],f32>, !torch.int) -> !torch.vtensor<*,f32>
        note: "/Unsqueeze": this is likely due to a missing transfer function in abstract_interp_lib_gen.py

        For Torch-MLIR developers, the error can be reproduced with:
        $ torch-mlir-opt -pass-pipeline='builtin.module(torch-lower-to-backend-contract{backend-legal-ops=aten.flatten.using_ints,aten.adaptive_avg_pool1d,aten.unflatten.int})' /tmp/UnnammedModule.mlir
        Add '-mlir-print-ir-after-all -mlir-disable-threading' to get the IR dump for debugging purpose.



Summary:
    Failed: 1
    Expectedly Failed: 1

This is due to two facts:

  1. PRelu has unusual broadcasting: the rank-1 weight tensor is assumed to match the second input dimension (the channel dim). As a result, the ONNX PRelu graph contains an Unsqueeze with multiple axes, so that the weight tensor can be reshaped to [?,1,1,1] for an input of shape [?,?,?,?,?].
  2. For multiple axes, the conversion for onnx.Unsqueeze produces some unspecified ValueTensorTypes (obtained with ::getWithLeastValueSemantics()). Due to the complexity of the generated IR, the resulting shapes cannot be inferred by torch-mlir's shape and dtype calculation passes.
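To make fact 1 concrete, here is a minimal numpy sketch (not torch-mlir code, and the shapes are illustrative) of the broadcasting the exported ONNX PRelu graph performs: a rank-1 weight is unsqueezed on several trailing axes so that, under numpy/ONNX broadcasting rules, it lines up with the input's second dimension.

```python
import numpy as np

# Illustrative shapes: a rank-5 input and a rank-1 weight that is
# assumed to match the input's second (channel) dimension.
x = np.random.randn(2, 3, 4, 5, 6).astype(np.float32)
w = np.random.randn(3).astype(np.float32)

# Equivalent of onnx.Unsqueeze with axes=[1, 2, 3]:
# the weight [3] becomes [3, 1, 1, 1].
w_expanded = w.reshape(3, 1, 1, 1)

# [3,1,1,1] broadcasts against the trailing dims of [2,3,4,5,6],
# placing the weight values along the input's dim 1.
out = np.where(x >= 0, x, w_expanded * x)  # PRelu semantics
assert out.shape == x.shape
```

The multi-axis Unsqueeze in this graph is exactly the step that, per fact 2, ends up with an unknown-rank result type during the ONNX-to-Torch conversion.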
zjgarvey (Collaborator, Author) commented Apr 3, 2024

#2991