```
****** Failed tests - 1 tests
    FAIL - "ElementwisePreluModule_basic"
        Compilation error: Traceback (most recent call last):
          File "/home/zjgar/code/torch-mlir/build/tools/torch-mlir/python_packages/torch_mlir/torch_mlir_e2e_test/framework.py", line 298, in compile_and_run_test
            compiled = config.compile(test.program_factory())
          File "/home/zjgar/code/torch-mlir/build/tools/torch-mlir/python_packages/torch_mlir/torch_mlir_e2e_test/configs/onnx_backend.py", line 88, in compile
            compiled_module = self.backend.compile(onnx_module)
          File "/home/zjgar/code/torch-mlir/build/tools/torch-mlir/python_packages/torch_mlir/torch_mlir_e2e_test/onnx_backends/linalg_on_tensors.py", line 55, in compile
            run_pipeline_with_repro_report(
          File "/home/zjgar/code/torch-mlir/build/tools/torch-mlir/python_packages/torch_mlir/torch_mlir/compiler_utils.py", line 73, in run_pipeline_with_repro_report
            raise TorchMlirCompilerError(trimmed_message) from None
        torch_mlir.compiler_utils.TorchMlirCompilerError: Lowering TorchFX IR -> Torch Backend IR failed with the following diagnostics:
        python exception: Failure while executing pass pipeline:
        error: "/Unsqueeze": unsupported by backend contract: tensor with unknown rank
        note: "/Unsqueeze": see current operation: %12 = "torch.aten.unsqueeze"(%arg1, %11) : (!torch.vtensor<[?],f32>, !torch.int) -> !torch.vtensor<*,f32>
        note: "/Unsqueeze": this is likely due to a missing transfer function in abstract_interp_lib_gen.py

        For Torch-MLIR developers, the error can be reproduced with:
        $ torch-mlir-opt -pass-pipeline='builtin.module(torch-lower-to-backend-contract{backend-legal-ops=aten.flatten.using_ints,aten.adaptive_avg_pool1d,aten.unflatten.int})' /tmp/UnnammedModule.mlir
        Add '-mlir-print-ir-after-all -mlir-disable-threading' to get the IR dump for debugging purpose.

Summary:
    Failed: 1
    Expectedly Failed: 1
```
This is due to two facts:

1. PRelu has unique broadcasting semantics: the rank-1 weight tensor is assumed to match the second (channel) dimension of the input. The exported ONNX PRelu graph therefore contains an Unsqueeze with multiple axes, so that the weight tensor can be reshaped to `[?,1,1,1]` and broadcast against an input of shape `[?,?,?,?,?]`.
2. For multiple axes, the conversion for onnx.Unsqueeze produces some unranked ValueTensorTypes (obtained with `::getWithLeastValueSemantics()`). Due to the complexity of the generated IR, the resulting shapes cannot be inferred by torch-mlir's shape and dtype calculation passes, so the unranked `!torch.vtensor<*,f32>` survives to the backend-contract check.
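To illustrate the first point, the multi-axis Unsqueeze in the exported graph just appends trailing size-1 axes to the rank-1 weight so it right-aligns against the input. A minimal PyTorch sketch (the concrete shapes here are chosen for illustration, not taken from the failing test):

```python
import torch

# Rank-5 input [N, C, D, H, W] and a rank-1 per-channel slope, as PRelu expects.
x = torch.randn(2, 3, 4, 5, 6)
weight = torch.full((3,), 0.25)

# The equivalent of ONNX Unsqueeze with axes [1, 2, 3]: reshape the weight
# to [C, 1, 1, 1] so it right-aligns and broadcasts against the input.
w = weight.reshape(3, 1, 1, 1)

# PRelu computed via explicit broadcasting matches the built-in op.
manual = torch.where(x >= 0, x, x * w)
builtin = torch.nn.functional.prelu(x, weight)
assert torch.allclose(manual, builtin)
```

When the input rank is dynamic, the number of axes to unsqueeze is what makes the result's rank hard to infer statically.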
For context: the ElementwisePreluModule_basic test fails in e2e testing with the onnx config. Removing the xfail for this test and running the e2e suite reproduces the error message shown above.