
crash in rewriter/broadcast_to_matmul.py (proposed fix) #1541

Closed
borisfom opened this issue May 15, 2024 · 3 comments · Fixed by #1542
Labels: bug (Something isn't working), topic: rewriter

Comments


borisfom commented May 15, 2024

My model fails during torch.onnx.dynamo_export with "AttributeError: 'NoneType' object has no attribute 'numpy'" at this line:

shape_c = shape_c_value.const_value.numpy()  # type: ignore[union-attr]

The `if shape_c is None:` check should come before the `.numpy()` call, not after it. Here is a one-line diff that fixes the problem; the resulting ONNX model then runs and matches the PyTorch results:

--- broadcast_to_matmul.bak     2024-05-14 19:22:51.489899788 +0000
+++ broadcast_to_matmul.py      2024-05-14 19:49:17.761779925 +0000
@@ -28,9 +28,10 @@
     input_b_shape = input_b.shape
     # TODO: Get a helper func to get const_value
     shape_c_value = _ir_utils.propagate_const_value(shape_c)
-    shape_c = shape_c_value.const_value.numpy()  # type: ignore[union-attr]
+    shape_c = shape_c_value.const_value  # type: ignore[union-attr]
     if shape_c is None:
         return False
+    shape_c = shape_c.numpy()  # type: ignore[union-attr]
     if not isinstance(shape_c, np.ndarray):
         logger.info("Unexpected shape_c value. Expected np.ndarray, got %s", type(shape_c))
         return False
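The logic of the fix can be sketched in isolation. This is a minimal stand-alone sketch, not the onnxscript code itself: `IRValue` and `ConstTensor` below are hypothetical stand-ins for the IR value and constant-tensor types, used only to show the guard-before-dereference ordering.

```python
import numpy as np

class ConstTensor:
    """Hypothetical stand-in for an IR constant tensor."""
    def __init__(self, array):
        self._array = array

    def numpy(self):
        return self._array

class IRValue:
    """Hypothetical stand-in for an IR value; const_value may be None."""
    def __init__(self, const_value=None):
        self.const_value = const_value

def shape_is_usable(shape_c_value: IRValue) -> bool:
    # Grab the constant first and bail out on None *before* calling
    # .numpy() -- exactly the reordering the diff above performs.
    shape_c = shape_c_value.const_value
    if shape_c is None:
        return False
    shape_c = shape_c.numpy()
    return isinstance(shape_c, np.ndarray)
```

With the original ordering, an `IRValue` whose `const_value` is None raises AttributeError; with the guard first, the function simply returns False and the rewrite rule is skipped.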
borisfom (Author) commented:

This is a repro, just in case. You have to install the NVIDIA NeMo toolkit first:
pip install --extra-index-url https://pypi.ngc.nvidia.com/ nemo_toolkit[all]

import torch
from nemo.core.classes import typecheck
from nemo.utils.export_utils import wrap_forward_method
from nemo.collections.asr.models import EncDecCTCModelBPE
import onnxscript

model = EncDecCTCModelBPE.from_pretrained(model_name="stt_en_conformer_ctc_small")

model = model.eval()
wrap_forward_method(model)
model._prepare_for_export()

typecheck.set_typecheck_enabled(enabled=False)

with torch.no_grad():
    input_example = model.input_module.input_example(max_batch=4, max_dim=256)
    # This has to have min=2
    batch = torch.export.Dim("batch", min=2, max=64)
    dynamic_shapes = {
        'audio_signal': {
            0: batch,
            # 2: torch.export.Dim("audio_signal__2")
        },
        'length': {0: batch},
    }

    ex_model = torch.export.export(
        model, tuple(input_example),
        dynamic_shapes=dynamic_shapes,
        strict=False
    )

    ex_model = ex_model.run_decompositions()
    ex_module = ex_model.module()

    print("Running torch.onnx.dynamo_export ...")
    options = torch.onnx.ExportOptions(dynamic_shapes=True)
    ex = torch.onnx.dynamo_export(ex_module, *input_example, export_options=options)
    ex.save("ctc.onnx")

justinchuby (Contributor) commented:

Thank you!

@justinchuby justinchuby self-assigned this May 15, 2024
justinchuby (Contributor) commented:

I will create a patch today.

@justinchuby justinchuby added bug Something isn't working topic: rewriter labels May 15, 2024
justinchuby added a commit that referenced this issue May 20, 2024
- Check whether the shape tensor is constant before using it in the logic, exiting early if needed.
- Handle cases when the input is 1d or 0d

Thanks @borisfom for the proposed fix!

Fix #1541
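For context on the 1d/0d handling the commit mentions: matmul has special rules for low-rank operands, so a rewrite that assumes 2d shapes can misfire on them. The snippet below only illustrates standard NumPy matmul semantics, not the patched rewriter logic:

```python
import numpy as np

m = np.ones((3, 2))

# A 1d operand is treated as a row vector, and that promoted
# dimension is dropped from the result shape again.
v = np.ones(3)
print(np.matmul(v, m).shape)  # (2,)

# 0d (scalar) operands are rejected outright by matmul.
try:
    np.matmul(np.float64(2.0), m)
except ValueError:
    print("matmul rejects 0d input")
```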