
[ONNX] Fix concat with empty tensors #87620

Closed

Conversation

thiagocrepaldi (Collaborator)

Fixes #54410

@pytorch-bot pytorch-bot bot added the "release notes: onnx" label (torch.onnx related changes that should show up in the release notes) Oct 24, 2022

pytorch-bot bot commented Oct 24, 2022

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/87620

Note: Links to docs will display an error until the docs builds have been completed.

❌ 1 Failure

As of commit 000906e:

The following jobs have failed:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

Review comment on torch/onnx/symbolic_opset11.py (outdated, resolved):
# Concatenating a mix of empty and non-empty tensors will likely fail due to
# inputs with different ranks (0 for an empty tensor, > 0 for anything else),
# so filter the empty tensors out first
nonempty_tensors = []
for t in tensors:
    if symbolic_helper._is_constant(t) and not symbolic_helper._get_tensor_dim_size(t, 0):
        continue
    nonempty_tensors.append(t)
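The filtering logic above can be sketched with plain shape tuples, outside the exporter (the function name and shape-tuple representation here are illustrative stand-ins; the real code operates on ONNX graph values via symbolic_helper):

```python
def filter_nonempty(shapes):
    """Drop inputs whose first dimension is 0, which is the PR's
    working notion of an "empty tensor", mirroring the loop above."""
    nonempty = []
    for shape in shapes:
        # _get_tensor_dim_size(t, 0) returning 0 (falsy) marks the
        # tensor as empty; here the shape tuple plays that role.
        if len(shape) >= 1 and shape[0] == 0:
            continue
        nonempty.append(shape)
    return nonempty

# (0,) is the shape of torch.tensor([]) and gets dropped
print(filter_nonempty([(0,), (2, 3), (4,)]))  # [(2, 3), (4,)]
```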
A reviewer (Collaborator) commented:

Should tensors with a 0 in one of their dimension sizes count as "empty tensors" in this context? For example, torch.ones(2, 0, 3). (Maybe create a helper function, is_empty_tensor.)

thiagocrepaldi (Collaborator, Author) replied:

That is a good question. How would we detect one scenario versus the other? Both would have a dimension size of 0.
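The distinction the reviewer raises can be made concrete with plain shape tuples (the two helper names below are hypothetical, for illustration only): a check on dim 0 alone and a check on the total element count disagree exactly on cases like torch.ones(2, 0, 3).

```python
from math import prod

def first_dim_is_zero(shape):
    # Check in the spirit of the PR's current code:
    # "empty" iff the size of dimension 0 is 0
    return len(shape) >= 1 and shape[0] == 0

def has_zero_elements(shape):
    # Alternative notion: "empty" iff the tensor holds zero
    # elements, i.e. any dimension has size 0
    return prod(shape) == 0

# torch.tensor([]) has shape (0,): both checks agree it is empty
print(first_dim_is_zero((0,)), has_zero_elements((0,)))    # True True

# torch.ones(2, 0, 3) has shape (2, 0, 3): zero elements,
# but dim 0 has size 2, so the dim-0 check misses it
print(first_dim_is_zero((2, 0, 3)), has_zero_elements((2, 0, 3)))  # False True
```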

A similar review comment on torch/onnx/symbolic_opset9.py was marked resolved.
@bdhirsh bdhirsh added the "triaged" label (this issue has been looked at by a team member, and triaged and prioritized into an appropriate module) Oct 27, 2022
@thiagocrepaldi thiagocrepaldi force-pushed the thiagofc/fix-cat-dims branch 2 times, most recently from b32c981 to 9409ef5 Compare November 21, 2022 17:04

thiagocrepaldi commented Dec 5, 2022

@pytorchbot merge -f "unrelated pipeline failure"

@pytorch pytorch deleted a comment from pytorch-bot bot Dec 5, 2022
thiagocrepaldi (Collaborator, Author) commented:

@pytorchbot merge -f "unrelated pipeline failure"

pytorchmergebot commented:

Merge started

Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here.

@titaiwangms titaiwangms mentioned this pull request Dec 7, 2022
kulinseth pushed a commit to kulinseth/pytorch that referenced this pull request Dec 10, 2022
@thiagocrepaldi thiagocrepaldi deleted the thiagofc/fix-cat-dims branch May 4, 2023 17:14
Labels

- Merged
- module: onnx (Related to torch.onnx)
- open source
- release notes: onnx (torch.onnx related changes that should show up in the release notes)
- topic: bug fixes (topic category)
- triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
Development

Successfully merging this pull request may close these issues.

I converted pytorch model to onnx, but failed to run onnx.
5 participants