[primTorch] Meta function for item creates a dummy value #78070

Open
mruberry opened this issue May 23, 2022 · 2 comments
Labels
module: primTorch, triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)

Comments


mruberry commented May 23, 2022

The Meta function for item currently creates a dummy value. This is harmless for now, but we need to improve our modeling of "meta numbers" to indicate that this value is unknown except at runtime.

cc @ezyang @mruberry @ngimel
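For context, a minimal, hypothetical sketch of the behavior described above (illustrative only, not the actual torch._prims implementation): a meta tensor has no storage to read, so a meta function for item has to fabricate a placeholder scalar, and that dummy leaks out as if it were the real value.

```python
import torch

# Hypothetical meta implementation of item (a sketch, not the real prim):
# the tensor's data doesn't exist on the meta device, so a dummy scalar of
# the right Python type is returned; the real value only exists at runtime.
def item_meta(a: torch.Tensor):
    assert a.numel() == 1, "item() requires a single-element tensor"
    if a.dtype is torch.bool:
        return False
    if a.dtype in (torch.uint8, torch.int8, torch.int16, torch.int32, torch.int64):
        return 0
    return 0.0  # floating-point (and, in this sketch, complex) dtypes get a dummy 0.0

t = torch.empty((), device="meta", dtype=torch.float32)
print(item_meta(t))  # 0.0 -- a dummy value, not anything the tensor holds
```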

mruberry added the module: primTorch and triaged labels May 23, 2022
pytorchmergebot pushed a commit that referenced this issue May 23, 2022
This PR adds the item, equal, any, and all references.

While doing this I found the following issues:
- #78070
- #78071

And I fixed a bug where the `convert_element_type` prim could not convert tensors that require grad to datatypes that don't support grad.

Creating the item reference required adding item as a prim, but per @ngimel's suggestion I removed the prims for any and all and implemented them as references, so this is net negative one prim.

Reference OpInfos are added for any and all, but item and equal don't even have regular OpInfos.
Pull Request resolved: #78072
Approved by: https://github.com/ngimel
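The commit message above mentions removing the any and all prims and implementing them as references. As a hedged illustration only (not the actual torch._refs code), such references can decompose into existing ops, for example a bool cast plus a sum comparison:

```python
import torch

def any_ref(a: torch.Tensor) -> torch.Tensor:
    # any(a) is true iff at least one element is nonzero
    return a.to(torch.bool).sum() != 0

def all_ref(a: torch.Tensor) -> torch.Tensor:
    # all(a) is true iff every element is nonzero
    return a.to(torch.bool).sum() == a.numel()

t = torch.tensor([0.0, 1.0, 2.0])
assert any_ref(t).item() and not all_ref(t).item()
```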

ezyang commented May 23, 2022

Given that item calls should induce a graph break, I'm not sure why we need meta numbers here.

mruberry (author) replied

Graph breaking on item() is fine, but we still need some way to signify that we should do so.
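One hedged sketch of "some way to signify" this (the names below are hypothetical, not an existing PyTorch API): instead of returning a dummy number, the meta implementation could raise a sentinel exception that a tracer catches and turns into a graph break.

```python
import torch

class DataDependentOutputError(RuntimeError):
    """Hypothetical sentinel: the op's output value depends on real tensor data."""

def item_meta(a: torch.Tensor):
    # No dummy value: signal that the result is unknowable under meta tensors,
    # so a tracer can break the graph at this call instead of propagating junk.
    raise DataDependentOutputError(
        "aten.item: value is only known at runtime; break the graph here"
    )
```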

facebook-github-bot pushed a commit that referenced this issue May 24, 2022
Pull Request resolved: #78072
Approved by: https://github.com/ngimel

Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/2738405a76a6210140b95912a2159969ece6a7fb

Reviewed By: seemethere

Differential Revision: D36610490

Pulled By: mruberry

fbshipit-source-id: 25cada3079a1ddd44d0dc9ddf1ccd0cee3417bf8
swang392 pushed a commit that referenced this issue May 25, 2022