[composite compliance testing] Refactor check_backward_formula to accept Callable #81059
Conversation
Maybe niche, but for one-off debugging purposes, I want a variant of
check_backward_formula that accepts a callable rather than an OpInfo.
When debugging, I try to create a repro that does not involve OpInfos,
because OpInfos are difficult to work with: they carry many sample
inputs, and I often want to test my own sample inputs without creating
a new OpInfo.
This PR refactors check_backward_formula so that it accepts a Callable
instead of an OpInfo. Example usage:
```
import torch
from torch.testing._internal.composite_compliance import check_backward_formula
# Check the backward formula of torch.prod on a hand-picked input,
# without constructing an OpInfo.
x = torch.tensor([[1., 1.], [1., 0.]], requires_grad=True)
args = (x, 1)
check_backward_formula(torch.prod, args, {})
```
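Under the hood, a refactor like this typically splits the OpInfo-driven check into a thin wrapper over a callable-accepting core. The sketch below illustrates that pattern; the wrapper name check_backward_formula_from_opinfo and its exact signature are assumptions for illustration, not the code merged in this PR.
```
import torch
from torch.testing._internal.composite_compliance import check_backward_formula

# A minimal sketch, assuming check_backward_formula is the callable-accepting
# core from this PR. A hypothetical OpInfo-based entry point then just
# iterates the OpInfo's sample inputs and delegates each one to the core.
def check_backward_formula_from_opinfo(op_info, device='cpu', dtype=torch.float):
    for sample in op_info.sample_inputs(device, dtype, requires_grad=True):
        args = (sample.input, *sample.args)
        check_backward_formula(op_info.op, args, sample.kwargs)
```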
Test Plan:
- run existing tests
This would really be helpful. Thanks!
I think we should also do this for check_forward_ad_formula:
pytorch/torch/testing/_internal/composite_compliance.py, lines 446 to 449 (at 8a5d984):
```
# Checks if the forward AD formula is composite compliant by testing
# all possible permutations of {primals, tangents} being
# CompositeCompliantTensor or regular Tensors.
def check_forward_ad_formula(op, args, kwargs):
```
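If the same refactor were applied there, usage might look like the sketch below. This assumes check_forward_ad_formula gains the same callable-accepting signature as check_backward_formula; that change is not part of this PR.
```
import torch
from torch.testing._internal.composite_compliance import check_forward_ad_formula

# Assumption: check_forward_ad_formula has been refactored, analogously to
# check_backward_formula above, to accept a plain callable plus hand-picked
# args/kwargs rather than an OpInfo.
x = torch.tensor([[1., 1.], [1., 0.]], requires_grad=True)
check_forward_ad_formula(torch.prod, (x, 1), {})
```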
…ept Callable (#81059)
Pull Request resolved: #81059. Approved by: https://github.com/kshitij12345, https://github.com/ezyang. Reviewed By: DanilBaibak. Differential Revision: D37781921. Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/d253cdd8ff194239e86b70293793e814e44da2c0
…ccept Callable (#81239)
Like #81059; this PR addresses the review comments. Pull Request resolved: #81239. Approved by: https://github.com/ezyang. Reviewed By: DanilBaibak. Differential Revision: D37781963. Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/9ee312023d8591b1afed19e145cdf61039753a40