
[reland] Add batched grad testing to gradcheck, turn it on in test_autograd #50592

Closed
wants to merge 2 commits

Conversation


@zou3519 zou3519 commented Jan 15, 2021

Stack from ghstack:

This adds a `check_batched_grad` option to gradcheck and gradgradcheck.
It defaults to False because gradcheck is a public API and I don't want
to break any existing non-PyTorch users of gradcheck.
When enabled, the check:

  • runs grad twice with two grad outputs, `a` and `b`
  • runs a vmapped grad with `torch.stack([a, b])`
  • compares the results of the two runs against each other.
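
A minimal sketch of that comparison, written against the modern `torch.func` API (the PR itself predates `torch.func` and drives PyTorch's internal vmap directly; `check_batched_grad_sketch` below is illustrative, not the actual gradcheck internals):

```python
import torch
from torch.func import vjp, vmap

def check_batched_grad_sketch(fn, x):
    # Build the vector-Jacobian product function for fn at x.
    out, vjp_fn = vjp(fn, x)

    # Two independent grad outputs, as gradcheck samples them.
    a, b = torch.randn_like(out), torch.randn_like(out)

    # Reference path: one plain VJP per grad output.
    (expected_a,) = vjp_fn(a)
    (expected_b,) = vjp_fn(b)

    # Batched path: vmap the same VJP over the stacked grad outputs.
    (batched,) = vmap(vjp_fn)(torch.stack([a, b]))

    # Each slice of the batched result should match the corresponding
    # plain run (gradcheck reports a detailed error instead of asserting).
    assert torch.allclose(batched[0], expected_a)
    assert torch.allclose(batched[1], expected_b)

check_batched_grad_sketch(torch.sin, torch.randn(3))
```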

Furthermore:

  • `check_batched_grad=True` is set to be the default for
    gradcheck/gradgradcheck inside of test_autograd.py. This is done by
    rebinding the module-level gradcheck name inside test_autograd (see
    the sketch after this list).
  • I manually added `check_batched_grad=False` to gradcheck instances
    that don't support batched grad.
  • I added a denylist for operations that don't support batched grad.
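
One way to do that rebinding, as a hedged sketch (the exact mechanism in test_autograd.py may differ):

```python
import functools
import torch.autograd

# Every bare `gradcheck(...)` call in this test file now defaults to
# check_batched_grad=True, while the public torch.autograd.gradcheck
# keeps its False default for external users.
gradcheck = functools.partial(torch.autograd.gradcheck,
                              check_batched_grad=True)
gradgradcheck = functools.partial(torch.autograd.gradgradcheck,
                                  check_batched_grad=True)
```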

Question:

  • Should we have a testing-only gradcheck (e.g.,
    torch.testing.gradcheck) that has different defaults from our public
    API, torch.autograd.gradcheck?

Future:

  • The plan is to repeat the above for test_nn.py (the autogenerated
    tests will require a denylist).
  • Finally, we can repeat the above for all PyTorch test files that use
    gradcheck.

Test Plan:

  • run tests

Differential Revision: D25925942

@zou3519 zou3519 requested a review from albanD as a code owner January 15, 2021 17:20
zou3519 added a commit that referenced this pull request Jan 15, 2021

zou3519 commented Jan 15, 2021

Note to reviewers: this is a re-land of #49120, which broke a slow test on master. This re-land turns off batched grad testing for that specific test.
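
At a call site, opting a specific test out looks like this (hedged illustration; `torch.sin` stands in for whichever op's backward isn't vmap-compatible):

```python
import torch
from torch.autograd import gradcheck

# gradcheck wants double precision for reliable numerical gradients.
x = torch.randn(3, dtype=torch.double, requires_grad=True)

# Keep the plain gradient check but skip the batched (vmap) one.
assert gradcheck(torch.sin, (x,), check_batched_grad=False)
```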

@zou3519 zou3519 changed the title Add batched grad testing to gradcheck, turn it on in test_autograd [reland] Add batched grad testing to gradcheck, turn it on in test_autograd Jan 15, 2021

@albanD albanD left a comment


👍

zou3519 added a commit that referenced this pull request Jan 18, 2021

codecov bot commented Jan 19, 2021

Codecov Report

Merging #50592 (55d7811) into gh/zou3519/339/base (7f3a407) will increase coverage by 0.00%.
The diff coverage is 92.30%.

@@                 Coverage Diff                  @@
##           gh/zou3519/339/base   #50592   +/-   ##
====================================================
  Coverage                80.66%   80.66%           
====================================================
  Files                     1913     1913           
  Lines                   208058   208083   +25     
====================================================
+ Hits                    167833   167854   +21     
- Misses                   40225    40229    +4     


@zou3519 merged this pull request in f7a8bfd.
