
Conversation

@davidberard98 (Contributor) commented Jun 4, 2022

Stack from ghstack:

Without profiled outputs, autodiff can't tell whether the outputs of a DifferentiableGraph should have requires_grad set. With no profiled information, autodiff would default to requires_grad=True, marking tensors as requiring grad when they shouldn't. This PR adds requires_grad info to the output's type when it can be recovered from later uses of the output.

Adds a test for correct autodiff requires_grad behavior, and a test that the output type is correctly annotated in create_autodiff_subgraphs.
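
As a rough illustration of the approach (a minimal sketch, not the actual diff: it reuses getProfileNodeRequiresGrad and the loop structure quoted in the review below, and assumes a TensorType::withRequiresGrad setter for stamping the type):

// For each DifferentiableGraph output, look for a later prim::profile use
// and copy its profiled requires_grad onto the output's type.
for (size_t i = 0; i < diff_graph->outputs().size(); ++i) {
  c10::optional<bool> requiresGrad = c10::nullopt;
  for (auto& use : diff_graph->output(i)->uses()) {
    if (use.user->kind() == prim::profile) {
      requiresGrad = getProfileNodeRequiresGrad(use.user);
      if (requiresGrad.has_value()) {
        break;
      }
    }
  }
  if (requiresGrad.has_value()) {
    // Assumption: the output is a TensorType; withRequiresGrad returns a
    // copy of the type with the requires_grad property set.
    if (auto tensor_type = diff_graph->output(i)->type()->cast<TensorType>()) {
      diff_graph->output(i)->setType(tensor_type->withRequiresGrad(*requiresGrad));
    }
  }
}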

[ghstack-poisoned]
@facebook-github-bot (Contributor) commented Jun 4, 2022

✅ No Failures (0 Pending)

As of commit 21611bc (more details on the Dr. CI page):

💚 💚 Looks good so far! There are no failures yet. 💚 💚

This comment was automatically generated by Dr. CI.

@facebook-github-bot added the "oncall: jit" label Jun 4, 2022
davidberard98 added a commit that referenced this pull request Jun 4, 2022

ghstack-source-id: bc2a5cd
Pull Request resolved: #78875
@eellison (Contributor) left a comment:

Nice !!!

c10::optional<bool> requiresGrad = c10::nullopt;
for (auto& use : diff_graph->output(i)->uses()) {
  if (use.user->kind() == prim::profile) {
    requiresGrad = getProfileNodeRequiresGrad(use.user);

If a value has multiple uses, and one of the uses isn't profiled, we don't want to overwrite the existing use's requires_grad. Something like:

if (use.user->kind() == prim::profile) {
  if (auto req_grad_use = getProfileNodeRequiresGrad(use.user)) {
    requiresGrad = req_grad_use;
  }
}

if (dg_use.user->kind() == prim::profile) {
  requiresGrad = getProfileNodeRequiresGrad(dg_use.user);
  if (requiresGrad) {
    break;

nit: we do the early break on if (requiresGrad) in this loop but not in the loop above.
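
A sketch of what that could look like, combining the early break with the non-overwriting guard from the previous comment:

for (auto& use : diff_graph->output(i)->uses()) {
  if (use.user->kind() == prim::profile) {
    if (auto req_grad_use = getProfileNodeRequiresGrad(use.user)) {
      requiresGrad = req_grad_use;
      break;  // stop at the first profiled requires_grad
    }
  }
}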

Value* dg_value = dg->inputs()[use.offset];
for (auto& dg_use : dg_value->uses()) {
  if (dg_use.user->kind() == prim::profile) {
    requiresGrad = getProfileNodeRequiresGrad(dg_use.user);

maybe also do the same don't-overwrite check here as above
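
For example (a sketch applying the same guard to this inner loop):

Value* dg_value = dg->inputs()[use.offset];
for (auto& dg_use : dg_value->uses()) {
  if (dg_use.user->kind() == prim::profile) {
    if (auto req_grad_use = getProfileNodeRequiresGrad(dg_use.user)) {
      requiresGrad = req_grad_use;
      break;
    }
  }
}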

…h outputs"

[ghstack-poisoned]
davidberard98 added a commit that referenced this pull request Jun 8, 2022

ghstack-source-id: d60714a
Pull Request resolved: #78875
…h outputs"

[ghstack-poisoned]
davidberard98 added a commit that referenced this pull request Jun 9, 2022

ghstack-source-id: 9b338c6
Pull Request resolved: #78875
@davidberard98 (Contributor, Author) commented:

@pytorchbot merge

@pytorchmergebot (Collaborator) commented:

@pytorchbot successfully started a merge job. Check the current status here

@github-actions (Contributor) commented:

Hey @davidberard98.
You've committed this PR, but it does not have both a 'release notes: ...' and 'topics: ...' label. Please add one of each to the PR. The 'release notes: ...' label should represent the part of PyTorch that this PR changes (fx, autograd, distributed, etc) and the 'topics: ...' label should represent the kind of PR it is (not user facing, new feature, bug fix, perf improvement, etc). The list of valid labels can be found here for the 'release notes: ...' and here for the 'topics: ...'.
For changes that are 'topic: not user facing' there is no need for a release notes label.

facebook-github-bot pushed a commit that referenced this pull request Jun 11, 2022
…78875)

Pull Request resolved: #78875

Approved by: https://github.com/eellison

Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/1d2a6c2e94ddda1c1e5bd611796f0775c26ff456

Reviewed By: osalpekar

Differential Revision: D37085552

Pulled By: davidberard98

fbshipit-source-id: 188d8149e85dffb02d98f409cf08110c7fec9c14
@davidberard98 (Contributor, Author) commented:

@pytorchbot help

@pytorch-bot (bot) commented Jun 12, 2022

PyTorchBot Help

usage: @pytorchbot {merge,revert,rebase,help} ...

command:
  {merge,revert,rebase,help}
    merge               Merge a PR
    revert              Revert a merged PR
    rebase              Rebase a PR
    help                Show help

Merge

usage: @pytorchbot merge [-g] [-f]

optional arguments:
  -g, --green  Merge when all status checks pass.
  -f, --force  Merge without checking anything. ONLY USE THIS FOR CRITICAL
               FAILURES.

Revert

usage: @pytorchbot revert -m MESSAGE
                          [-c {nosignal,ignoredsignal,landrace,weird,ghfirst}]

optional arguments:
  -m MESSAGE, --message MESSAGE
                        The reason you are reverting, will be put in the
                        commit message.
  -c {nosignal,ignoredsignal,landrace,weird,ghfirst}, --classification {nosignal,ignoredsignal,landrace,weird,ghfirst}
                        A machine-friendly classification of the revert
                        reason.

Rebase

usage: @pytorchbot rebase [-s | -b BRANCH]

optional arguments:
  -s, --stable          Rebase to viable/strict
  -b BRANCH, --branch BRANCH
                        Branch you would like to rebase to

For more info, consult the wiki.

@davidberard98 (Contributor, Author) commented:

@pytorchbot revert -m "Internal failures were bisected to this change" -c ghfirst

@pytorchmergebot (Collaborator) commented:

@pytorchbot successfully started a revert job. Check the current status here

pytorchmergebot added a commit that referenced this pull request Jun 12, 2022
…utputs"

This reverts commit 1d2a6c2.

Reverted #78875 on behalf of https://github.com/davidberard98 due to Internal failures were bisected to this change
@facebook-github-bot deleted the gh/davidberard98/112/head branch June 13, 2022 14:17
facebook-github-bot pushed a commit that referenced this pull request Jun 15, 2022
…utputs"

Summary:
This reverts commit 1d2a6c2.

Reverted #78875 on behalf of https://github.com/davidberard98 due to Internal failures were bisected to this change

Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/5413580f9e6ce1d490d0b137595574befd28d5f4

Reviewed By: osalpekar, yuchenhao

Differential Revision: D37114173

fbshipit-source-id: f1256b2a317ed2b5c8979ef558412494208ab126

Labels: cla signed, Merged, oncall: jit, Reverted