Add inputs argument to autograd.backward() #46855
Conversation
💊 CI failures summary (Dr. CI): as of commit 30b3516, there are no failures yet.
Force-pushed from 311cd4e to fcf409b.
To appease the backward compatibility test, since you are intentionally changing the signature, you should add it here:
("aten::hash", datetime.date(2020, 11, 15)),
Codecov Report

@@            Coverage Diff             @@
##           master   #46855      +/-   ##
==========================================
- Coverage   68.87%   68.87%    -0.01%
==========================================
  Files         436      436
  Lines       56368    56371        +3
==========================================
+ Hits        38823    38825        +2
- Misses      17545    17546        +1
I think you are missing a check to ensure that all the given inputs are actually leaves when passed to .backward(), no? I think it can go in autograd/autograd.cpp, where you build the output_edges.
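To illustrate the point, a minimal Python sketch, assuming the leaf check lands as requested (.backward() accumulates into .grad, which is only populated on leaf tensors, so non-leaf inputs should be rejected):

import torch

x = torch.randn(3, requires_grad=True)  # leaf tensor
h = x * 2                               # non-leaf: produced by an op
out = (h * h).sum()

out.backward(inputs=[x])    # fine: x is a leaf, its gradient lands in x.grad
# out.backward(inputs=[h])  # expected to raise once the leaf check is in place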
It looks mostly good. Just minor comments.
Looks OK to me; can you fix the lint?
LGTM!
@soulitzer has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
  // The user either called autograd.backward(...) or autograd.grad(...) to get here
- bool backward_api_called = inputs == nullptr;
+ bool backward_api_called = accumulate_grad;
  TORCH_CHECK(!backward_api_called || at::impl::VmapMode::current_vmap_level() == 0,
What does accumulate_grad do?
const variable_list& inputs, | ||
bool keep_graph, | ||
bool create_graph, | ||
bool accumulate_grad, |
Could we add a comment here for what accumulate_grad does, for future code readers?
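A comment along these lines would capture the semantics from the PR description below (a sketch of what such a comment might say, not the text that was actually committed):

// accumulate_grad=true: reached via autograd.backward(); accumulate the computed
// gradients into each input's .grad field.
// accumulate_grad=false: reached via autograd.grad(); capture the gradients and
// return them to the caller instead.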
Creating a separate PR for this
@soulitzer merged this pull request in f5073b0.
Summary: Addressing a comment from a PR that has already been merged: #46855 (comment).
Pull Request resolved: #47266
Reviewed By: agolynski
Differential Revision: D24709017
Pulled By: soulitzer
fbshipit-source-id: 3c104c2fef90ffd75951ecef4ae9e938d4b12d8c
Fixes #46373

As noted in #46373, there needs to be a flag passed into the engine that indicates whether it was executed through the backward API or the grad API. The flag is tentatively named accumulate_grad since, functionally, the backward API accumulates gradients into .grad while the grad API captures the gradients and returns them. Changes not necessary for the Python API (cpp, torchscript) are moved to a new PR.
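A usage sketch of the new argument, based on the behavior this PR describes (edge-case behavior may differ from what finally shipped):

import torch

x = torch.randn(3, requires_grad=True)
y = torch.randn(3, requires_grad=True)
out = (x * y).sum()

# Backward API: accumulate only into x.grad; y is skipped even though it requires grad.
torch.autograd.backward(out, inputs=[x])
print(x.grad)  # d(out)/dx == y
print(y.grad)  # None, since y was not in inputs

# Grad API, for contrast: captures and returns the gradient instead of accumulating.
out2 = (x * y).sum()
(gx,) = torch.autograd.grad(out2, inputs=[x])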