Conversation

@rgommers rgommers commented Aug 21, 2020

  • Add torch._C bindings from torch/csrc/autograd/init.cpp
  • Renamed torch._C.set_grad_enabled to torch._C._set_grad_enabled
    so it doesn't conflict with torch.set_grad_enabled anymore

This is a continuation of gh-38201. All I did was resolve merge conflicts and finish the annotation of _DecoratorContextManager.__call__ that @ezyang started in the first commit.

Reverts commit b5cd3a8, which was only motivated by typing_extensions not being available. (Update: the JIT can't be made to understand Literal[False], so that commit is kept as is.)

ezyang and others added 2 commits August 21, 2020 19:42
- Add torch._C bindings from torch/csrc/autograd/init.cpp
- Renamed torch._C.set_grad_enabled to torch._C._set_grad_enabled
  so it doesn't conflict with torch.set_grad_enabled anymore

Signed-off-by: Edward Z. Yang <ezyang@fb.com>

(cherry picked from commit 551a222)
@rgommers rgommers added the module: typing Related to mypy type annotations label Aug 21, 2020
@rgommers rgommers requested review from ezyang and malfet and removed request for apaszke August 21, 2020 18:47
@albanD (Collaborator) commented Aug 21, 2020

Note that this is most likely going to conflict with #41371
cc @SplitInfinity

dr-ci bot commented Aug 21, 2020

💊 CI failures summary and remediations

As of commit 1ce96e3 (more details on the Dr. CI page):

  • 3/3 failures possibly introduced in this PR
    • 3/3 non-CircleCI failure(s)

Extra GitHub checks: 2 failed
  • codecov.io: 1 failed


@rgommers (Collaborator, Author) commented:
Thanks @albanD. From a quick look:

  • Using torch._C._set_grad_enabled (with the added underscore) from this PR is probably preferable to using torch.set_grad_enabled
  • The `def __exit__(self, exc_type: Any, ...` change in [JIT] Add JIT support for torch.no_grad (#41371) looks like an improvement

Conflict looks minor, happy to wait and make changes here as needed after gh-41371 lands.

@ngimel ngimel added the triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module label Aug 24, 2020
@facebook-github-bot (Contributor) commented:
@malfet has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@SplitInfinity

I am waiting for a final review of my JIT-side changes on #41371 before I can land it.

@SplitInfinity

#41371 has landed. Feel free to rebase and merge.

We can now rely on Literal from typing_extensions, but there's no
good way of making the JIT understand a `Literal[False]` return
value here (even when hiding the return type inside `if TYPE_CHECKING`).

So don't revert commit b5cd3a8 after all.
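The Literal[False] pattern the commit message refers to can be sketched as follows (illustrative names, not the actual torch source): annotating `__exit__` as returning `Literal[False]` tells mypy the context manager never suppresses exceptions, but the TorchScript JIT cannot parse that annotation, which is why the simpler form is kept. On Python 3.8+ `Literal` is available from `typing`; code of this era would have imported it from `typing_extensions`.

```python
from typing import Any, Literal  # use typing_extensions on Python < 3.8

class NoGradSketch:
    """Illustrative context manager, not the actual torch source."""

    def __enter__(self) -> None:
        pass

    def __exit__(self, exc_type: Any, exc_value: Any,
                 traceback: Any) -> Literal[False]:
        # Returning Literal[False] (rather than bool or None) tells type
        # checkers that exceptions always propagate out of the with block.
        return False
```

A falsy `__exit__` return means Python re-raises any exception that occurred inside the block, so the narrow annotation documents that behavior precisely.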
@facebook-github-bot (Contributor) commented:
@malfet has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@pytorch pytorch deleted a comment from codecov bot Aug 28, 2020
@rgommers rgommers deleted the move-autograd-stub branch August 31, 2020 23:17
@facebook-github-bot (Contributor) commented:

@malfet merged this pull request in 4c19a1e.


Labels: Merged, module: typing (Related to mypy type annotations), open source, triaged

9 participants