
Remove autograd copy_ specific isFloatingPoint #28279

Status: Closed
t-vi wants to merge 5 commits from t-vi:isFloatingPoint

Conversation

t-vi (Collaborator) commented Oct 18, 2019

Remove autograd copy_ specific isFloatingPoint and use
c10's isFloatingType (and isComplexType).
Before this, .to or .copy_ would drop requires_grad for bfloat16
as the floating types were only considered to be double, float,
and half.
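
For context, a minimal Python sketch of the behavior this patch fixes (the shapes are illustrative, not taken from the PR):

```python
import torch

# A float32 leaf tensor tracked by autograd.
x = torch.randn(3, requires_grad=True)

# Before this patch, autograd's private isFloatingPoint only knew
# double, float, and half, so converting to bfloat16 silently
# produced a tensor with requires_grad=False.
y = x.to(torch.bfloat16)
print(y.requires_grad)  # False before the fix, True after

# The same applied to copy_: copying a grad-requiring tensor into a
# bfloat16 destination dropped requires_grad on the result.
z = torch.empty(3, dtype=torch.bfloat16)
z.copy_(x)
print(z.requires_grad)  # False before the fix, True after
```

With the change, the dtype check defers to c10's isFloatingType (and isComplexType), so bfloat16 is treated like the other floating dtypes for requires_grad propagation.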

t-vi requested a review from apaszke as a code owner Oct 18, 2019
(Two review threads on test/test_autograd.py, since resolved and outdated.)
t-vi (Collaborator, Author) commented Oct 19, 2019

From what I can see, the two CI failures are not caused by this patch, so I am reasonably happy with it. Thank you for all the feedback.

t-vi (Collaborator, Author) commented Oct 23, 2019

I don't want to be pushy, but given that all the discussion happened after the approval: is this OK to merge?
I believe it is a strict improvement even if we keep the status quo for complex dtypes and revisit them in the future.

SsnL (Collaborator) commented Oct 28, 2019

I am all for merging this! If there was any impression that I was blocking, I am sorry!

facebook-github-bot (Contributor) left a comment:

@izdeby has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

izdeby (Contributor) commented Oct 28, 2019

@t-vi, thanks for fixing this! I will merge it.

izdeby (Contributor) commented Oct 29, 2019

@pytorchbot rebase this please

t-vi added 5 commits Oct 18, 2019
Remove autograd copy_ specific isFloatingPoint: use c10's isFloatingType (and isComplexType). Before this, .to or .copy_ would drop requires_grad for bfloat16, as the floating types were only considered to be double, float, and half.
izdeby force-pushed the t-vi:isFloatingPoint branch from 2cf704c to 712733f Oct 29, 2019
facebook-github-bot (Contributor) left a comment:

@izdeby has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

facebook-github-bot (Contributor) commented Oct 29, 2019

@izdeby merged this pull request in e57a119.

v0dro added a commit to v0dro/pytorch that referenced this pull request Nov 25, 2019
Summary:
Remove autograd copy_ specific isFloatingPoint and use c10's isFloatingType (and isComplexType). Before this, .to or .copy_ would drop requires_grad for bfloat16 as the floating types were only considered to be double, float, and half.
Pull Request resolved: pytorch#28279

Differential Revision: D18176084

Pulled By: izdeby

fbshipit-source-id: 8a005a6105e4a827be5c8163135e693a7daae4f4