
Update for PyTorch v1.5 #172

Merged
Merged 16 commits on Apr 21, 2020

Conversation

LaurentMazare (Owner)

This updates tch-rs to use PyTorch 1.5. As of 2020-04-10, PyTorch 1.5 has not been officially released (only some release candidates have), so this PR will be merged only once the release has happened.

@danieldk (Contributor)

Cool! 👍

It seems that the autocasting functions for automatic mixed precision are not exposed?

https://github.com/pytorch/pytorch/blob/925cdd57dc14415c86f79151ad86e392a5b9a4b9/aten/src/ATen/autocast_mode.h

@LaurentMazare (Owner, Author)

Actually I'm not sure this is part of the PyTorch v1.5 release: I can't see the file you're pointing at in the release/1.5 branch, and I wasn't able to find it anywhere else in the tree.

@danieldk (Contributor) commented Apr 17, 2020

> Actually I'm not sure this is part of the pytorch v1.5 release: I can't see the file you're pointing at in the release/1.5 branch and I wasn't able to find it somewhere else in the tree.

Sorry for the noise! I was under the impression that it was in 1.5, since it was merged before the branch-off, but it seems that it was backed out for minor changes and then merged into master again after the branch point. So it probably won't make it into 1.5.0 :(. Guess we'll have to wait a bit longer (a colleague is training the same transformers with PyTorch + apex, and it's approximately twice as fast on some of our GPUs).

pytorch/pytorch#35009 (comment)

@LaurentMazare (Owner, Author)

No worries, feel free to remind me to include this once we reach 1.6 if I forget to do so.

@gchanan commented Apr 17, 2020

Yes, sorry for the noise, but full AMP support didn't make 1.5. There are some building blocks there, but we are expecting full support in 1.6.

@danieldk (Contributor)

> Yes, sorry for the noise, but full AMP support didn't make 1.5. There are some building blocks there, but we are expecting full support in 1.6.

Awesome 👍 I'll try a nightly in the meantime!
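For readers landing here later: the AMP building blocks discussed above shipped as `torch.cuda.amp` in PyTorch 1.6 (`autocast` + `GradScaler`), which is what the nightlies expose. A minimal sketch of the intended training-loop pattern, assuming that API; `train_step` and its parameters are hypothetical names for illustration, and with `use_amp=False` the loop degenerates to plain fp32 so it also runs without a CUDA device:

```python
import torch

def train_step(model, optimizer, scaler, inputs, targets, use_amp=False):
    optimizer.zero_grad()
    # autocast runs the forward pass in mixed precision when enabled
    with torch.cuda.amp.autocast(enabled=use_amp):
        loss = torch.nn.functional.mse_loss(model(inputs), targets)
    # GradScaler scales the loss to avoid fp16 gradient underflow;
    # with enabled=False, scale/step/update reduce to a plain fp32 step
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
    return loss.item()

model = torch.nn.Linear(4, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
scaler = torch.cuda.amp.GradScaler(enabled=False)  # enable on a CUDA device
```

None of this was exposed through tch-rs at the time of this PR; the sketch only shows the upstream Python API the bindings would eventually wrap.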


4 participants