
CUDA BF16 backwards #48809

Closed · wants to merge 1 commit
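For context on what the PR enables: bfloat16 keeps float32's 8-bit exponent but stores only 8 significand bits, so it spans the same numeric range as float32 at roughly three decimal digits of precision. A self-contained sketch of the float32-to-bfloat16 rounding (round-to-nearest-even on the top 16 bits), written for illustration and not taken from the PR:

```python
import struct

def to_bfloat16(x: float) -> float:
    """Round a float to the nearest bfloat16 value, returned as a float.

    bfloat16 is the top 16 bits of an IEEE-754 float32, so we round the
    32-bit pattern to the nearest even multiple of 2**16 and truncate.
    (NaN payloads are not handled specially in this sketch.)
    """
    bits = struct.unpack("<I", struct.pack("<f", x))[0]
    lsb = (bits >> 16) & 1                        # last bit that survives
    rounded = (bits + 0x7FFF + lsb) & 0xFFFF0000  # round-to-nearest-even
    return struct.unpack("<f", struct.pack("<I", rounded))[0]

print(to_bfloat16(3.14159))  # 3.140625: only ~3 significant decimal digits survive
```

This coarse precision is why low-precision backward kernels generally accumulate in float32 internally rather than in bfloat16 itself.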

Conversation

zasdfgbnm (Collaborator)

Looks like there's no test?
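Taking the question above as a cue: a test for this kind of change usually compares the low-precision backward against a float32 reference. A hypothetical sketch of that shape (not the PR's actual test — it uses `torch.sigmoid` on CPU bfloat16 for portability, while the PR itself targets CUDA kernels, and the tolerances here are assumptions):

```python
import torch

def check_bf16_backward(op, x, rtol=1e-2, atol=1e-2):
    """Compare an op's bfloat16 backward against a float32 reference."""
    x32 = x.detach().float().requires_grad_()
    xbf = x.detach().bfloat16().requires_grad_()
    op(x32).sum().backward()
    op(xbf).sum().backward()
    # bf16 carries only ~3 decimal digits, so the comparison must be loose
    torch.testing.assert_close(xbf.grad.float(), x32.grad, rtol=rtol, atol=atol)

check_bf16_backward(torch.sigmoid, torch.randn(64))  # raises if gradients diverge
```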


codecov bot commented Dec 4, 2020

Codecov Report

Merging #48809 (f1cb78b) into master (e7038a7) will decrease coverage by 0.01%.
The diff coverage is n/a.

@@            Coverage Diff             @@
##           master   #48809      +/-   ##
==========================================
- Coverage   80.78%   80.78%   -0.01%     
==========================================
  Files        1865     1865              
  Lines      201099   201099              
==========================================
- Hits       162466   162464       -2     
- Misses      38633    38635       +2     

ailzhang requested a review from ngimel December 7, 2020 19:18
ailzhang added the module: bfloat16 and triaged (this issue has been looked at by a team member, and triaged and prioritized into an appropriate module) labels Dec 7, 2020
facebook-github-bot (Contributor) left a comment

@ngimel has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

mruberry (Collaborator) commented Dec 7, 2020

I think you can update this line, for example:

facebook-github-bot (Contributor)

@ngimel merged this pull request in 5533be5.

zasdfgbnm deleted the bf16-activation branch December 8, 2020 10:34
facebook-github-bot pushed a commit that referenced this pull request Jan 6, 2021
Summary:
Dependency:
#48809 #48807 #48806 #48805 #48801 #44994 #44848

Pull Request resolved: #48810

Reviewed By: mruberry

Differential Revision: D25772955

Pulled By: ngimel

fbshipit-source-id: 353f130eb701f8b338a826d2edaea69e6e644ee9
hwangdeyu pushed a commit to hwangdeyu/pytorch that referenced this pull request Jan 14, 2021
Labels
cla signed · Merged · module: bfloat16 · open source · triaged (this issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
Projects
None yet

Development
Successfully merging this pull request may close these issues: none yet.

6 participants