
CUDA BF16 norm #48806

Closed
wants to merge 2 commits into from

Conversation

zasdfgbnm (Collaborator)

Fixes #{issue number}

dr-ci bot commented Dec 3, 2020

💊 CI failures summary and remediations

As of commit 219512e (more details on the Dr. CI page):

💚 Looks good so far! There are no failures yet. 💚



codecov bot commented Dec 4, 2020

Codecov Report

Merging #48806 (219512e) into master (1eed54d) will decrease coverage by 0.00%.
The diff coverage is n/a.

@@            Coverage Diff             @@
##           master   #48806      +/-   ##
==========================================
- Coverage   80.79%   80.79%   -0.01%     
==========================================
  Files        1865     1865              
  Lines      201099   201099              
==========================================
- Hits       162474   162468       -6     
- Misses      38625    38631       +6     

ngimel (Collaborator) commented Dec 5, 2020

cc @mruberry @kurtamohler for norm testing. From my (admittedly very cursory) look there's no test where norm would be tested for a cross-product of dtypes and p's. I also don't have a good idea of how torch.norm and torch.linalg.norm are tested. With that said, I'm going to land this PR as it does not make things worse, and it adds tests for bfloat16 and p=2.

facebook-github-bot (Contributor) left a comment

@ngimel has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

mruberry (Collaborator) commented Dec 6, 2020

> cc @mruberry @kurtamohler for norm testing. From my (admittedly very cursory) look there's no test where norm would be tested for a cross-product of dtypes and p's. I also don't have a good idea of how torch.norm and torch.linalg.norm are tested. With that said, I'm going to land this PR as it does not make things worse, and it adds tests for bfloat16 and p=2.

Thanks, and thank you for the fyi, @ngimel!

@kurtamohler maybe we need to add another test or extend an existing test to enumerate over the dtype kwarg for torch.linalg.norm?

kurtamohler (Collaborator)

Sure, I can extend the test_norm_dtype test. It might be best to wait for #48284 to land first, since it changes that test a fair bit.
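The cross-product testing idea discussed above can be sketched outside of PyTorch. This is a minimal, hypothetical illustration only: a plain-Python p-norm stands in for torch.norm, a bit-truncation helper simulates the bfloat16 cast, and the names (CASTS, PS, TOLS) are invented for the sketch, so it runs without a GPU or torch installed. The actual test_norm_dtype extension would parametrize torch.linalg.norm over real torch dtypes instead.

```python
import itertools
import math
import struct

def to_bfloat16(x):
    # Simulate a bfloat16 cast: truncate the float32 bit pattern to its
    # top 16 bits (bfloat16 is float32 with the low 16 mantissa bits dropped).
    bits = struct.unpack("<I", struct.pack("<f", x))[0]
    return struct.unpack("<f", struct.pack("<I", bits & 0xFFFF0000))[0]

def p_norm(xs, p):
    # Double-precision reference p-norm.
    if p == math.inf:
        return max(abs(x) for x in xs)
    return sum(abs(x) ** p for x in xs) ** (1.0 / p)

# Cross-product of simulated dtypes and p values, mirroring the
# parametrization suggested for test_norm_dtype.
CASTS = {"float64": lambda x: x, "bfloat16": to_bfloat16}
PS = [1, 2, 3, math.inf]
TOLS = {"float64": 1e-12, "bfloat16": 1e-1}  # bfloat16 keeps ~8 mantissa bits

data = [0.5, -1.25, 2.0, 3.75]
results = {}
for (name, cast), p in itertools.product(CASTS.items(), PS):
    got = p_norm([cast(x) for x in data], p)
    ref = p_norm(data, p)
    # Low-precision results are compared to a float64 reference with a
    # per-dtype tolerance, the same structure a dtype-by-p test would use.
    assert abs(got - ref) <= TOLS[name] * max(ref, 1.0), (name, p)
    results[(name, p)] = got
```

The per-dtype tolerance table is the key design point: a single tolerance either fails bfloat16 spuriously or is too loose to catch float32 regressions.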

zasdfgbnm deleted the bfloat-norm branch December 7, 2020 08:25
facebook-github-bot (Contributor)

@ngimel merged this pull request in a39398b.

facebook-github-bot pushed a commit that referenced this pull request Jan 6, 2021
Summary:
Dependency:
#48809 #48807 #48806 #48805 #48801 #44994 #44848

Pull Request resolved: #48810

Reviewed By: mruberry

Differential Revision: D25772955

Pulled By: ngimel

fbshipit-source-id: 353f130eb701f8b338a826d2edaea69e6e644ee9
hwangdeyu pushed a commit to hwangdeyu/pytorch that referenced this pull request Jan 14, 2021, with the same summary as above (Pull Request resolved: pytorch#48810).
6 participants