Benchmarks - Keep BatchNorm as fp32 for pytorch cnn models cast to fp16 #322
Conversation
/azp run
Azure Pipelines successfully started running 3 pipeline(s).
Codecov Report
@@ Coverage Diff @@
## main #322 +/- ##
==========================================
+ Coverage 88.64% 88.66% +0.01%
==========================================
Files 76 76
Lines 4500 4507 +7
==========================================
+ Hits 3989 3996 +7
Misses 511 511
Flags with carried forward coverage won't be shown.
Continue to review the full report at Codecov.
/azp run
Azure Pipelines successfully started running 3 pipeline(s).
Description
The BatchNorm operator is not numerically stable in fp16. The PyTorch documentation recommends keeping BatchNorm in fp32 for fp16 AMP models; see https://pytorch.org/docs/stable/amp.html#ops-that-can-autocast-to-float32. Preserving BatchNorm in fp32 in SuperBench more accurately reflects real workloads.
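A minimal sketch of this pattern (not necessarily the exact change in this PR, and the helper name `cast_fp16_keep_bn_fp32` is hypothetical): cast the whole model to fp16, then convert the BatchNorm layers back to fp32.

```python
import torch.nn as nn
import torchvision.models as models


def cast_fp16_keep_bn_fp32(model: nn.Module) -> nn.Module:
    """Cast parameters and buffers to fp16, then restore BatchNorm layers to fp32."""
    model = model.half()
    for module in model.modules():
        # _BatchNorm is the shared base class of BatchNorm1d/2d/3d and SyncBatchNorm.
        if isinstance(module, nn.modules.batchnorm._BatchNorm):
            module.float()
    return model


if __name__ == "__main__":
    model = cast_fp16_keep_bn_fp32(models.resnet50())
    print(model.conv1.weight.dtype)  # torch.float16
    print(model.bn1.weight.dtype)    # torch.float32
```

On CUDA, the cuDNN batch-norm kernel accepts fp16 activations with fp32 weights and running statistics, so the mixed layout above runs without extra casts in the forward pass; on CPU the input would generally need to be cast to fp32 first.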