
feat: Added option to use AMP with TF scripts #682

Merged: 1 commit merged into main from tf-amp on Dec 8, 2021

Conversation

fg-mindee (Contributor) commented:

This PR adds support for AMP in the TensorFlow training scripts, following this documentation: https://www.tensorflow.org/guide/mixed_precision.

Checking within the script, all modules and operations are indeed running under the "mixed_float16" policy, but for some reason I haven't noticed any GPU memory savings 🤷‍♂️

Please note that some TF models still have issues with AMP (cf. tensorflow/tensorflow#53345), but this PR enables it for training generally speaking.

Closes #263

Any feedback is welcome!
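
For context, the TensorFlow guide linked above boils down to two steps: set a global "mixed_float16" dtype policy and wrap the optimizer in a loss-scaling wrapper. Below is a minimal, hypothetical sketch of how a training script could expose this behind a CLI flag; the `--amp` flag name and the toy model are illustrative assumptions, not the exact code added in this PR.

```python
# Minimal sketch of AMP enablement in a TF training script, following
# https://www.tensorflow.org/guide/mixed_precision.
# The `--amp` flag and the toy model below are illustrative only.
import argparse

import tensorflow as tf
from tensorflow.keras import mixed_precision

parser = argparse.ArgumentParser()
parser.add_argument("--amp", action="store_true", help="Enable mixed-precision training")
args = parser.parse_args()

if args.amp:
    # Compute in float16 while keeping variables in float32
    mixed_precision.set_global_policy("mixed_float16")

model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(32,)),
    # Keep the final layer in float32 so the loss stays numerically stable
    tf.keras.layers.Dense(10, dtype="float32"),
])

optimizer = tf.keras.optimizers.Adam(1e-3)
if args.amp:
    # Loss scaling prevents float16 gradients from underflowing to zero
    optimizer = mixed_precision.LossScaleOptimizer(optimizer)

model.compile(optimizer=optimizer, loss="sparse_categorical_crossentropy")
```

With a custom training loop, one would additionally call `optimizer.get_scaled_loss(...)` and `optimizer.get_unscaled_gradients(...)` around the gradient tape, as described in the guide.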

@fg-mindee added the "type: enhancement", "ext: references", and "framework: tensorflow" labels on Dec 7, 2021
@fg-mindee added this to the 0.6.0 milestone on Dec 7, 2021
@fg-mindee self-assigned this on Dec 7, 2021

codecov bot commented Dec 7, 2021

Codecov Report

Merging #682 (5db608f) into main (d9f432d) will decrease coverage by 0.04%.
The diff coverage is 85.18%.


@@            Coverage Diff             @@
##             main     #682      +/-   ##
==========================================
- Coverage   96.34%   96.29%   -0.05%     
==========================================
  Files         117      117              
  Lines        4519     4540      +21     
==========================================
+ Hits         4354     4372      +18     
- Misses        165      168       +3     
| Flag | Coverage Δ |
|---|---|
| unittests | 96.29% <85.18%> (-0.05%) ⬇️ |

Flags with carried forward coverage won't be shown.

| Impacted Files | Coverage Δ |
|---|---|
| ...dels/detection/differentiable_binarization/base.py | 90.00% <ø> (+0.55%) ⬆️ |
| doctr/models/detection/linknet/base.py | 87.12% <78.94%> (-1.90%) ⬇️ |
| doctr/models/builder.py | 99.12% <100.00%> (+0.01%) ⬆️ |
| doctr/utils/geometry.py | 98.00% <100.00%> (ø) |

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

@charlesmindee (Collaborator) left a comment:


Thanks!

@fg-mindee merged commit c2af6e9 into main on Dec 8, 2021
@fg-mindee deleted the tf-amp branch on December 8, 2021 at 08:03
@fg-mindee added the "type: new feature" label and removed the "type: enhancement" label on Dec 31, 2021
Labels
ext: references (Related to references folder)
framework: tensorflow (Related to TensorFlow backend)
type: new feature (New feature)
Development

Successfully merging this pull request may close these issues:
[references] Add FP16 support for training (#263)