[references] Add FP16 support for training #263
Labels:
- ext: references (Related to references folder)
- framework: pytorch (Related to PyTorch backend)
- framework: tensorflow (Related to TensorFlow backend)
- help wanted (Extra attention is needed)
The reference training scripts should have an option to switch from FP32 training to FP16 training.
This raises a question: how do we harmonize model loading across both floating-point precisions?
A few suggestions:
The second option would be cleaner and would avoid unnecessarily large checkpoints when they can be kept in FP16.
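As a rough sketch of that idea, the checkpoint could be stored in FP16 on disk and each tensor cast back to the model's own dtype at load time, so the same file serves both FP32 and FP16 training. The function and file names below are illustrative, not part of doctr's actual API:

```python
import torch
import torch.nn as nn


def save_fp16_checkpoint(model: nn.Module, path: str) -> None:
    # Store every floating-point tensor in half precision to halve file size
    state = {
        k: v.half() if v.is_floating_point() else v
        for k, v in model.state_dict().items()
    }
    torch.save(state, path)


def load_checkpoint(model: nn.Module, path: str) -> None:
    # Cast each stored tensor back to the dtype the target model expects
    state = torch.load(path, map_location="cpu")
    target = model.state_dict()
    model.load_state_dict({k: v.to(target[k].dtype) for k, v in state.items()})


model = nn.Linear(4, 2)  # stand-in for a doctr model
save_fp16_checkpoint(model, "model.pt")
load_checkpoint(model, "model.pt")  # weights come back in the model's dtype (FP32 here)
```

An FP16 model would load the same file without any cast, so a single checkpoint format covers both precisions.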
Here is a proposal:
- doctr.datasets (feat: Added FP16 support to doctr.datasets #367)
- doctr.models (feat: Added FP16 support for doctr.models #382)
- doctr.transforms (feat: Added support of fp16 to doctr.transforms #388)
- references with PyTorch (feat: Added support of AMP to all PyTorch training scripts #604)
- references with TensorFlow (feat: Added option to use AMP with TF scripts #682)
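For the PyTorch reference scripts, the AMP option could look roughly like the sketch below: an autocast context for the forward pass plus a gradient scaler to avoid FP16 underflow. The model, data, and the way the flag is wired are placeholders, not the scripts' actual code:

```python
import torch
import torch.nn as nn

# Hypothetical AMP toggle; real scripts would expose something like an --amp flag
use_amp = torch.cuda.is_available()
device = "cuda" if use_amp else "cpu"

model = nn.Linear(8, 2).to(device)  # stand-in for a doctr model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scaler = torch.cuda.amp.GradScaler(enabled=use_amp)


def train_step(x, y):
    optimizer.zero_grad()
    # Forward pass runs in FP16 where safe, FP32 elsewhere (no-op when disabled)
    with torch.autocast(device_type=device, enabled=use_amp):
        loss = nn.functional.cross_entropy(model(x), y)
    # Scale the loss so small FP16 gradients do not underflow, then step
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
    return loss.item()


x = torch.randn(16, 8, device=device)
y = torch.randint(0, 2, (16,), device=device)
loss = train_step(x, y)
```

With `enabled=False`, both the autocast context and the scaler become no-ops, so the same training loop serves FP32 and mixed-precision runs.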