FP16 support for GPU tensors in all frameworks (#529)
* Initial support for FP16
* Bump version to a dev release
* Cast vars to fp16 before allreduce to compress gradients
* Abstracted compression algorithm into a class hierarchy and added algorithm flag to optimizer and allreduce signatures
* Changed compressor to set the dtype on initialization
* Resolved conflicts
* Additional conflicts
* Formatting
* More formats
* Updated license
* Added fp16 compression for Keras
* Added arguments to keras examples
* Fixed imports
* Added compression to tf.keras
* Added PyTorch compression API
* Added unit tests
* Whitespace
* Added C interfaces and types
* Forward declare
* Removed Half from older versions of PyTorch
* Added error for old version of PyTorch
* Removed reference to float16
* Updated examples, added compression to the Keras model load
* Cleaned imports
* Removed dependency on enums
* Updated unit tests
* Test compatibility fix
* Reverted version updates
* Fixed message
* Removed imports
* Added cuda.HalfTensor to all PyTorch tests with CUDA
* Only compare versions once
* Renamed --fp16 in examples to --fp16-allreduce for clarity
* Replaced assignment with set_
* Modified compression algorithms to be stateless with optional context parameters
* Removed optional ctx parameter
* Replaced 0.4.2 with 1.0.0
* Only run GPU tests with HalfTensors if fp16 is supported
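The core idea above — a stateless compressor class hierarchy whose compress/decompress hooks wrap the allreduce to halve gradient bytes on the wire — can be sketched in plain Python. The class and method names here are illustrative only (the commit's real implementation operates on framework tensors, not Python lists); the `struct` format `'e'` stands in for the float32→float16 cast.

```python
import struct

class Compressor:
    """Base of the compressor class hierarchy (sketch, names illustrative).

    Stateless: compress() returns the payload plus a context object that
    decompress() needs to restore the original values, mirroring the
    'stateless with optional context parameters' design in the commit log.
    """
    @staticmethod
    def compress(values):
        # The no-op compressor: pass gradients through unchanged.
        return values, None

    @staticmethod
    def decompress(payload, ctx):
        return payload

class FP16Compressor(Compressor):
    """Casts float32 gradients to IEEE 754 half precision before
    allreduce (2 bytes/value instead of 4) and back afterwards."""
    @staticmethod
    def compress(values):
        # struct format 'e' packs each value as a 16-bit half float.
        payload = struct.pack('%de' % len(values), *values)
        return payload, len(values)

    @staticmethod
    def decompress(payload, ctx):
        # ctx carries the element count needed to unpack the buffer.
        return list(struct.unpack('%de' % ctx, payload))

# Round-trip some gradient-like values through the fp16 cast.
grads = [0.1, -2.5, 3.14159]
payload, ctx = FP16Compressor.compress(grads)   # 6 bytes, not 12
restored = FP16Compressor.decompress(payload, ctx)
```

In Horovod itself this surfaces as an optional compression argument on the distributed optimizer (selected in the updated examples via the `--fp16-allreduce` flag), so users opt in per-optimizer rather than globally. Note the round trip is lossy: values come back at half precision, which is the accepted trade for halved allreduce traffic.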
Showing 22 changed files with 438 additions and 71 deletions.