
Support torch.bfloat16 in hivemind.compression #524

Merged 7 commits into master on Nov 28, 2022

Conversation

borzunov
Member

@borzunov borzunov commented Nov 28, 2022

This PR implements bfloat16 support for CompressionType.NONE and CompressionType.BLOCKWISE_8BIT.

This is important for the Petals client; see bigscience-workshop/petals#79.
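Background on why `torch.bfloat16` needs special handling here: NumPy, which is commonly used when serializing tensors, has no native bfloat16 dtype, so bfloat16 data cannot simply be round-tripped through `tensor.numpy()`. A minimal sketch of the bit-level relationship (not the PR's actual code, and using truncation rather than round-to-nearest): bfloat16 is just the top 16 bits of the IEEE-754 float32 layout, so the raw bits can be carried in a `uint16` buffer.

```python
import numpy as np

def float32_to_bfloat16_bits(x: np.ndarray) -> np.ndarray:
    # bfloat16 keeps the sign, the full 8-bit exponent, and the top
    # 7 mantissa bits of float32 - i.e. the upper 16 bits of the word.
    # (Truncation here; real converters round to nearest even.)
    return (x.astype(np.float32).view(np.uint32) >> 16).astype(np.uint16)

def bfloat16_bits_to_float32(bits: np.ndarray) -> np.ndarray:
    # Restore a float32 by placing the 16 stored bits back on top
    # and zero-filling the 16 low mantissa bits.
    return (bits.astype(np.uint32) << 16).view(np.float32)

# Values exactly representable in bfloat16 survive the round trip unchanged.
vals = np.array([1.5, -2.0, 3.140625], dtype=np.float32)
restored = bfloat16_bits_to_float32(float32_to_bfloat16_bits(vals))
```

This is why plain `CompressionType.NONE` serialization had to learn about bfloat16 at all: the payload can be shipped as raw 16-bit words, but the dtype must be recorded so the receiver reconstructs a bfloat16 tensor rather than misreading the bytes.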

@borzunov changed the title from "Fix CompressionType.NONE processing torch.bfloat16" to "Support torch.bfloat16 in hivemind.compression" on Nov 28, 2022
@codecov
codecov bot commented Nov 28, 2022

Codecov Report

Merging #524 (d0a76a5) into master (8d51b97) will increase coverage by 0.12%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master     #524      +/-   ##
==========================================
+ Coverage   75.89%   76.01%   +0.12%     
==========================================
  Files          81       81              
  Lines        7947     7960      +13     
==========================================
+ Hits         6031     6051      +20     
+ Misses       1916     1909       -7     
Impacted Files Coverage Δ
hivemind/compression/base.py 93.22% <100.00%> (+0.91%) ⬆️
hivemind/compression/quantization.py 94.87% <100.00%> (+0.27%) ⬆️
hivemind/dht/dht.py 61.67% <0.00%> (-1.20%) ⬇️
hivemind/moe/server/server.py 44.26% <0.00%> (+0.54%) ⬆️
hivemind/dht/routing.py 94.11% <0.00%> (+0.58%) ⬆️
hivemind/dht/node.py 90.99% <0.00%> (+0.71%) ⬆️
hivemind/moe/server/runtime.py 70.00% <0.00%> (+0.83%) ⬆️
hivemind/dht/protocol.py 93.15% <0.00%> (+0.91%) ⬆️
hivemind/moe/server/connection_handler.py 47.91% <0.00%> (+1.04%) ⬆️

@borzunov borzunov merged commit 1e4af43 into master Nov 28, 2022
@borzunov borzunov deleted the fix-compressing-bfloat16 branch November 28, 2022 06:45
mryab pushed a commit that referenced this pull request Nov 29, 2022
This PR implements bfloat16 support for `CompressionType.NONE` and `CompressionType.BLOCKWISE_8BIT`.

This is important for the Petals client, see bigscience-workshop/petals#79

(cherry picked from commit 1e4af43)
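For context on the second code path touched by this PR: blockwise 8-bit quantization splits a tensor into fixed-size blocks and quantizes each block with its own scale, so one outlier only degrades its own block. The sketch below is a simplified absmax-style illustration, not hivemind's actual `BLOCKWISE_8BIT` implementation (which uses a codebook-based dynamic quantization scheme); the function names and the block size are made up for the example. A bfloat16 input would first be upcast to float32, as NumPy cannot hold bfloat16 directly.

```python
import numpy as np

def blockwise_quantize(x: np.ndarray, block_size: int = 4096):
    # Upcast (e.g. from a bfloat16 tensor's float32 copy), flatten,
    # and pad so the data divides evenly into blocks.
    flat = x.astype(np.float32).ravel()
    pad = (-flat.size) % block_size
    blocks = np.pad(flat, (0, pad)).reshape(-1, block_size)
    # One absmax scale per block; guard against all-zero blocks.
    scales = np.abs(blocks).max(axis=1, keepdims=True)
    scales[scales == 0] = 1.0
    quantized = np.round(blocks / scales * 127).astype(np.int8)
    return quantized, scales

def blockwise_dequantize(quantized, scales, shape):
    # Invert the per-block scaling and drop the padding.
    blocks = quantized.astype(np.float32) / 127 * scales
    return blocks.ravel()[: int(np.prod(shape))].reshape(shape)
```

The per-block scales are what make the scheme robust: the reconstruction error within a block is bounded by roughly half a quantization step, `scale / 254`, regardless of the magnitudes in other blocks.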