Hi @vicwer
Currently, this repo doesn't support multi-node inference.
But you could try running the 176B model in int8 on 4x V100 GPUs on a single node. I am not sure whether it will work, but it's worth a try.
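For context on the int8 suggestion: int8 inference stores each weight as an 8-bit integer plus a scale factor, roughly halving memory versus fp16, which is what makes a 176B model plausible on 4x 32GB V100s. Below is a minimal NumPy sketch of per-row absmax quantization to illustrate the arithmetic — this is not the repo's actual int8 kernel (real deployments use specialized CUDA kernels), just the idea:

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Quantize each row of a weight matrix to int8 with a per-row scale.

    Per-row (absmax) scaling keeps the quantization error proportional to
    each row's magnitude rather than the whole matrix's.
    """
    scale = np.abs(w).max(axis=1, keepdims=True) / 127.0  # map max |w| -> 127
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: np.ndarray) -> np.ndarray:
    """Recover an approximate float32 weight matrix from int8 + scales."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 8)).astype(np.float32)

q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

# int8 storage is 1 byte/weight vs 2 for fp16; error stays small.
print(q.dtype, float(np.abs(w - w_hat).max()))
```

The per-row scales are what keep the rounding error bounded by half a quantization step, so the dequantized weights stay close to the originals.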
Thank you very much for your work. May I ask how to run BLOOM model inference with DeepSpeed across multiple nodes and multiple GPUs? I have 4 nodes with 4 V100 GPUs each.