
How to run inference with multiple nodes and multiple GPUs? #51

Closed
vicwer opened this issue Feb 15, 2023 · 4 comments

Comments


vicwer commented Feb 15, 2023

Thank you very much for your work. May I ask how to run multi-node, multi-GPU inference for the BLOOM model with DeepSpeed? I have 4 nodes with 4 V100 GPUs each.

mayank31398 (Collaborator) commented

Hi @vicwer
Currently, this repo doesn't support multi-node inference.
However, you can look at running the 176B model in int8 on 4x V100 GPUs. I am not sure if it will work, but it's worth a try.
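For general reference, DeepSpeed's launcher does take a hostfile for multi-node runs; a minimal sketch is below. The hostnames, script name, and flags are placeholders (and, as the maintainer notes above, this repo's inference scripts may not actually work across nodes), so treat this as a launch-config template rather than a supported recipe.

```shell
# hostfile: one line per node, slots = number of GPUs on that node
# (hostnames here are placeholders for the 4 nodes with 4 V100s each)
cat > hostfile <<'EOF'
node1 slots=4
node2 slots=4
node3 slots=4
node4 slots=4
EOF

# Launch with the DeepSpeed runner across all nodes listed in the hostfile.
# The script name and its flags are hypothetical; substitute the int8
# inference script from your checkout of the repo.
deepspeed --hostfile hostfile \
  your_bloom_inference_script.py --name bigscience/bloom --dtype int8
```

The launcher requires passwordless SSH between nodes and an identical environment (DeepSpeed version, model path) on each host.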


vicwer commented Feb 15, 2023

@mayank31398 OK, thanks!

mayank31398 (Collaborator) commented

closing this :)

TingchenFu commented

Hi @vicwer, did you find a way to run the 176B model on V100s? How many V100s do you use?
