
Sampling with multiple GPUs? #29

Closed
tmlbl opened this issue May 3, 2020 · 3 comments

tmlbl commented May 3, 2020

I am running on a machine at home with two 8 GB GPUs, and the 5b_lyrics model runs out of GPU memory, but it appears to be using only device 0. Is there a way to distribute the sampling across the two physical GPUs?

prafullasd (Collaborator) commented

We don't currently support it. One way to do so would be to keep half of the model's layers on one GPU and half on the other, and launch the process with both GPUs visible. See, for example: https://pytorch.org/tutorials/intermediate/model_parallel_tutorial.html

You might have to insert .to("cuda:0") / .to("cuda:1") calls at the correct places for it to work.
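
A minimal sketch of that layer-splitting approach, following the linked PyTorch tutorial. This is illustrative only: SplitModel and the layers argument are hypothetical stand-ins, and mapping them onto Jukebox's actual prior modules is left to the reader.

import torch.nn as nn

class SplitModel(nn.Module):
    # Hypothetical two-GPU split: the first half of the layers lives on
    # cuda:0, the second half on cuda:1.
    def __init__(self, layers):
        super().__init__()
        half = len(layers) // 2
        self.part0 = nn.Sequential(*layers[:half]).to("cuda:0")
        self.part1 = nn.Sequential(*layers[half:]).to("cuda:1")

    def forward(self, x):
        x = self.part0(x.to("cuda:0"))
        # Copy the intermediate activations across to the second GPU.
        return self.part1(x.to("cuda:1"))

# Usage sketch: eight identical layers split across two GPUs.
model = SplitModel([nn.Linear(64, 64) for _ in range(8)])

Each GPU then only has to hold its half of the weights, at the cost of one activation copy between devices per forward pass.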

zaptrem commented Jul 11, 2020

@tmlbl Did you ever figure out a workable solution?

johndpope (Contributor) commented Sep 8, 2020

@prafullasd - we need this to have a chance of running on two cards and getting 48 GB of VRAM:
https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/rtx-3090/

Related: #142

Could you please reopen this ticket for investigation? We need to sift through the forks to see whether anyone has made any progress.

You can use this shell script (saved as find-forks.sh) to list them:
https://gist.githubusercontent.com/joeytwiddle/abee0610244d283f1ebe2f76bf608086/raw/094ef14c0b83125149698b234e95cb525751f25e/github_get_all_forks.sh

chmod +x find-forks.sh
./find-forks.sh -do

Seems like some do exist:
[screenshot of fork search results, Sep 8 2020]

There is actually an official branch; this should be mentioned in the README:
https://github.com/openai/jukebox/tree/multi_gpu_sampling

So did you get anywhere with this? I don't have multiple GPUs to test with.

c6617dd

@heewooj - do you know whether multiple GPUs can pool their VRAM? If I buy 2x NVIDIA RTX 3090 cards (24 GB each), will that give 48 GB and allow fine-tuning the 5-billion-parameter model?
