Share GPU assumptions #41
Anything goes, so long as you have 70 GB of VRAM... lol. I found this fork super useful: https://github.com/rupeshs/visual-chatgpt/tree/add-colab-support. It removes many of the models that would otherwise just give you an OOM, so at least you can use some of them while chatting with ChatGPT and having it create images for you. It's good for toying around with this proof-of-concept, but to enjoy the whole thing you either need to pay for some deluxe cloud compute or live in a small server room. Good thing they said an API is coming "in a few days". With the GPT-4 rumors turning into "next week", I guess I'll settle for playing with what fits in my VRAM and then try to get my hands on the API. :-)
I did get this running in the end, on 8x NVIDIA A100 40 GB, but various bugs prevented it from fully working (for one, the masking for inpainting wasn't working; not sure whether the fork uses or fixes that model?).
Not quite. I tried running it on 8x NVIDIA Tesla V100 16 GB, but got an OOM on one of the cards when trying to generate an image. In other words, each card needs to be big enough to run the models allocated to it. ...Looking forward to the multimodal APIs coming soon, as you say.
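The per-card point above can be sketched as a quick budget check: what matters is not total VRAM across cards but whether each model fits on the single card it is assigned to. The model names, footprints, and device layout below are illustrative assumptions for the sketch, not the project's actual defaults:

```python
# Hedged sketch: the model names, per-model footprints, and device
# assignments here are assumed for illustration only.

# Approximate fp16 VRAM footprint per model, in GB (assumed figures).
MODEL_VRAM_GB = {
    "Text2Image": 8.0,
    "ImageCaptioning": 4.0,
    "Inpainting": 8.0,
}

# Which card each model is pinned to (assumed layout).
ASSIGNMENT = {
    "Text2Image": "cuda:0",
    "ImageCaptioning": "cuda:1",
    "Inpainting": "cuda:1",
}

def per_device_usage(vram_gb, assignment):
    """Sum the footprints of the models assigned to each device."""
    usage = {}
    for model, device in assignment.items():
        usage[device] = usage.get(device, 0.0) + vram_gb[model]
    return usage

def overloaded_devices(usage, capacity_gb):
    """Return the devices whose assigned models exceed per-card capacity."""
    return [dev for dev, gb in usage.items() if gb > capacity_gb]

usage = per_device_usage(MODEL_VRAM_GB, ASSIGNMENT)
print(usage)                          # {'cuda:0': 8.0, 'cuda:1': 12.0}
print(overloaded_devices(usage, 16.0))  # [] -- fits on 16 GB V100s
print(overloaded_devices(usage, 8.0))   # ['cuda:1'] -- too big for an 8 GB card
```

This is why a cluster with plenty of aggregate VRAM can still OOM: one card with an oversized model allocation fails even while the others sit half empty.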
Looks like this info is now in the readme.
Hey, it looks like this assumes that there are 8 GPUs available. Are you able to provide a bit more info about that? (I.e. which GPUs do you run this on, and which do you recommend?)
(Maybe worth adding some info on this in the readme?)