Error: DriverError(CUDA_ERROR_OUT_OF_MEMORY, "out of memory") with multiple GPU #2046
I have 4x RTX 3080, i.e. 40 GB of GPU memory in total (10 GB per card).
I am trying to load a Mistral 7B model, roughly a 15 GB file, but I get this error:
Error: DriverError(CUDA_ERROR_OUT_OF_MEMORY, "out of memory")
Is it possible to run it in multi-GPU mode?

Comments

Definitely possible, but it won't just work with the standard examples. Please take a look at this example for how to run across multiple GPUs.
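For context, here is a minimal sketch of the general idea, assuming the project in question is candle (the candle_core API); it is not the linked example itself, and the shapes and variable names are made up for illustration. The point is that each GPU gets its own Device, part of the weights lives on each device, and activations are moved between devices during the forward pass.

```rust
use candle_core::{Device, Result, Tensor};

fn main() -> Result<()> {
    // One Device handle per GPU.
    let gpu0 = Device::new_cuda(0)?;
    let gpu1 = Device::new_cuda(1)?;

    // Stand-ins for the weights of two halves of a model, each allocated
    // on its own GPU so neither card has to hold everything.
    let w_first_half = Tensor::randn(0f32, 1.0, (4096, 4096), &gpu0)?;
    let w_second_half = Tensor::randn(0f32, 1.0, (4096, 4096), &gpu1)?;

    // Toy forward pass: run the first half on GPU 0, move the activations
    // over to GPU 1, then run the second half there.
    let x = Tensor::randn(0f32, 1.0, (1, 4096), &gpu0)?;
    let h = x.matmul(&w_first_half)?;   // computed on GPU 0
    let h = h.to_device(&gpu1)?;        // transfer activations across GPUs
    let y = h.matmul(&w_second_half)?;  // computed on GPU 1
    println!("output shape: {:?}", y.shape());
    Ok(())
}
```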
Unfortunately it didn't help. As I understand it, these errors can disappear if changes are made to the following files: model.rs and main.rs, right?
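A rough sketch of the kind of change being hinted at for those files, assuming a simple pipeline-style split in candle: pick a Device per layer so the ~15 GB of Mistral 7B weights end up spread across the four 10 GB cards instead of being loaded onto a single one. The helper device_for_layer and the hard-coded layer count are hypothetical, not code from the actual example.

```rust
use candle_core::{Device, Result};

// Hypothetical helper: map a transformer layer index to one of the available
// GPUs so the weights are spread roughly evenly across cards.
fn device_for_layer(layer_idx: usize, n_layers: usize, devices: &[Device]) -> Device {
    let layers_per_gpu = n_layers.div_ceil(devices.len());
    let gpu_idx = (layer_idx / layers_per_gpu).min(devices.len() - 1);
    devices[gpu_idx].clone()
}

fn main() -> Result<()> {
    // One Device per visible GPU (4x RTX 3080 in this issue).
    let devices = (0..4).map(Device::new_cuda).collect::<Result<Vec<_>>>()?;

    // In the model-building code (model.rs / main.rs), each layer's weights
    // would then be loaded onto device_for_layer(i, ...) rather than a single
    // device, with activations moved between devices in the forward pass.
    let n_layers = 32; // Mistral 7B has 32 transformer blocks
    for i in [0, 10, 20, 31] {
        println!("layer {i} -> {:?}", device_for_layer(i, n_layers, &devices));
    }
    Ok(())
}
```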