CUDA out of memory #42
There is only a limited performance decrease if you are not reducing it extremely; for inference, you can refer to here.
I reduced both limits from 30 to 10, but I still get CUDA out of memory on a 2080 Ti (12 GB).
Your team has done an excellent job. When I use four NVIDIA RTX 2080 GPUs with batch_size set to the minimum of 4, the run always fails with 'CUDA out of memory'. Are there any parameters in the model that can be reduced to solve this problem? Thank you very much.
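As a general note (not specific to this project's code), PyTorch surfaces CUDA OOM as a `RuntimeError`, so one common workaround is to halve the batch size on failure and retry until a step fits. The sketch below is hypothetical: `run_step` stands in for a forward/backward pass, and `fake_step` only simulates the memory limit so the example runs without a GPU.

```python
def find_max_batch_size(run_step, start=32, floor=1):
    """Halve the batch size until run_step(bs) succeeds.

    PyTorch raises RuntimeError on CUDA OOM, so that is what we catch.
    """
    bs = start
    while bs >= floor:
        try:
            run_step(bs)
            return bs
        except RuntimeError:
            bs //= 2  # retry with half the batch
    raise RuntimeError("even the smallest batch does not fit in memory")

# Simulated training step: pretend anything above 8 samples exceeds VRAM.
def fake_step(bs):
    if bs > 8:
        raise RuntimeError("CUDA out of memory")

print(find_max_batch_size(fake_step, start=32))  # → 8
```

If even the smallest batch fails, the remaining levers are usually model-side: shorter sequence/limit parameters (as suggested above), mixed precision, or gradient checkpointing.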