torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 432.00 MiB (GPU 2; 23.65 GiB total capacity; 20.88 GiB already allocated; 259.56 MiB free; #49
Comments
You ran out of GPU memory. Please describe your setup in more detail (what hardware you are using and what command you ran) so we can help resolve this.
It'd be really cool if the minimum requirements of the model (size on disk for the dataset, VRAM requirements) were listed in the README; that would save a lot of people some time.
That's a great idea. I'll put up a PR soon to document this.
(OpenChatKit) root@aca2869c8358:~/OpenChatKit-main# python inference/bot.py
Same problem here. Any idea how much memory the model needs, or any way to reduce its memory use? Thanks.
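As a rough rule of thumb (a heuristic assumption, not something stated in the OpenChatKit docs), inference VRAM is roughly bytes-per-parameter times parameter count, plus some overhead for activations and the CUDA context. For a 20B-parameter model like GPT-NeoXT-Chat-Base-20B, that puts fp32 well beyond a single 24 GiB card, and even fp16 above it, which is consistent with the OOM in the title. A minimal sketch of the arithmetic:

```python
def estimate_inference_vram_gib(num_params: float,
                                bytes_per_param: int = 2,
                                overhead_frac: float = 0.2) -> float:
    """Rough VRAM estimate for inference: model weights plus a
    fractional overhead for activations and CUDA context.
    The 20% overhead is a guess, not a measured figure."""
    weights_bytes = num_params * bytes_per_param
    total_bytes = weights_bytes * (1 + overhead_frac)
    return total_bytes / 2**30  # bytes -> GiB


# 20B parameters, the size of OpenChatKit's chat model:
fp16_gib = estimate_inference_vram_gib(20e9, bytes_per_param=2)
fp32_gib = estimate_inference_vram_gib(20e9, bytes_per_param=4)
print(f"fp16: ~{fp16_gib:.0f} GiB, fp32: ~{fp32_gib:.0f} GiB")
```

By this estimate, even half precision needs roughly twice the 23.65 GiB reported in the traceback, so the practical options are a larger GPU, sharding across multiple GPUs, or CPU/disk offloading; the exact flags to do that depend on the loading code, so check what `inference/bot.py` supports rather than assuming specific options.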