Strange CUDA out of memory issue #13
Comments
If it worked previously, then it is indeed strange. Can you try downgrading transformers to 4.28 and/or torch to 2.0?
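For reference, such a downgrade can be done with pip, e.g. `pip install transformers==4.28.0 torch==2.0.0` (the exact patch versions are assumptions; any 4.28.x / 2.0.x build should match what is suggested above).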
Thank you for your response. I have tested this, and it is not a version issue, as I successfully ran the 7B model directly. Here is the comparison between the new versions (transformers 4.41.2 and torch 2.3.1) and the old versions (transformers 4.28 and torch 2.0):

[screenshots: new-version run vs. old-version run]

The 13B model is about 40GB, so on anything smaller than an A100, most of the model has to stay on the CPU and be loaded onto the GPU in chunks. I suspect this step might be where the problem occurs. Previously, I was able to run the 13B model on a 16GB GPU. Could that have just been luck?
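The chunked loading described above is typically handled by Hugging Face accelerate's device-map mechanism. A minimal sketch, assuming a transformers-compatible checkpoint; the model ID, dtype, and offload folder are illustrative, not AutomaTikZ's exact loading code:

```python
# Sketch: loading a large checkpoint with CPU offload via Hugging Face
# transformers/accelerate. The checkpoint name and offload folder are
# assumptions; AutomaTikZ's own loader may differ.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "nllg/tikz-clima-13b",       # assumed checkpoint name
    torch_dtype=torch.float16,   # half precision: ~2 bytes per parameter
    device_map="auto",           # accelerate fills the GPU, the rest stays on CPU
    offload_folder="offload",    # spill remaining weights to disk if needed
)
```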
Did you maybe load the model with device_map="auto"?
Thank you for your reply. Could you explain how the device_map="auto" setting should be modified?
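One common form such a modification takes in transformers is capping per-device memory so accelerate offloads the remainder; a sketch with assumed limits and an illustrative checkpoint name:

```python
# Sketch: constraining device_map="auto" with explicit memory caps
# (assumed limits; tune them to the actual GPU and CPU RAM available).
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "nllg/tikz-clima-13b",                    # assumed checkpoint name
    device_map="auto",
    max_memory={0: "14GiB", "cpu": "48GiB"},  # cap GPU 0; offload the rest
)
```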
Strange CUDA out of memory issue; my previous program used to run, but now it fails both locally and on Colab.
Code:

```
# Local install:
pip install 'automatikz[pdf] @ git+https://github.com/potamides/AutomaTikZ'

# Colab install:
!git clone https://github.com/potamides/AutomaTikZ
!pip install -e AutomaTikZ[webui]
```

```python
from automatikz.infer import TikzGenerator, load

generate = TikzGenerator(*load("nllg/tikz-clima-7b"), stream=True)
```
```
OutOfMemoryError: CUDA out of memory. Tried to allocate 32.00 MiB. GPU
```
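When this error appears, the memory state at the point of failure can be inspected with standard PyTorch calls; a minimal sketch using only built-in torch APIs:

```python
# Sketch: inspecting CUDA memory with standard PyTorch APIs.
import torch

if torch.cuda.is_available():
    free, total = torch.cuda.mem_get_info()  # bytes free / total on the current device
    print(f"free: {free / 2**30:.2f} GiB of {total / 2**30:.2f} GiB")
    print(torch.cuda.memory_summary())       # detailed allocator statistics
```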