
Which GPUs does RoSA support? #10

Open
Preetika764 opened this issue Jul 5, 2024 · 2 comments

Comments

@Preetika764
Collaborator

I'm trying to run RoSA finetuning on an Nvidia Quadro RTX 6000. The GPU architecture doesn't support bfloat16, so I tried loading the model in 4-bit (similar to the suggestion for the Colab T4 GPU). The finetuning completes, but when loading the model and running inference I get a runtime error: "no kernel image is available for execution on the device". Is there any workaround for this? Full finetuning (FFT) is not an option for me, since I run out of GPU RAM.

@MNikdan
Member

MNikdan commented Jul 5, 2024

Hi.

Can you please try adding the argument --dtype fp32 to the evaluation commands here and here and see whether the issue is resolved?
As you mentioned, this is probably due to your GPU not supporting BF16, and currently the default dtype for loading the model at evaluation time is bf16 (here). You should be able to change this default locally to fp32 to fix the problem.

A more permanent solution would be to read the default dtype from the MODEL_PRECISION argument in the config file.
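The permanent fix described above could be sketched roughly as follows. This is a hypothetical helper, not code from the RoSA repository: `resolve_eval_dtype` and its arguments are illustrative names, where `requested` would come from the MODEL_PRECISION value in the config file and `bf16_supported` from `torch.cuda.is_bf16_supported()` on a real system.

```python
def resolve_eval_dtype(requested: str, bf16_supported: bool) -> str:
    """Map a requested precision to one the current GPU can actually run.

    requested       -- precision string from the config, e.g. "bf16",
                       "fp16", or "fp32" (mirrors the --dtype flag)
    bf16_supported  -- whether the GPU supports bfloat16; in practice
                       this would be torch.cuda.is_bf16_supported()
    """
    if requested == "bf16" and not bf16_supported:
        # Pre-Ampere GPUs (e.g. the Quadro RTX 6000, sm_75) have no
        # native bfloat16 support, so fall back to fp32 for evaluation.
        return "fp32"
    return requested
```

With this in place, a Turing-class GPU would silently evaluate in fp32 while Ampere and newer cards keep the bf16 default.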

@Preetika764
Collaborator Author

I tried modifying the dtype in config.sh, but sadly I run out of memory with fp32. When I set the precision to 4-bit, I am able to finetune the model, but something goes wrong at the end: the adapters aren't saved and I get the runtime error I described above. From what I've found, this error occurs when the model's kernels were built for a GPU architecture that is not compatible with the one running them.
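The architecture mismatch described above can be checked directly: the "no kernel image is available" error means the running GPU's compute capability is missing from the set of architectures the CUDA kernels were compiled for. This is a general diagnostic sketch, not RoSA-specific code; `kernel_image_available` is a hypothetical name, and on a real system its inputs would come from `torch.cuda.get_device_capability()` and `torch.cuda.get_arch_list()`.

```python
def kernel_image_available(device_cap: tuple, compiled_archs: list) -> bool:
    """Check whether compiled CUDA kernels cover the current GPU.

    device_cap     -- (major, minor) compute capability, as returned by
                      torch.cuda.get_device_capability(), e.g. (7, 5)
                      for a Quadro RTX 6000
    compiled_archs -- architectures the binaries were built for, as
                      returned by torch.cuda.get_arch_list(),
                      e.g. ["sm_75", "sm_80"]
    """
    sm = f"sm_{device_cap[0]}{device_cap[1]}"
    return sm in compiled_archs
```

If this returns False for your GPU, the kernels (or a custom CUDA extension) would need to be rebuilt with the matching architecture included, e.g. by setting TORCH_CUDA_ARCH_LIST before compilation.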
