When using AutoModelForCausalLM, THUDM/cogagent-vqa-hf and load_in_8bit I get this error: "self and mat2 must have the same dtype, but got Half and Char"
#28856 · Closed · 2 of 4 tasks
FurkanGozukara opened this issue on Feb 4, 2024 · 4 comments
Hi @FurkanGozukara
This issue is a duplicate of TimDettmers/bitsandbytes#1029 - can you share the full traceback of the error so that I can fix the issue on the Hub?
My gut feeling is that the model is not compatible with bnb 8-bit; the model code authors will need to make a slight change to make it work.
You can also post the same issue on the model repo: https://huggingface.co/THUDM/cogagent-vqa-hf/discussions with the full traceback of the issue
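For context, the error message itself comes straight from PyTorch: a matmul between float16 ("Half") activations and raw int8 ("Char") weights is rejected, which is what happens if custom modeling code multiplies against quantized weight tensors directly instead of going through bitsandbytes' dequantizing linear layer. A minimal sketch of the underlying failure in plain PyTorch (not the CogAgent code itself; the exact wording of the message may vary by PyTorch version):

```python
import torch

# float16 ("Half") activations, as produced at inference time
x = torch.randn(1, 4, dtype=torch.float16)
# raw int8 ("Char") weights, as stored after 8-bit quantization
w = torch.randint(-128, 127, (4, 4), dtype=torch.int8)

msg = ""
try:
    torch.matmul(x, w)  # mixed-dtype matmul is not allowed
except RuntimeError as e:
    msg = str(e)

print(msg)  # e.g. "self and mat2 must have the same dtype, but got Half and Char"
```

This is why 8-bit loading needs every affected matmul routed through `bnb.nn.Linear8bitLt` (or an equivalent dequantization step) rather than hitting the int8 tensor directly.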
System Info
Who can help?
@ArthurZucker
@amyeroberts
@pacman100
@SunMarc
@younesbelkada
Information

Tasks
- An officially supported task in the examples folder (such as GLUE/SQuAD, ...)

Reproduction
Here is the full code and pip freeze.

The error:

self and mat2 must have the same dtype, but got Half and Char

There are no visible errors in the CMD window; this error is returned as the response. The same code works when loading in 4-bit.
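The reproduction presumably boils down to an 8-bit load of the model. A hypothetical minimal sketch, assuming transformers with bitsandbytes installed (the helper name and the `device_map` argument are illustrative; CogAgent does require `trust_remote_code=True`):

```python
def load_cogagent_8bit(model_id: str = "THUDM/cogagent-vqa-hf"):
    """Sketch of the failing load: 8-bit quantization of CogAgent
    raises the reported dtype-mismatch error at inference time."""
    # requires transformers + accelerate + bitsandbytes
    from transformers import AutoModelForCausalLM

    return AutoModelForCausalLM.from_pretrained(
        model_id,
        load_in_8bit=True,       # fails; load_in_4bit=True works, per the report
        trust_remote_code=True,  # CogAgent ships custom modeling code on the Hub
        device_map="auto",
    )
```

The error surfacing only in the response (not the console) is consistent with it being raised inside the model's custom forward pass and caught by the calling application.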
Expected behavior
no error