Hello there @watashiwa-toki!
Found your issue tangentially while digging into this project. :)
What I think is happening here is that AutoModelWithLMHead doesn't support this config.
You can load this model using this code:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("decapoda-research/llama-13b-hf")
model = AutoModelForCausalLM.from_pretrained("decapoda-research/llama-13b-hf")
```
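If that loads, a quick generation call like the sketch below should confirm the model is actually usable (the prompt and the `max_new_tokens` value are just illustrative placeholders):

```python
# Minimal smoke test: tokenize a prompt, generate a short continuation, decode it
inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```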
You can check it here, in the UI under "</> Use in Transformers".
I quickly checked it and there seems to be an error in the tokenizer class name: it is set to LLamaTokenizer, but it should be LlamaTokenizer. See the linked issue.
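If AutoTokenizer trips over that misspelled class name, one possible workaround (just a sketch, assuming the rest of the repo files are intact) is to bypass the auto class and load the tokenizer explicitly:

```python
from transformers import LlamaTokenizer

# Load the LLaMA tokenizer class directly instead of relying on AutoTokenizer,
# which reads the (misspelled) tokenizer_class field from the repo's config
tokenizer = LlamaTokenizer.from_pretrained("decapoda-research/llama-13b-hf")
```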
Please note that all of this falls under the Hugging Face Hub model owner's management rather than this project... However, HF seem quite supportive, so you might want to take it up from there!
Interesting project, this one, right?
Have a nice day!
👍
I'm trying to do this in Colab:
and get errors:
Is it possible at all? Or am I simply doing it wrong?