Failed to Download GPT2-large Model from Hub #30715
Comments
Hi @daskol
Yes,
Thanks @daskol !

```python
>>> from transformers import AutoModelForCausalLM
>>> model = AutoModelForCausalLM.from_pretrained("gpt2-large", force_download=True)
```

It seems the model is being downloaded correctly,
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
System Info
Python 3.12, transformers 4.40.2, huggingface-hub 0.23.0.
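The versions above can be collected with a short standard-library sketch (an illustration, not the official `transformers-cli env` helper; it degrades gracefully if a package is absent):

```python
# Sketch: gather the environment info reported above (Python version plus
# installed package versions) using only the standard library.
import sys
from importlib import metadata

versions = {"python": sys.version.split()[0]}
for pkg in ("transformers", "huggingface-hub"):
    try:
        versions[pkg] = metadata.version(pkg)
    except metadata.PackageNotFoundError:
        versions[pkg] = "not installed"

for name, ver in versions.items():
    print(f"{name}: {ver}")
```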
Who can help?
@ArthurZucker and @younesbelkada
Information

Tasks

An officially supported task in the examples folder (such as GLUE/SQuAD, ...)

Reproduction
It seems that `transformers` failed to resolve the branch and tried to find `config.json` in the non-existing branch `None`.
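The failure mode can be illustrated with a small sketch (an assumption for illustration, not the library's actual code): when no revision is supplied, Hub file URLs should default to the `main` branch, and a revision that stays `None` produces exactly the kind of non-existing-branch lookup reported here.

```python
# Sketch (hypothetical helpers, not transformers/huggingface_hub internals)
# of how a revision left as None would yield a broken download URL.
def resolve_url(repo_id, filename, revision=None):
    # Normal behaviour: a missing revision defaults to "main".
    revision = revision if revision is not None else "main"
    return f"https://huggingface.co/{repo_id}/resolve/{revision}/{filename}"

def buggy_resolve_url(repo_id, filename, revision=None):
    # If the default is skipped, the literal string "None" ends up in the URL.
    return f"https://huggingface.co/{repo_id}/resolve/{revision}/{filename}"

print(resolve_url("gpt2-large", "config.json"))
# https://huggingface.co/gpt2-large/resolve/main/config.json
print(buggy_resolve_url("gpt2-large", "config.json"))
# https://huggingface.co/gpt2-large/resolve/None/config.json
```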
Expected behavior
The script above should download and load the model.