Support latest Transformers and new cache design #69
Conversation
```python
from transformers.utils import WEIGHTS_NAME, WEIGHTS_INDEX_NAME, cached_path, hf_bucket_url

path = snapshot_download(
    repo_id=model_name,
    allow_patterns=["*"],
    local_files_only=is_offline_mode(),
    cache_dir=os.getenv("TRANSFORMERS_CACHE", None),
)
```

How about something like this ^^?
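Since the PR aims to support both the latest Transformers (where the legacy `cached_path`/`hf_bucket_url` helpers were removed from `transformers.utils`) and older releases, one way to branch between the two code paths is a small version check. This is only a sketch: the helper name and the `4.22` cutoff are illustrative assumptions, not the PR's actual implementation.

```python
def use_snapshot_download(transformers_version: str) -> bool:
    """Illustrative helper (not from this PR): decide whether to use the
    newer huggingface_hub snapshot_download path instead of the legacy
    cached_path/hf_bucket_url helpers.

    The 4.22 cutoff below is an assumption for illustration; check the
    installed transformers release notes for the real boundary.
    """
    major, minor = (int(x) for x in transformers_version.split(".")[:2])
    return (major, minor) >= (4, 22)

# Example: older releases keep the legacy loader, newer ones switch over.
print(use_snapshot_download("4.10.0"))  # False -> legacy helpers
print(use_snapshot_download("4.25.1"))  # True  -> snapshot_download
```

Keeping the check in one place like this would also make it easy to delete the legacy branch later, as suggested below.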
Thanks for the suggestion. The implementation I chose is similar to this. If you have a chance, try it out and let me know if you run into any bugs :)
I have tested this @mrwyattii and it works fine. I would say the newer change you have made is much better and more intuitive. Thanks for this PR.
I would say that after a few versions we could perhaps drop support for older transformers?
Can we merge this?
yes, we'll get this merged very soon! sorry for the delay