Latest huggingface transformers version breaking nlp modules #9272
Comments
Thanks for informing us of this! PR #9261 was just merged into the 2.0 RC1 branch. For main, we will try to either patch it or also pin HF temporarily while we clean up the NLP domain.
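The temporary pin mentioned above could be expressed as a version constraint (a sketch only; the upper bound follows from the report that 4.41.0 introduced the break, and the exact bound NeMo chose may differ):

```
# requirements constraint: stay below the transformers release that
# removed ALBERT_PRETRAINED_MODEL_ARCHIVE_LIST
transformers<4.41.0
```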
Thanks @titu1994. Should the issue not stay open until main gets fixed, though?
PR to main is here: #9273. We'll merge once CI passes.
Oh, it auto-closed due to the PR link; reopening it.
PR to main is merged. |
Still broken... |
Still broken.
Works for me with |
Thanks @mbugert, that works for me. Could you point me to some resources on RAG implementation with NeMo? We haven't been able to find any.
Describe the bug
The latest version of transformers (4.41.0) breaks MegatronGPTModel. That is because of this commit, which removes the ALBERT_PRETRAINED_MODEL_ARCHIVE_LIST constant used by the huggingface_utils.py module here. This huggingface_utils.py is an indirect dependency of nemo/collections/nlp.
Steps/Code to reproduce bug
Output:
Expected behavior
Importing MegatronGPTModel (or other classes from these modules) should not fail.
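Until a pin or patch lands, one possible stopgap at an affected import site is to guard the missing constant. This is a minimal sketch, not NeMo's actual fix, and it assumes an empty fallback list is acceptable for the code that consumes the constant:

```python
# Sketch of a defensive import: transformers >= 4.41.0 removed
# ALBERT_PRETRAINED_MODEL_ARCHIVE_LIST, so fall back to an empty list
# when the constant (or transformers itself) is unavailable.
try:
    from transformers.models.albert.modeling_albert import (
        ALBERT_PRETRAINED_MODEL_ARCHIVE_LIST,
    )
except ImportError:
    ALBERT_PRETRAINED_MODEL_ARCHIVE_LIST = []
```

Whether an empty list is a safe fallback depends on how huggingface_utils.py uses the constant, so this should be treated as a workaround sketch rather than a drop-in patch.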
Environment details
OS kernel: 5.10.216-182.855.amzn2int.x86_64
PyTorch: 2.3.0+cu121
Python: 3.12.3