'BertTokenizerFast' object has no attribute '_in_target_context_manager' #718
Comments
Found the issue: the 4.22.x and 4.21.x versions of transformers return the same error. Using an older version, transformers==4.20.1, it started working fine again.
It indeed seems that the environment in which you worked was updated between sessions. Glad to hear that it worked out!
How can I change to transformers==4.20.1 in your setup?
You can force reinstall, as shown below, or use a virtualenv.
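The exact command from the original reply was not captured in this page; a typical force reinstall would look like the following, with the version pin taken from the comment above:

```shell
# Downgrade transformers to the version reported to work,
# overwriting whatever version is currently installed.
pip install --force-reinstall transformers==4.20.1
```

If other packages in the environment depend on a newer transformers, doing this inside a dedicated virtualenv avoids breaking them.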
Hi @MaartenGr, could you please confirm whether this dependency on an older version of transformers is fixed in the latest 0.12.0? Or consider my suggestion to document the supported transformers versions in a table?
@Cspellz The issue here is not about a fixed dependency on a specific version, but that the model was saved in one environment and loaded in an environment with different versions of the dependencies. This is in general an issue with model persistence, which you can read a bit more about here. Whenever you save any model, BERTopic or not, it is important to fix the versions of your environment between loading and saving the model.
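One common way to fix the versions between saving and loading, assuming a pip-based setup, is to freeze the environment when the model is saved and reinstall from that file before loading:

```shell
# At save time: record the exact version of every installed package.
pip freeze > requirements.txt

# At load time (ideally in a fresh virtualenv): restore those versions.
pip install -r requirements.txt
```

This guarantees that transformers, and every other dependency, matches the environment the model was saved in.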
Thank you! I tried your pip command and it resolved my issue! |
Thank you! I also tried to use the new version to save and load the model. It worked well! |
I am using version 0.11.0. This is a saved model, imported using BERTopic.load(). It worked fine last week and the week before that. I just tried doing some work on it today and got this error when calling topic_model.find_topics('...'):
'BertTokenizerFast' object has no attribute '_in_target_context_manager'
Please and Thank you :)