no entry found for key #260
It was a problem with
I encountered this issue too, due to having a "<n>" in :3 Why?
I'm also having this issue; is there a way to know which entry is missing? It seems to save the
This should be fixed with the latest release.
Yes, it's working in
tl;dr: use

Leaving this comment here in case anyone else stumbles upon this issue, but the functionality between . This is a little confusing because, if you want to load a saved tokenizer, the tokenizer object expects a merges file and a vocab file. I don't see any functionality to load the whole object as saved, so it seems the best bet is to use
If you want to load a model (BPE, for example), then . But if you want to load the whole tokenizer, then you can do
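The two paths described above can be sketched roughly as follows. The tiny training corpus and the file names are made up for illustration; the calls (`Model.save`, `BPE.from_file`, `Tokenizer.save`, `Tokenizer.from_file`) are from recent releases of the `tokenizers` library:

```python
from tokenizers import Tokenizer
from tokenizers.models import BPE
from tokenizers.trainers import BpeTrainer
from tokenizers.pre_tokenizers import Whitespace

# Train a tiny tokenizer so we have something to save and reload.
tokenizer = Tokenizer(BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = Whitespace()
tokenizer.train_from_iterator(["hello world", "hello there"],
                              BpeTrainer(special_tokens=["[UNK]"]))

# Path 1: save/load only the BPE *model* (vocab.json + merges.txt).
vocab_file, merges_file = tokenizer.model.save(".")
model = BPE.from_file(vocab_file, merges_file, unk_token="[UNK]")
rebuilt = Tokenizer(model)
# Note: only the model is restored; the pre-tokenizer, normalizer, etc.
# have to be re-attached by hand.
rebuilt.pre_tokenizer = Whitespace()

# Path 2: save/load the *whole* tokenizer as a single JSON file.
tokenizer.save("tokenizer.json")
whole = Tokenizer.from_file("tokenizer.json")
print(whole.encode("hello world").tokens)
```

Path 2 round-trips the entire pipeline (model, pre-tokenizer, normalizer, post-processor), which is why it is usually the less error-prone option.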
This file will have been saved with
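Presumably the single-file save referred to here is `Tokenizer.save`; a minimal round-trip sketch with a toy `WordLevel` vocabulary (the file name is illustrative):

```python
from tokenizers import Tokenizer
from tokenizers.models import WordLevel

# A minimal tokenizer, just to demonstrate the save/load round trip.
tok = Tokenizer(WordLevel(vocab={"[UNK]": 0, "hello": 1}, unk_token="[UNK]"))
tok.save("demo-tokenizer.json")  # one JSON file holding the full pipeline

reloaded = Tokenizer.from_file("demo-tokenizer.json")
print(reloaded.encode("hello").tokens)
```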
I am fine-tuning a model, and for some hyperparameters (a different number of epochs or learning rate) I get an error:
Version: 0.5.2