Can't load DeBERTa-v3 tokenizer #70
from transformers import DebertaV2Tokenizer, DebertaV2Model

This gives me an error:

ValueError: This tokenizer cannot be instantiated. Please make sure you have sentencepiece installed in order to use this tokenizer.

But sentencepiece is already installed. Also tried; this gives me:

TypeError: stat: path should be string, bytes, os.PathLike or integer, not NoneType

Please help, how can I use the tokenizer for deberta-base-v3?

Comments

Thank you, I was able to initialize the tokenizer, but later it gives me an error when providing text to the tokenizer.

Hello, the issue was that I used Colab and the tokenizer needed [...]. Thank you for sharing the model!
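For what it's worth, the ValueError above is raised when transformers cannot import sentencepiece in the running interpreter, even if pip reports it installed, a common Colab pitfall when the package was installed after the kernel started. A minimal sketch (not part of the transformers API) for checking that the current interpreter can actually see the package:

```python
import importlib.util

def sentencepiece_available() -> bool:
    """Return True if the running interpreter can import sentencepiece.

    If this returns False even after `pip install sentencepiece`, the
    package was likely installed into a different environment, or the
    kernel (e.g. on Colab) needs a restart to pick it up.
    """
    return importlib.util.find_spec("sentencepiece") is not None

print(sentencepiece_available())
```

If this prints False in the same session that raises the ValueError, restarting the runtime after installing usually resolves it.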
The text was updated successfully, but these errors were encountered:
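The TypeError in the report is plain os.stat behaviour: passing None where a filesystem path is expected raises exactly that message. A small sketch reproducing just the error; that a vocab/model file path resolved to None inside the tokenizer loading code is an assumption based on the message:

```python
import os

# Passing None where a filesystem path is expected raises the TypeError
# quoted in the issue; in the tokenizer case this suggests a vocab/model
# file path could not be resolved and stayed None.
try:
    os.stat(None)
except TypeError as err:
    print(err)
```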