When running `tokenizer = AutoTokenizer.from_pretrained("Rostlab/prot_albert", do_lower_case=False)`, the following error is reported:
Downloading: 100%|█████████████████████████████████████████████████████████████████| 505/505 [00:00<00:00, 516kB/s]
Downloading: 100%|██████████████████████████████████████████████████████████████| 238k/238k [00:03<00:00, 77.0kB/s]
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/anaconda3/envs/prottrans/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 385, in from_pretrained
return tokenizer_class_fast.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
File "/home/anaconda3/envs/prottrans/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1768, in from_pretrained
return cls._from_pretrained(
File "/home/anaconda3/envs/prottrans/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1841, in _from_pretrained
tokenizer = cls(*init_inputs, **init_kwargs)
File "/home/anaconda3/envs/prottrans/lib/python3.8/site-packages/transformers/models/albert/tokenization_albert_fast.py", line 136, in __init__
super().__init__(
File "/home/anaconda3/envs/prottrans/lib/python3.8/site-packages/transformers/tokenization_utils_fast.py", line 89, in __init__
fast_tokenizer = convert_slow_tokenizer(slow_tokenizer)
File "/home/anaconda3/envs/prottrans/lib/python3.8/site-packages/transformers/convert_slow_tokenizer.py", line 659, in convert_slow_tokenizer
return converter_class(transformer_tokenizer).converted()
File "/home/anaconda3/envs/prottrans/lib/python3.8/site-packages/transformers/convert_slow_tokenizer.py", line 349, in converted
tokenizer = self.tokenizer(self.proto)
File "/home/anaconda3/envs/prottrans/lib/python3.8/site-packages/transformers/convert_slow_tokenizer.py", line 335, in tokenizer
raise Exception(
Exception: You're trying to run a Unigram model but you're file was trained with a different algorithm
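The converter raises this exception when the SentencePiece model's `trainer_spec.model_type` is not one the fast-tokenizer converter supports. A possible workaround (a sketch, not verified against this model) is to request the slow tokenizer with `use_fast=False`, which skips the `convert_slow_tokenizer` step that raises the error; this requires `transformers`, `sentencepiece`, and Hub access. The try/except guard is only there so the sketch reports failure instead of crashing when those are unavailable:

```python
# Sketch of a workaround: load the slow (Python) ALBERT tokenizer so that
# transformers never calls convert_slow_tokenizer(). Model name and kwargs
# mirror the report above; needs `transformers`, `sentencepiece`, and network
# access to the Hugging Face Hub to actually succeed.
try:
    from transformers import AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(
        "Rostlab/prot_albert",
        do_lower_case=False,
        use_fast=False,  # skip the slow->fast conversion that raises the Exception
    )
    status = "loaded"
except Exception as exc:  # missing deps / no network: report instead of crashing
    status = f"not loaded: {exc}"

print(status)
```

If the slow tokenizer loads, it can be used exactly like the fast one for encoding protein sequences; only tokenization speed differs.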
Expected behavior
Environment info
transformers version: 4.2.2
Who can help
Information
Model I am using (Bert, XLNet ...):
The problem arises when using:
The tasks I am working on is:
To reproduce
Steps to reproduce the behavior: run the `AutoTokenizer.from_pretrained` call shown above; the same traceback is produced.