Environment info
- `transformers` version: 3.3.1
- Platform: Linux-3.10.0-1127.el7.x86_64-x86_64-with-centos-7.8.2003-Core
- Python version: 3.7.9
- PyTorch version (GPU?): 1.6.0+cu101 (True)
- Tensorflow version (GPU?): 2.3.0 (True)
- Using GPU in script?:
- Using distributed or parallel set-up in script?:
Question:
I wanted to see a summary of the pretrained BERT model, so I opened a Jupyter notebook on my machine (equipped with Quadro RTX 5000 GPUs) and ran the following code to load the pretrained BERT model with TFBertModel.from_pretrained().
After running the cell, I got the error messages below.
--- test codes ---
from transformers import TFBertModel
encoder = TFBertModel.from_pretrained('bert-base-uncased')
--- end of test codes ---
---- error messages start ---
OSError Traceback (most recent call last)
~/anaconda3/envs/tf23/lib/python3.7/site-packages/transformers/configuration_utils.py in get_config_dict(cls, pretrained_model_name_or_path, **kwargs)
354 resume_download=resume_download,
--> 355 local_files_only=local_files_only,
356 )
~/anaconda3/envs/tf23/lib/python3.7/site-packages/transformers/file_utils.py in cached_path(url_or_filename, cache_dir, force_download, proxies, resume_download, user_agent, extract_compressed_file, force_extract, local_files_only)
729 # File, but it doesn't exist.
--> 730 raise EnvironmentError("file {} not found".format(url_or_filename))
731 else:
OSError: file bert-base-uncased/config.json not found
During handling of the above exception, another exception occurred:
OSError Traceback (most recent call last)
in
1 from transformers import TFBertModel
2
----> 3 encoder = TFBertModel.from_pretrained('bert-base-uncased')
~/anaconda3/envs/tf23/lib/python3.7/site-packages/transformers/modeling_tf_utils.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
543 proxies=proxies,
544 local_files_only=local_files_only,
--> 545 **kwargs,
546 )
547 else:
~/anaconda3/envs/tf23/lib/python3.7/site-packages/transformers/configuration_utils.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
313
314 """
--> 315 config_dict, kwargs = cls.get_config_dict(pretrained_model_name_or_path, **kwargs)
316 return cls.from_dict(config_dict, **kwargs)
317
~/anaconda3/envs/tf23/lib/python3.7/site-packages/transformers/configuration_utils.py in get_config_dict(cls, pretrained_model_name_or_path, **kwargs)
366 f"- or '{pretrained_model_name_or_path}' is the correct path to a directory containing a {CONFIG_NAME} file\n\n"
367 )
--> 368 raise EnvironmentError(msg)
369
370 except json.JSONDecodeError:
OSError: Can't load config for 'bert-base-uncased'. Make sure that:
- 'bert-base-uncased' is a correct model identifier listed on 'https://huggingface.co/models'
- or 'bert-base-uncased' is the correct path to a directory containing a config.json file
----- end of error messages ---
I also tested the code above in Colab, where it ran without errors.
Please let me know how to solve this problem.
Thanks in advance.
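A note on the traceback: the first OSError ("file bert-base-uncased/config.json not found") suggests that transformers resolved 'bert-base-uncased' as a *local path* rather than a model-hub identifier, which happens when a directory with that name exists in the notebook's working directory but contains no config.json. This is a guess, not a confirmed diagnosis; the helper below is my own sketch, not part of the transformers API:

```python
import os

def shadowed_by_local_dir(model_id, cwd="."):
    """Return True if a local directory named after the model id exists
    but lacks a config.json -- in that case from_pretrained() treats the
    id as a local path and fails instead of downloading from the hub."""
    path = os.path.join(cwd, model_id)
    return os.path.isdir(path) and not os.path.isfile(
        os.path.join(path, "config.json")
    )

# Example check from the directory where the notebook was launched:
# shadowed_by_local_dir("bert-base-uncased")
```

If such a directory exists, renaming it or starting the notebook from a different directory should let `from_pretrained('bert-base-uncased')` fall back to downloading from the hub, which would explain why the same code works in Colab (a fresh working directory).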