OSError: Can't load the configuration of '/cache/vit-large-patch14/' #118

Open
supergangchao opened this issue Jun 26, 2024 · 0 comments
(vary) [root@gpu-test-5354 Vary-master]# python vary/demo/run_qwen_vary.py --model-name openai/clip-vit-large-patch14 --image-file ../assets/vary.png
/mnt/sdc/soft/anaconda3/envs/vary/lib/python3.10/site-packages/transformers/utils/generic.py:260: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
  torch.utils._pytree._register_pytree_node(
/mnt/sdc/soft/anaconda3/envs/vary/lib/python3.10/site-packages/transformers/utils/generic.py:260: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
  torch.utils._pytree._register_pytree_node(
The argument `trust_remote_code` is to be used with Auto classes. It has no effect here and is ignored.
You are using a model of type clip to instantiate a model of type vary. This is not supported for all configurations of models and can yield errors.
Traceback (most recent call last):
  File "/mnt/sdc/soft/anaconda3/envs/vary/lib/python3.10/site-packages/transformers/configuration_utils.py", line 675, in _get_config_dict
    resolved_config_file = cached_file(
  File "/mnt/sdc/soft/anaconda3/envs/vary/lib/python3.10/site-packages/transformers/utils/hub.py", line 428, in cached_file
    resolved_file = hf_hub_download(
  File "/mnt/sdc/soft/anaconda3/envs/vary/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 106, in _inner_fn
    validate_repo_id(arg_value)
  File "/mnt/sdc/soft/anaconda3/envs/vary/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 154, in validate_repo_id
    raise HFValidationError(
huggingface_hub.errors.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': '/cache/vit-large-patch14/'. Use `repo_type` argument if needed.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/mnt/sdc/Vary/Vary-master/vary/demo/run_qwen_vary.py", line 127, in <module>
    eval_model(args)
  File "/mnt/sdc/Vary/Vary-master/vary/demo/run_qwen_vary.py", line 43, in eval_model
    model = varyQwenForCausalLM.from_pretrained(model_name, low_cpu_mem_usage=True, device_map='cuda', trust_remote_code=True)
  File "/mnt/sdc/soft/anaconda3/envs/vary/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2876, in from_pretrained
    model = cls(config, *model_args, **model_kwargs)
  File "/mnt/sdc/soft/anaconda3/envs/vary/lib/python3.10/site-packages/vary/model/vary_qwen_vary.py", line 238, in __init__
    self.transformer = varyQwenModel(config)
  File "/mnt/sdc/soft/anaconda3/envs/vary/lib/python3.10/site-packages/vary/model/vary_qwen_vary.py", line 48, in __init__
    self.vision_tower = CLIPVisionModel.from_pretrained('/cache/vit-large-patch14/')
  File "/mnt/sdc/soft/anaconda3/envs/vary/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2449, in from_pretrained
    config, model_kwargs = cls.config_class.from_pretrained(
  File "/mnt/sdc/soft/anaconda3/envs/vary/lib/python3.10/site-packages/transformers/models/clip/configuration_clip.py", line 238, in from_pretrained
    config_dict, kwargs = cls.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/mnt/sdc/soft/anaconda3/envs/vary/lib/python3.10/site-packages/transformers/configuration_utils.py", line 620, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/mnt/sdc/soft/anaconda3/envs/vary/lib/python3.10/site-packages/transformers/configuration_utils.py", line 696, in _get_config_dict
    raise EnvironmentError(
OSError: Can't load the configuration of '/cache/vit-large-patch14/'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure '/cache/vit-large-patch14/' is the correct path to a directory containing a config.json file.
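
The failure appears to come from the hard-coded vision-tower path in vary/model/vary_qwen_vary.py (line 48 of the traceback): `CLIPVisionModel.from_pretrained('/cache/vit-large-patch14/')`. Because that directory does not exist on this machine, transformers falls back to treating the string as a Hub repo id, which fails validation and then raises the OSError above. A minimal sketch of one possible workaround, assuming network access to the Hub and a recent huggingface_hub, is to download openai/clip-vit-large-patch14 into that exact directory (or edit the hard-coded path to point at an existing local copy); the `local_dir` value below simply mirrors the hard-coded path and is otherwise an assumption, not part of the Vary code:

```python
# Sketch of a possible workaround (assumptions: Hub access and a recent
# huggingface_hub; the target directory mirrors the path hard-coded in
# vary/model/vary_qwen_vary.py).
from huggingface_hub import snapshot_download
from transformers import CLIPVisionModel

local_dir = "/cache/vit-large-patch14"  # must match the hard-coded path

# Download the CLIP vision tower so a local config.json exists at that path.
snapshot_download(repo_id="openai/clip-vit-large-patch14", local_dir=local_dir)

# Mirrors the call that currently fails inside varyQwenModel.__init__; it should
# now resolve the local directory instead of falling back to repo-id validation.
vision_tower = CLIPVisionModel.from_pretrained(local_dir)
print(vision_tower.config.hidden_size)
```

Alternatively, editing line 48 of vary_qwen_vary.py to point at a directory where openai/clip-vit-large-patch14 has already been downloaded avoids the extra copy.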
