ValueError: Can't find 'adapter_config.json' #1334
Comments
I hit this error when loading a local model. |
Experiencing the exact same issue for the past couple of days. I keep digging |
Here’s a link to the older issue that describes the same problem #1283 |
I get the same error. |
No, this is a big problem |
#1347 should fix this. |
Hmm, nope, it won't. In short, here is what triggers the issue: the launcher execs text-generation-server.
If MODEL_ID_OR_DIR is a local directory, the text-generation-server program will first look for local weights with the specified extensions at the root of the local directory, as one can see here,
called here. If no files are found, many fallbacks in the exception handling lead to the final error obtained, which we should disregard here. So for the call with a local dir to work, two things:
For 1., if one prefetched the model from the Hub, then one should not specify the model root directory but the specific revision root directly
will work, but not
because in the Hub file layout there is indeed no weight file at the root of the layout. I don't know if we should change any behaviour in the code or fix anything; I'm just explaining what I see happening here |
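To make the lookup described above concrete, here is a minimal, hypothetical sketch of that behavior (this is not the actual text-generation-server code, and the function name is invented for illustration): weights are searched only at the root of the local directory, extension by extension, and nothing under subdirectories such as a Hub cache's snapshots/<revision>/ is ever considered.

```python
from pathlib import Path

def find_local_weights(model_dir: str, extensions=(".safetensors", ".bin")) -> list:
    """Look for weight files at the ROOT of a local model directory.

    Mimics the behavior described in the comment above: each extension is
    tried in order, and only files directly at the root match; files in
    subdirectories (e.g. snapshots/<revision>/) are not searched.
    """
    root = Path(model_dir)
    for ext in extensions:
        files = sorted(root.glob(f"*{ext}"))
        if files:
            return files
    # No weights at the root: this is the situation that sends the server
    # down the fallback/exception path ending in the error reported here.
    raise FileNotFoundError(
        f"no weight files with extensions {extensions} at the root of {model_dir}"
    )
```

Under this sketch, pointing MODEL_ID_OR_DIR at a Hub cache root fails (the weights sit under snapshots/<revision>/), while pointing it at the revision directory itself succeeds.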
I have the same problem. Has anyone fixed this? |
I know the answer: we must convert the model from .bin to .safetensors format |
same issue |
Should be fixed in #1364. Can you please try? |
Same here. #1364 didn't work. |
I solved this issue after converting the model from .bin to .safetensors using |
v1.1.1 works fine, but the latest v1.3.4 is giving the same error. |
Fixed by #1419 |
@OlivierDehaene I am seeing this issue using .safetensors, not .bin. Do you expect your change to fix my issue as well? Also, would it be possible to get these local loading changes released?
System Info
Information
Tasks
Reproduction
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/dist-packages/peft/utils/config.py", line 117, in from_pretrained
    config_file = hf_hub_download(
  File "/usr/local/lib/python3.9/dist-packages/huggingface_hub/utils/_validators.py", line 110, in _inner_fn
    validate_repo_id(arg_value)
  File "/usr/local/lib/python3.9/dist-packages/huggingface_hub/utils/_validators.py", line 158, in validate_repo_id
    raise HFValidationError(
huggingface_hub.utils._validators.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': '/data/liuyuanchao/text-generation-inference-main/tiiuae'. Use `repo_type` argument if needed.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/data/text-generation-inference-main/1.py", line 9, in <module>
    model = AutoPeftModelForCausalLM.from_pretrained(
  File "/usr/local/lib/python3.9/dist-packages/peft/auto.py", line 69, in from_pretrained
    peft_config = PeftConfig.from_pretrained(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/peft/utils/config.py", line 121, in from_pretrained
    raise ValueError(f"Can't find '{CONFIG_NAME}' at '{pretrained_model_name_or_path}'")
ValueError: Can't find 'adapter_config.json' at '/data/text-generation-inference-main/tiiuae'
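The traceback above shows the failure chain: PeftConfig.from_pretrained finds no adapter_config.json in the local directory, falls back to treating the path as a Hub repo id, and the path fails repo-id validation, so the final ValueError is raised. A minimal stdlib pre-check (the helper name is invented for illustration) captures the condition that distinguishes a loadable PEFT adapter directory from a base-model directory like the one in the traceback:

```python
import os

def looks_like_peft_adapter(path: str) -> bool:
    """Return True if path is a local directory containing adapter_config.json.

    When this is False, peft's from_pretrained falls back to treating the
    path as a Hub repo id, producing the HFValidationError and then the
    "Can't find 'adapter_config.json'" ValueError seen above.
    """
    return os.path.isdir(path) and os.path.isfile(
        os.path.join(path, "adapter_config.json")
    )
```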
Expected behavior