
run generate.py, download Lora model config failed #11

Closed
wydream opened this issue Mar 29, 2023 · 1 comment
wydream commented Mar 29, 2023

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 259, in hf_raise_for_status
    response.raise_for_status()
  File "/usr/local/lib/python3.10/site-packages/requests/models.py", line 1021, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/saved-alpaca-belle-cot7b/resolve/main/adapter_config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/peft/utils/config.py", line 99, in from_pretrained
    config_file = hf_hub_download(pretrained_model_name_or_path, CONFIG_NAME)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 120, in _inner_fn
    return fn(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1134, in hf_hub_download
    metadata = get_hf_file_metadata(
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 120, in _inner_fn
    return fn(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1475, in get_hf_file_metadata
    hf_raise_for_status(r)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 291, in hf_raise_for_status
    raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-6423b5fb-3dc6880b29a2556c43cb8c3d)

Repository Not Found for url: https://huggingface.co/saved-alpaca-belle-cot7b/resolve/main/adapter_config.json.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.
Invalid username or password.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/Alpaca-CoT/generate.py", line 47, in <module>
    model = PeftModel.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/peft/peft_model.py", line 135, in from_pretrained
    config = PEFT_TYPE_TO_CONFIG_MAPPING[PeftConfig.from_pretrained(model_id).peft_type].from_pretrained(model_id)
  File "/usr/local/lib/python3.10/site-packages/peft/utils/config.py", line 101, in from_pretrained
    raise ValueError(f"Can't find config.json at '{pretrained_model_name_or_path}'")
ValueError: Can't find config.json at 'saved-alpaca-belle-cot7b'

I think the LORA_WEIGHTS value is missing the repo info.

@PhoebusSi (Owner) commented:

Please post more code details.
I suspect this happens because you either passed the wrong local LoRA weights path in LORA_WEIGHTS or changed the BASE_MODEL setting.
Please confirm that you are passing a local path in LORA_WEIGHTS and 'decapoda-research/llama-7b-hf' in BASE_MODEL.
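To illustrate the advice above: `PeftConfig.from_pretrained` only falls back to treating the string as a Hugging Face Hub repo id when no local `adapter_config.json` exists at that path, which is what produced the 401 `RepositoryNotFoundError` in the traceback. A minimal sketch of the intended setup follows; the helper functions are hypothetical (not part of Alpaca-CoT's `generate.py`), and the local directory name is taken from the error message only as an example.

```python
import os

def lora_path_is_local(path: str) -> bool:
    """Hypothetical check: True if `path` is a local LoRA adapter directory.

    PEFT looks for adapter_config.json here first; if the file is absent,
    the string is sent to the Hub as a repo_id, which fails with a 401 for
    a name like 'saved-alpaca-belle-cot7b' that is not a real repository.
    """
    return os.path.isfile(os.path.join(path, "adapter_config.json"))

def load_model(base_model: str, lora_weights: str):
    # Heavy imports are kept inside the function so the path check
    # above can be used on its own.
    from transformers import LlamaForCausalLM
    from peft import PeftModel

    if not lora_path_is_local(lora_weights):
        raise FileNotFoundError(
            f"adapter_config.json not found under '{lora_weights}'; "
            "pass the local output directory produced by training."
        )
    model = LlamaForCausalLM.from_pretrained(base_model)
    return PeftModel.from_pretrained(model, lora_weights)

# Example matching the maintainer's advice (paths are assumptions):
# model = load_model("decapoda-research/llama-7b-hf",
#                    "./saved-alpaca-belle-cot7b")
```

The key point is that LORA_WEIGHTS should resolve to a directory on disk containing the trained adapter, while BASE_MODEL remains the Hub id of the base LLaMA weights.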
