
Unable to use PeftModel #3

Closed
MZGuo111 opened this issue Jul 19, 2023 · 1 comment

Comments

@MZGuo111

I'm really sorry, I couldn't find a discussion board, so I have to take up space in your issue tracker. Please forgive me!
I'm trying to deploy huanhuanchat locally. I can load chatglm successfully, but I can't use the fine-tuned model. My code is as follows:

from peft import PeftModel
from transformers import AutoTokenizer, AutoModel

model_path = "THUDM/chatglm2-6b"
model = AutoModel.from_pretrained(model_path, trust_remote_code=True).half().cuda()
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
# No problems up to this point


#   Add the Huanhuan (嬛嬛) LoRA to your model!
model = PeftModel.from_pretrained(model, "output/sft").half()
model.eval()

The error message is as follows:

HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/output/sft/resolve/main/adapter_config.json

The above exception was the direct cause of the following exception:

RepositoryNotFoundError                   Traceback (most recent call last)
    176 try:
--> 177     config_file = hf_hub_download(
    178         model_id,
    179         CONFIG_NAME,
    180         **hf_hub_download_kwargs,
    181     )
    182 except Exception:

File ~/anaconda3/envs/deeplearning/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py:120, in validate_hf_hub_args.<locals>._inner_fn(*args, **kwargs)
    118     kwargs = smoothly_deprecate_use_auth_token(fn_name=fn.__name__, has_token=has_token, kwargs=kwargs)
--> 120 return fn(*args, **kwargs)

File ~/anaconda3/envs/deeplearning/lib/python3.8/site-packages/huggingface_hub/file_download.py:1195, in hf_hub_download(repo_id, filename, subfolder, repo_type, revision, library_name, library_version, cache_dir, local_dir, local_dir_use_symlinks, user_agent, force_download, force_filename, proxies, etag_timeout, resume_download, token, local_files_only, legacy_cache_layout)
...
--> 183         raise ValueError(f"Can't find '{CONFIG_NAME}' at '{model_id}'")
    185 loaded_attributes = cls.from_json_file(config_file)
    186 return loaded_attributes["peft_type"]

ValueError: Can't find 'adapter_config.json' at 'output/sft'

How can I solve this?

@KMnO4-zx
Owner

KMnO4-zx commented Jul 19, 2023

Please check whether those two files exist in the output/sft directory and whether they were downloaded completely; you can also try downloading the two files individually. Also, are you running the script from the repository root directory? The chatglm2-6b model needs to be downloaded locally, and the THUDM/chatglm2-6b path should then be replaced with the path of the downloaded chatglm model. Looking forward to your successful run!
[screenshot attachment]
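
Not the exact setup from the screenshot, just a minimal sketch of the suggested fix, assuming the base model has been downloaded to a local directory and that output/sft (relative to the working directory) is the adapter output folder; both paths below are placeholders:

import os

from peft import PeftModel
from transformers import AutoTokenizer, AutoModel

# Hypothetical local paths -- adjust them to where the weights actually live.
base_model_path = "/path/to/chatglm2-6b"  # local copy of THUDM/chatglm2-6b
adapter_path = "./output/sft"             # directory produced by the LoRA fine-tuning

# PEFT falls back to resolving the string as a Hub repo id when it cannot find
# adapter_config.json locally, which is what produced the 401 error above.
print(os.listdir(adapter_path))  # should list adapter_config.json and the adapter weights

tokenizer = AutoTokenizer.from_pretrained(base_model_path, trust_remote_code=True)
model = AutoModel.from_pretrained(base_model_path, trust_remote_code=True).half().cuda()

# Load the LoRA adapter from the local directory
model = PeftModel.from_pretrained(model, adapter_path).half()
model.eval()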
