Help: GPU inference unavailable, fails with "assert vocab_file is not None AssertionError" #78

Open
LIUBINfighter opened this issue Aug 13, 2023 · 0 comments


The console shows the following:
F:\edgeDownload\ChatGLM-webui-main\ChatGLM-webui-main>python webui.py
CUDA is not available, using cpu mode...
Traceback (most recent call last):
  File "F:\edgeDownload\ChatGLM-webui-main\ChatGLM-webui-main\webui.py", line 57, in <module>
    init()
  File "F:\edgeDownload\ChatGLM-webui-main\ChatGLM-webui-main\webui.py", line 28, in init
    load_model()
  File "F:\edgeDownload\ChatGLM-webui-main\ChatGLM-webui-main\modules\model.py", line 66, in load_model
    tokenizer = AutoTokenizer.from_pretrained(cmd_opts.model_path, trust_remote_code=True)
  File "C:\Users\14651\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 689, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "C:\Users\14651\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\tokenization_utils_base.py", line 1841, in from_pretrained
    return cls._from_pretrained(
  File "C:\Users\14651\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\tokenization_utils_base.py", line 2004, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "C:\Users\14651/.cache\huggingface\modules\transformers_modules\THUDM\chatglm-6b\619e736c6d4cd139840579c5482063b75bed5666\tokenization_chatglm.py", line 221, in __init__
    self.sp_tokenizer = SPTokenizer(vocab_file, num_image_tokens=num_image_tokens)
  File "C:\Users\14651/.cache\huggingface\modules\transformers_modules\THUDM\chatglm-6b\619e736c6d4cd139840579c5482063b75bed5666\tokenization_chatglm.py", line 58, in __init__
    assert vocab_file is not None
AssertionError

Windows 11, NVIDIA GPU. I downloaded everything step by step following 秋叶's UI guide. Any help would be greatly appreciated.
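The traceback ends in tokenization_chatglm.py with `assert vocab_file is not None`, which suggests the tokenizer could not find its SentencePiece vocab file in the model directory. As a minimal diagnostic sketch (assuming the ChatGLM-6B checkpoint layout, where the tokenizer vocab is named ice_text.model; the directory path below is a placeholder for whatever `--model-path` points at):

```python
from pathlib import Path

def has_vocab_file(model_dir: str) -> bool:
    """Return True if the ChatGLM-6B SentencePiece vocab file is present.

    When ice_text.model is missing from the model directory, the tokenizer's
    SPTokenizer receives vocab_file=None and the assertion in
    tokenization_chatglm.py fires, matching the traceback above.
    """
    return (Path(model_dir) / "ice_text.model").is_file()

# Placeholder path: replace with the directory passed via --model-path.
print("vocab file present:", has_vocab_file(r"F:\models\chatglm-6b"))
```

If this prints False, re-downloading the full checkpoint (including ice_text.model) into the model directory may resolve the error.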
