
AttributeError: can't set attribute when testing the ChatGLM3-6b model #30594

Closed
2 of 4 tasks
padsasdasd opened this issue May 1, 2024 · 2 comments

Comments

@padsasdasd

System Info

  • transformers version: 4.36.0
  • Platform: Linux-5.15.0-101-generic-x86_64-with-glibc2.17
  • Python version: 3.8.10
  • Huggingface_hub version: 0.22.2
  • Safetensors version: 0.4.3
  • Accelerate version: 0.29.3
  • Accelerate config: not found
  • PyTorch version (GPU?): 2.0.0+cu118 (True)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?:
  • Using distributed or parallel set-up in script?:

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

xtuner chat /root/autodl-tmp/add --prompt-template default

Traceback (most recent call last):
  File "/root/ChatGLM3/xtuner/xtuner/tools/chat.py", line 491, in <module>
    main()
  File "/root/ChatGLM3/xtuner/xtuner/tools/chat.py", line 237, in main
    tokenizer = AutoTokenizer.from_pretrained(
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 774, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 2028, in from_pretrained
    return cls._from_pretrained(
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 2260, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/root/.cache/huggingface/modules/transformers_modules/add/tokenization_chatglm.py", line 108, in __init__
    super().__init__(padding_side=padding_side, clean_up_tokenization_spaces=clean_up_tokenization_spaces,
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/tokenization_utils.py", line 363, in __init__
    super().__init__(**kwargs)
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 1602, in __init__
    super().__init__(**kwargs)
  File "/root/miniconda3/lib/python3.8/site-packages/transformers/tokenization_utils_base.py", line 861, in __init__
    setattr(self, key, value)
AttributeError: can't set attribute
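For context, this error message is what Python raises when `setattr` targets a name that the class defines as a property without a setter. The following is a minimal, self-contained sketch of that failure mode (not the actual transformers or ChatGLM3 code — the `Tokenizer` class and `vocab_size` property here are illustrative assumptions), mirroring the `setattr(self, key, value)` call in the last frame of the traceback:

```python
# Minimal sketch (hypothetical class, not transformers code): a kwarg key that
# collides with a read-only property makes setattr raise AttributeError.
class Tokenizer:
    @property
    def vocab_size(self):
        # Read-only property: no @vocab_size.setter is defined.
        return 100

    def __init__(self, **kwargs):
        for key, value in kwargs.items():
            # Fails when `key` shadows a settable-less property on the class.
            setattr(self, key, value)

try:
    Tokenizer(vocab_size=200)
except AttributeError as e:
    # On Python 3.8 the message reads "can't set attribute";
    # newer Pythons say which property has no setter.
    print(type(e).__name__)
```

This is why custom `tokenization_chatglm.py` code that forwards init kwargs can break when the installed transformers version exposes one of those kwargs as a read-only property; pinning transformers to a version the model's custom code was written against, or updating the model's tokenizer files, are the usual remedies.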

Expected behavior

The model merge completed successfully, but the error above keeps appearing during testing. Any help would be appreciated.

@amyeroberts
Collaborator

Hi @padsasdasd, thanks for opening an issue.

Could you provide a reproducer, an explanation of what you're trying to do, and the expected behaviour? Without these there isn't really anything we can do to help.


github-actions bot commented Jun 1, 2024

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.

@github-actions github-actions bot closed this as completed Jun 9, 2024