
The new version of chatglm is not supported #14

Description

@YIZXIY

After updating the chatglm model files (the ones from a while back), running the app reports the following error:
```
Traceback (most recent call last):
  File "D:\Desktop\CreativeChatGLM-master\app.py", line 18, in <module>
    predictor = ChatGLM(model_name)
  File "D:\Desktop\CreativeChatGLM-master\predictors\chatglm2.py", line 40, in __init__
    model = ChatGLMForConditionalGeneration.from_pretrained(
  File "D:\Python\Python310\lib\site-packages\transformers\modeling_utils.py", line 2646, in from_pretrained
    ) = cls._load_pretrained_model(
  File "D:\Python\Python310\lib\site-packages\transformers\modeling_utils.py", line 3019, in _load_pretrained_model
    raise RuntimeError(f"Error(s) in loading state_dict for {model.__class__.__name__}:\n\t{error_msg}")
RuntimeError: Error(s) in loading state_dict for ChatGLMForConditionalGeneration:
    size mismatch for transformer.word_embeddings.weight: copying a param with shape torch.Size([130528, 4096]) from checkpoint, the shape in current model is torch.Size([150528, 4096]).
    size mismatch for lm_head.weight: copying a param with shape torch.Size([130528, 4096]) from checkpoint, the shape in current model is torch.Size([150528, 4096]).
You may consider adding ignore_mismatched_sizes=True in the model from_pretrained method.
```
Environment:
CUDA 11.8
Python 3.10
PyTorch 2.0.0
Windows 10
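
For reference, the shapes in the traceback describe a vocabulary-size mismatch: the updated checkpoint ships 130528-row embedding and lm_head weights, while the ChatGLMForConditionalGeneration code used by predictors/chatglm2.py still builds the older 150528-row tables. Passing ignore_mismatched_sizes=True would only leave those layers randomly initialized, so it is not a real fix. Below is a minimal, hedged sketch of loading the checkpoint through transformers' trust_remote_code path, which runs the modeling code bundled with the checkpoint itself so the shapes agree; "THUDM/chatglm-6b" is an assumed model id, substitute the local directory that app.py passes as model_name.

```python
# Sketch only, under the assumptions above (not the project's own loading code).
from transformers import AutoConfig, AutoModel

model_name = "THUDM/chatglm-6b"  # assumed; replace with the local checkpoint directory

# The checkpoint's own config reports the vocabulary it was trained with
# (130528 for the updated weights, 150528 for the original release).
config = AutoConfig.from_pretrained(model_name, trust_remote_code=True)
print("checkpoint vocab size:", getattr(config, "vocab_size", None))

# trust_remote_code=True executes the modeling_chatglm.py shipped with the checkpoint,
# so transformer.word_embeddings and lm_head are built with shapes matching the weights.
model = AutoModel.from_pretrained(model_name, trust_remote_code=True).half().cuda()
model = model.eval()
```

If the project needs to keep its bundled ChatGLMForConditionalGeneration class instead, the alternative is to pin the checkpoint to the revision that matches that code rather than updating the weight files alone.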
