ValueError: Can't find config.json at './best_ckpt/' #57

Open
OneStepAndTwoSteps opened this issue Apr 6, 2023 · 6 comments

Comments

@OneStepAndTwoSteps

Hello, while fine-tuning with the code you provided, I found that at the final step of loading the model, when LoraArguments reads the /best_ckpt/config.json file, it still raises "ValueError: Can't find config.json at './best_ckpt/'" even though config.json exists in that directory:

lora_args = LoraArguments.from_pretrained('./best_ckpt/')
ValueError: Can't find config.json at './best_ckpt/'

I don't know what is causing this. The content of config.json is below. Have you run into this problem, or do you know what might cause it? Looking forward to your reply.

{
  "architectures": [
    "ChatGLMModel"
  ],
  "auto_map": {
    "AutoConfig": "configuration_chatglm.ChatGLMConfig",
    "AutoModel": "modeling_chatglm.ChatGLMForConditionalGeneration",
    "AutoModelForSeq2SeqLM": "modeling_chatglm.ChatGLMForConditionalGeneration"
  },
  "bos_token_id": 150004,
  "eos_token_id": 150005,
  "hidden_size": 4096,
  "initializer_range": 0.02,
  "initializer_weight": false,
  "inner_hidden_size": 16384,
  "layernorm_epsilon": 1e-05,
  "max_sequence_length": 2048,
  "model_type": "chatglm",
  "num_attention_heads": 32,
  "num_layers": 28,
  "pad_token_id": 20003,
  "position_encoding_2d": true,
  "pre_seq_len": null,
  "precision": 16,
  "prefix_projection": false,
  "quantization_bit": 0,
  "return_dict": false,
  "task_specific_params": {
    "learning_rate": 2e-05,
    "learning_rate_for_task": 2e-05
  },
  "torch_dtype": "float16",
  "transformers_version": "4.27.4",
  "use_cache": true,
  "vocab_size": 150528
}
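
A quick first check for this kind of failure is whether the relative path resolves where you expect — a minimal sanity sketch, assuming the same './best_ckpt/' argument and working directory as above:

import os

ckpt_dir = './best_ckpt/'  # directory passed to LoraArguments.from_pretrained
print(os.path.abspath(ckpt_dir))                              # where the relative path actually resolves
print(os.path.isdir(ckpt_dir) and os.listdir(ckpt_dir))       # directory contents, if it exists
print(os.path.exists(os.path.join(ckpt_dir, 'config.json')))  # the file the loader reports missing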

@hikariming
Owner

It sounds like a path problem. Try an absolute path instead?

@OneStepAndTwoSteps
Author

I'm running on Colab and specified /content/best_ckpt/, but the same error still occurs. I've re-run it several times and still can't solve it. sad

@hikariming
Owner

Strange, I haven't run into this. We plan to resume work on this next week, and I'll take a look then.

@Nicole1130

Same question here, looking forward to an answer @hikariming

@Shkklt

Shkklt commented Apr 10, 2023

In ./anaconda3/lib/python3.8/site-packages/deep_training/nlp/models/lora/configuration.py, the default is CONFIG_NAME = "adapter_config.json" rather than config.json. I changed it, but the compiled file didn't take effect...
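
If the loader really resolves CONFIG_NAME = "adapter_config.json" inside the checkpoint directory, then copying the existing config under that name would sidestep editing the installed package — a sketch under that assumption, reusing the directory from the first post:

import shutil

# Assumption: from_pretrained looks for adapter_config.json (the default
# CONFIG_NAME noted above), so provide the same config under that filename.
shutil.copy('./best_ckpt/config.json', './best_ckpt/adapter_config.json')

lora_args = LoraArguments.from_pretrained('./best_ckpt/')  # LoraArguments imported as in the fine-tuning script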

@Nicole1130

[ssbuild], the original author of chatglm_finetuning, has given an answer:

————————————————————————————————————————————————
(ssbuild/chatglm_finetuning#149 (comment))
Saving the LoRA model by loss has been commented out, so the saved model ends up in the last_ckpt folder while the config file is in best_ckpt. Copy best_ckpt's config.json into last_ckpt and update the path in the infor_lora_finetune code.
I'll update the paths when I have time.
————————————————————————————————————————————————
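
Translated into code, that answer is a two-step fix — a minimal sketch, assuming the default best_ckpt/last_ckpt folder names from the training script:

import shutil

# Per ssbuild's answer: the weights are saved under last_ckpt but the config
# stays in best_ckpt, so copy the config over and load from last_ckpt.
shutil.copy('./best_ckpt/config.json', './last_ckpt/config.json')

lora_args = LoraArguments.from_pretrained('./last_ckpt/')  # path changed from './best_ckpt/'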
