ValueError: Can't find config.json at './best_ckpt/' #57
Comments
It sounds like a path problem; have you tried an absolute path?
I'm running on Colab and specified /content/best_ckpt/, but the same error still occurs. I've re-run it several times without success. sad
Strange, I've never run into this. We're planning to restart the research next week; I'll take a look then.
Same question here, looking forward to an answer @hikariming
In ./anaconda3/lib/python3.8/site-packages/deep_training/nlp/models/lora/configuration.py the default is CONFIG_NAME = "adapter_config.json", not config.json. I changed it, but the change didn't take effect in the compiled files...
The original author of chatglm_finetuning, [ssbuild], gave an answer.
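Based on the observation above that the lora configuration module expects a file named adapter_config.json rather than config.json, a possible workaround is to copy the existing config.json to that name inside the checkpoint directory instead of patching the installed package. This is a hedged sketch, not the project's official fix; the helper name `ensure_adapter_config` is hypothetical:

```python
import shutil
from pathlib import Path

def ensure_adapter_config(ckpt_dir):
    """If adapter_config.json is missing but config.json exists,
    copy config.json to adapter_config.json so a loader that
    hard-codes CONFIG_NAME = "adapter_config.json" can find it.

    Returns True if adapter_config.json exists afterwards.
    """
    ckpt_dir = Path(ckpt_dir)
    src = ckpt_dir / "config.json"
    dst = ckpt_dir / "adapter_config.json"
    if src.exists() and not dst.exists():
        shutil.copy(src, dst)  # keep the original file untouched
    return dst.exists()
```

Run this once on the checkpoint directory (e.g. `ensure_adapter_config("./best_ckpt")`) before calling `LoraArguments.from_pretrained`. Whether the loader in your installed version of deep_training actually looks for adapter_config.json should be verified against its configuration.py, as noted in the comment above.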
Hello, while fine-tuning with the code you provided, I found that at the final step of loading the model, when LoraArguments reads the /best_ckpt/config.json file, the call still fails with "ValueError: Can't find config.json at './best_ckpt/'" even though config.json exists in that directory:
lora_args = LoraArguments.from_pretrained('./best_ckpt/')
ValueError: Can't find config.json at './best_ckpt/'
I don't know what is causing this. The content of the config.json file is shown below. Have you run into this problem before, or do you know what might cause it? Looking forward to your reply.
{
"architectures": [
"ChatGLMModel"
],
"auto_map": {
"AutoConfig": "configuration_chatglm.ChatGLMConfig",
"AutoModel": "modeling_chatglm.ChatGLMForConditionalGeneration",
"AutoModelForSeq2SeqLM": "modeling_chatglm.ChatGLMForConditionalGeneration"
},
"bos_token_id": 150004,
"eos_token_id": 150005,
"hidden_size": 4096,
"initializer_range": 0.02,
"initializer_weight": false,
"inner_hidden_size": 16384,
"layernorm_epsilon": 1e-05,
"max_sequence_length": 2048,
"model_type": "chatglm",
"num_attention_heads": 32,
"num_layers": 28,
"pad_token_id": 20003,
"position_encoding_2d": true,
"pre_seq_len": null,
"precision": 16,
"prefix_projection": false,
"quantization_bit": 0,
"return_dict": false,
"task_specific_params": {
"learning_rate": 2e-05,
"learning_rate_for_task": 2e-05
},
"torch_dtype": "float16",
"transformers_version": "4.27.4",
"use_cache": true,
"vocab_size": 150528
}