
Problem saving LoRA checkpoints #13

Closed
toddlt opened this issue Nov 21, 2023 · 2 comments

Comments

toddlt commented Nov 21, 2023

Have you ever run into this when training ChatGLM3 with LoRA: the saved checkpoint contains a 12 GB pytorch_model.bin instead of an adapter_model.bin of a few dozen MB?

xxw1995 (Owner) commented Nov 21, 2023

It sounds like the full model parameters were saved rather than just the LoRA parameters. Redefine the save function.
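A minimal sketch of that fix, assuming a PEFT-style setup where LoRA tensors carry `lora_` in their parameter names (the thread does not name the exact training code, so `model`, `output_dir`, and the key pattern are illustrative assumptions):

```python
# Sketch: save only the LoRA adapter weights instead of the full model.
# Assumption: LoRA parameters are identifiable by "lora_" in their
# state-dict keys, as in the PEFT library's naming convention.

def lora_state_dict(state_dict):
    """Keep only LoRA adapter tensors (keys containing 'lora_')."""
    return {k: v for k, v in state_dict.items() if "lora_" in k}

# With the PEFT library, the simplest fix is usually to call the
# PeftModel's own save method, which writes a small adapter file:
#
#     model.save_pretrained(output_dir)   # model is a PeftModel
#
# If you are overriding the save function yourself, filter the state
# dict before saving (hypothetical names for illustration):
#
#     torch.save(lora_state_dict(model.state_dict()),
#                os.path.join(output_dir, "adapter_model.bin"))

if __name__ == "__main__":
    # Tiny fake state dict to show the filtering behaviour.
    fake_state = {
        "transformer.layer.0.attn.weight": 0,
        "transformer.layer.0.attn.lora_A.weight": 1,
        "transformer.layer.0.attn.lora_B.weight": 2,
    }
    print(sorted(lora_state_dict(fake_state)))
```

Saving only the filtered dict is what shrinks the checkpoint from the full 12 GB down to the few-dozen-MB adapter file the issue expects.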

toddlt (Author) commented Nov 21, 2023

Got it, thanks. I changed it and it works now.

toddlt closed this as completed Nov 21, 2023