What learning rate was used when pretraining the base model? #18
The learning rate we used in the SFT stage was 1e-4 with bs=1K. You can adjust it according to your actual setup (e.g., by monitoring the dev-set loss).
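For reference, here is a minimal sketch of how those quoted SFT hyperparameters (lr = 1e-4, effective batch size ≈ 1K) might map onto a Hugging Face `TrainingArguments` config. This is not the repo's actual training script; the per-device batch size, accumulation steps, scheduler, warmup, and epoch count are placeholder assumptions to be tuned against dev-set loss.

```python
# Sketch only: maps the quoted SFT settings (lr=1e-4, effective bs ~1K)
# onto Hugging Face TrainingArguments. Model and dataset handling omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="sft-output",
    learning_rate=1e-4,             # SFT learning rate quoted in the reply
    per_device_train_batch_size=8,  # assumption: per-GPU micro-batch
    gradient_accumulation_steps=16, # with 8 GPUs: 8 * 16 * 8 = 1024 ~ "bs=1K"
    lr_scheduler_type="cosine",     # assumption; tune while watching dev loss
    warmup_ratio=0.03,              # assumption
    num_train_epochs=3,             # assumption
    evaluation_strategy="steps",    # evaluate periodically on a dev set
    eval_steps=200,
    logging_steps=50,
    bf16=True,
)
```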
May I ask how large the learning rate for pretraining was?
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your consideration.
Closing the issue, since no updates observed. Feel free to re-open if you need any further assistance.
Items to check before submitting
Issue type
Model training and fine-tuning
Base model
Llama-3-Chinese-8B (base model)
Operating system
None
Detailed description of the issue
What learning rate was used when pretraining the base model? I would like to use it as a reference for setting the SFT learning rate. Thanks.
Dependencies (required for code-related issues)
Run logs or screenshots