Can Baichuan2 still be fine-tuned with the llama-efficient-tunning project?
Yes, it is supported: https://github.com/hiyouga/LLaMA-Efficient-Tuning#supported-models
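For reference, below is a minimal sketch of launching LoRA fine-tuning for Baichuan2 with that repository's `train_bash.py` entry point. The flag names and values follow the project's README conventions at the time, but the model path, dataset name, and output directory here are illustrative assumptions; check the current documentation, since the project (later renamed LLaMA-Factory) has changed its CLI over time.

```shell
# Sketch: supervised fine-tuning (SFT) of Baichuan2-7B with LoRA.
# Flag names follow LLaMA-Efficient-Tuning's README; verify against
# the version of the repository you have checked out.
python src/train_bash.py \
    --stage sft \
    --do_train \
    --model_name_or_path baichuan-inc/Baichuan2-7B-Base \
    --dataset alpaca_gpt4_zh \
    --template baichuan2 \
    --finetuning_type lora \
    --lora_target W_pack \
    --output_dir output/baichuan2-sft \
    --per_device_train_batch_size 4 \
    --learning_rate 5e-5 \
    --num_train_epochs 3.0 \
    --fp16
```

Note that `--template baichuan2` and `--lora_target W_pack` are specific to Baichuan-family models; reusing a Baichuan 1 template or target module with Baichuan2 weights is a common source of training problems.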
When fine-tuning with the llama-efficient-tunning project, the loss does not decrease. With the same data, the loss decreases well on Baichuan 1. Is there anything I should pay attention to?
@tumanshu Update to the latest code.