
Can Baichuan 2 still be fine-tuned with the llama-efficient-tunning project? #45

Open
cxj01 opened this issue Sep 7, 2023 · 3 comments
cxj01 commented Sep 7, 2023

Can Baichuan 2 still be fine-tuned with the llama-efficient-tunning project?


hiyouga commented Sep 7, 2023

@tumanshu

When fine-tuning with the llama-efficient-tunning project, the loss does not go down. With the same data, the loss decreases well on Baichuan 1. Is there anything I should pay attention to?


hiyouga commented Sep 11, 2023

@tumanshu Update your code.
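"Update your code and retry" might look like the sketch below. This is a minimal, unconfirmed example: the dataset name, model checkpoint, LoRA target module (`W_pack`), and all other flags are assumptions based on the project's typical usage, not taken from this thread.

```shell
# Pull the latest version of the project (the fix for Baichuan 2 is assumed
# to be in a recent commit).
git pull origin main
pip install -r requirements.txt

# Hypothetical SFT launch for Baichuan 2 with LoRA; every flag value below
# is an assumption and should be checked against the project's README.
python src/train_bash.py \
    --stage sft \
    --do_train \
    --model_name_or_path baichuan-inc/Baichuan2-7B-Base \
    --dataset alpaca_gpt4_zh \
    --template baichuan2 \
    --finetuning_type lora \
    --lora_target W_pack \
    --output_dir output/baichuan2-sft \
    --fp16
```

If the loss still does not decrease after updating, comparing the `--template` setting against the one used for Baichuan 1 is a reasonable first check, since the two model generations use different prompt templates.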
