是否已有关于该错误的issue或讨论? | Is there an existing issue / discussion for this?
该问题是否在FAQ中有解答? | Is there an existing answer for this in FAQ?
当前行为 | Current Behavior
Dear authors, I'm new to fine-tuning large models and would like to ask: should fine-tuning a chat model and fine-tuning a base model differ in the input data format?

The input data for a chat model is in dialogue form, which is already reflected in the `preprocess` function of finetune.py.

A base model's function is completion. If all I need is a classification task, should I change the `target` in the `preprocess` function to my target class label instead of a full dialogue?
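To illustrate the two options being asked about, here is a minimal sketch of the difference between the preprocessing styles, not the repository's actual `preprocess` code: the prompt formats, the `make_example` helper, and the loss-mask convention are all assumptions for illustration only.

```python
# Hypothetical illustration of the two target styles discussed above.
# For a chat model, the target is the full assistant reply; for a plain
# classification task on a base (completion) model, the target can be
# just the label token(s).

def make_example(prompt: str, target: str, eos: str = "</s>"):
    """Concatenate prompt and target the way causal-LM fine-tuning
    usually does: loss is computed only on the target (and EOS) positions,
    while prompt positions are masked out (mask value 0)."""
    text = prompt + target + eos
    # In a real preprocess() these would be token ids; plain strings
    # keep the sketch simple.
    loss_mask = [0] * len(prompt) + [1] * (len(target) + len(eos))
    return text, loss_mask

# Chat-style fine-tuning: target is a complete conversational reply.
chat_text, chat_mask = make_example(
    "[Round 1]\n问: What is the capital of France?\n答: ",
    "The capital of France is Paris.",
)

# Classification-style fine-tuning on a base model: target is only the label.
cls_text, cls_mask = make_example(
    "Review: The movie was fantastic.\nSentiment: ",
    "positive",
)
```

In both cases the training objective is the same (next-token prediction on the unmasked positions); only the content of `target` changes, which is why the question reduces to what string the `preprocess` function emits as the target.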
期望行为 | Expected Behavior
No response
复现方法 | Steps To Reproduce
No response
运行环境 | Environment

- OS:
- Python:
- Transformers:
- PyTorch:
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`):
备注 | Anything else?
No response