Can Langchain-Chatchat load the model .bin file produced by the P-Tuning v2 training at https://github.com/THUDM/ChatGLM3/tree/main/finetune_chatmodel_demo? If this is not supported, are there plans to support it in the future?
🤖
Sorry, I couldn't find any information in the Langchain-Chatchat repository about P-Tuning v2 fine-tuning for GLM3. To help you better, I need more information. Could you provide details about the specific feature you want to achieve, or the problems you ran into while using it? That will help me pinpoint a solution more accurately.
No. The checkpoint is not in the PEFT format, so it cannot be loaded. ChatGLM3 is going to rewrite its fine-tuning code, and models produced by their new code will be loadable.
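The distinction above can be checked programmatically. A PEFT-format checkpoint directory ships an `adapter_config.json` alongside the weights, which is what `PeftModel.from_pretrained` looks for; the ChatGLM3 P-Tuning v2 demo instead dumps a bare state-dict .bin with no PEFT metadata. A minimal sketch (the helper name `checkpoint_kind` is hypothetical, not part of either project):

```python
import os

def checkpoint_kind(path: str) -> str:
    """Guess whether a fine-tuned checkpoint directory is a PEFT adapter
    (loadable via PeftModel.from_pretrained) or a raw state-dict dump
    like the one produced by the ChatGLM3 P-Tuning v2 demo."""
    if os.path.isfile(os.path.join(path, "adapter_config.json")):
        # PEFT metadata present: Langchain-Chatchat's PEFT loading path applies.
        return "peft"
    if any(f.endswith((".bin", ".safetensors")) for f in os.listdir(path)):
        # Weights only, no adapter config: a raw state dict, not a PEFT adapter.
        return "raw-state-dict"
    return "unknown"
```

A directory that `checkpoint_kind` reports as `"raw-state-dict"` would need to be loaded manually (e.g. instantiating the prefix encoder and calling `load_state_dict`) rather than through the PEFT API.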