[Request] How to use plugins with the GLM model #1535
Comments
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
🥰 Description of requirements
Plugin calls fail when using the glm-4 model. How can I solve this?

🧐 Solution
Is it because the GLM and OpenAI interfaces do not match? What should I change?

📝 Supplementary information
How should glm-4 use custom plugins, and what should the plugin's manifest.json file look like?
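For illustration only, a custom plugin manifest might look like the sketch below. The field names (`identifier`, `version`, `api`, `meta`) and the example values (`my-weather-plugin`, `getWeather`, the URL) are assumptions based on common plugin-manifest conventions, not a confirmed schema for this project; check the project's own plugin documentation for the authoritative format. The `parameters` object follows the JSON Schema style that OpenAI-compatible function calling expects, which is likely relevant to the GLM/OpenAI interface mismatch asked about above.

```json
{
  "identifier": "my-weather-plugin",
  "version": "1",
  "api": [
    {
      "url": "https://example.com/api/weather",
      "name": "getWeather",
      "description": "Get the current weather for a given city",
      "parameters": {
        "type": "object",
        "properties": {
          "city": {
            "type": "string",
            "description": "Name of the city to query"
          }
        },
        "required": ["city"]
      }
    }
  ],
  "meta": {
    "title": "My Weather Plugin",
    "description": "A hypothetical demo plugin for illustration",
    "avatar": "🔌"
  }
}
```

If the glm-4 endpoint does not accept the same tools/function-calling payload as the OpenAI API, a manifest like this may parse correctly yet still fail at call time, so verifying the model side of the interface is a reasonable first debugging step.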
This issue is closed. If you have any questions, you can comment and reply.
🎉 This issue has been resolved in version 0.157.0 🎉
The release is available on:
Your semantic-release bot 📦🚀