[Request] Plans for Ollama Functions #2311
Comments
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
There is no plan; we are not going to use LangChain. We will wait until Ollama implements Function Calling itself before connecting to it.

This issue is closed. If you have any questions, you can comment and reply.
Ollama now supports function calling.
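For context, Ollama's tool support landed in its `/api/chat` endpoint as a non-streaming feature: the request carries a `tools` array and the response returns structured `tool_calls` on the assistant message. The sketch below builds such a request body and parses a response of that shape; the field names follow Ollama's tool-calling documentation at the time, while the model name and the `get_current_weather` tool are hypothetical examples.

```python
def build_tool_chat_request(model: str, prompt: str, tools: list) -> dict:
    """Build a non-streaming Ollama /api/chat request body with tools.

    "stream": False reflects that Ollama initially defined tool calls
    only for non-streaming responses (the point discussed below).
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
        "tools": tools,
    }


def extract_tool_calls(response: dict) -> list:
    """Pull (name, arguments) pairs out of an /api/chat response."""
    calls = response.get("message", {}).get("tool_calls", []) or []
    return [(c["function"]["name"], c["function"]["arguments"]) for c in calls]


# Hypothetical example tool definition (JSON Schema parameters):
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

# A response shaped like Ollama's documented tool-call output:
sample_response = {
    "message": {
        "role": "assistant",
        "content": "",
        "tool_calls": [
            {"function": {"name": "get_current_weather",
                          "arguments": {"city": "Toronto"}}}
        ],
    }
}
```

Note that, unlike OpenAI, Ollama returns `arguments` as a parsed object rather than a JSON string, which is one of the structural mismatches discussed later in this thread.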
@hl1221hl Is there any documentation?
@arvinxx Any plans? The ModelScope community also has an article on function calling with Ollama + Qwen2: https://mp.weixin.qq.com/s/d82jUnXldJw_UPVPngZjDQ
@BrandonStudio No. Anthropic already supported streaming tools at that time. Converting non-stream mode to stream mode is not complicated, but the main problem is that a corresponding return data structure must first exist in streaming mode before non-streaming output can be converted to streaming. Groq could do this conversion because Groq's API follows OpenAI's structure, so converting Groq's non-streaming tool calls into OpenAI's streaming tool calls was enough. Ollama cannot be handled the same way, because the streaming data structure Ollama returns is not consistent with OpenAI's: currently Ollama only defines a streaming return structure for text, not for tools. So to convert Ollama's non-streaming tool calls into a stream today, I would have to invent a custom streaming structure for Ollama tool calls, and that structure would most likely not match whatever official streaming tools Ollama ships later, which would mean migration costs all over again.
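To illustrate the Groq-style adaptation described above: when the non-streaming payload already uses OpenAI's `tool_calls` structure, it can simply be replayed as the delta chunks an OpenAI streaming client expects. This is a minimal sketch assuming OpenAI's streaming delta shape (`delta.tool_calls` with an `index`, the id/name in the first chunk, and the JSON `arguments` string streamed in fragments); the chunk size is arbitrary.

```python
def non_stream_tools_to_chunks(message: dict, chunk_size: int = 16):
    """Yield OpenAI-style streaming deltas for a completed tool-call message.

    `message` is a non-streaming OpenAI-format assistant message whose
    `tool_calls[*].function.arguments` is a JSON *string* (as in
    OpenAI's API). Each call is emitted as a header delta followed by
    argument fragments, then a final chunk with finish_reason.
    """
    for i, call in enumerate(message.get("tool_calls", [])):
        # First delta carries the call id and function name; arguments start empty.
        yield {"choices": [{"index": 0, "delta": {"tool_calls": [{
            "index": i,
            "id": call["id"],
            "type": "function",
            "function": {"name": call["function"]["name"], "arguments": ""},
        }]}, "finish_reason": None}]}
        # Stream the JSON arguments string in fixed-size fragments.
        args = call["function"]["arguments"]
        for start in range(0, len(args), chunk_size):
            yield {"choices": [{"index": 0, "delta": {"tool_calls": [{
                "index": i,
                "function": {"arguments": args[start:start + chunk_size]},
            }]}, "finish_reason": None}]}
    # Terminal chunk signalling the tool-call turn is complete.
    yield {"choices": [{"index": 0, "delta": {}, "finish_reason": "tool_calls"}]}
```

A client that concatenates the `arguments` fragments recovers the original JSON string. The conversion is only this easy because input and output share one schema; as the comment above explains, Ollama's native response shape does not match it, which is exactly why the same trick does not apply there.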
I was thinking about my Cloudflare provider: its streaming and non-streaming responses have different structures, and the non-streaming one is plain JSON, which is easier to handle. By the way, when can that PR be reviewed?
@BrandonStudio The PRs that have piled up recently will all be reviewed together once the knowledge base / file upload feature ships. The big release is almost ready 😂
🥰 Feature Description
Currently only OpenAI's Function Calling is supported. We hope local models can support Function Calling as well.
🧐 Proposed Solution
https://js.langchain.com/docs/integrations/chat/ollama_functions
Could Ollama Functions from LangChain be used to implement this feature?
📝 Additional Information
No response