Compatibility of Plugins with Local LLM (Ollama) #1538
-
-
Hey, from what I've read, the Ollama API does not follow the ChatGPT API structure, so you would need to set up Ollama in the OpenAI configuration panel using http://127.0.0.1:11434/v1, then create the models you have available in Ollama and enable function calling.

Even though some models seem to be able to detect the available plugins (or maybe the plugin knowledge comes from their training data), they don't seem to be able to call the functions. Does anyone know what the structure should be? If I ask to crawl a website or get the weather, the data is either outdated or generated by the model. I can get it to send the request unformatted, but I'm guessing I'm missing how the returned schema is converted into an actual API call for the plugin. Or is Ollama expected to POST the API call to an endpoint?

EDIT: Looking at the code of one of the lobe-chat plugins, it looks like these are intended to be called by the ChatGPT API, in which case this will not work with Ollama. It would need local scripts and something to interpret the function-call "tags" and associate each call with the script that actions it.
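For reference, a minimal sketch of that setup, pointing the official OpenAI Node SDK at Ollama's OpenAI-compatible endpoint. The model name and the `get_weather` tool schema are placeholders for illustration; the point is that a backend which supports function calling would answer with a `tool_calls` entry, whereas here you get plain text, matching the behaviour described above.

```ts
import OpenAI from "openai";

// Point the official OpenAI SDK at Ollama's OpenAI-compatible endpoint.
const client = new OpenAI({
  baseURL: "http://127.0.0.1:11434/v1",
  apiKey: "ollama", // Ollama ignores the key, but the SDK requires a value
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "llama2", // placeholder: any model you have pulled locally
    messages: [{ role: "user", content: "What is the weather in Berlin?" }],
    // OpenAI-style tool definition (hypothetical get_weather schema).
    tools: [
      {
        type: "function",
        function: {
          name: "get_weather",
          description: "Get the current weather for a city",
          parameters: {
            type: "object",
            properties: { city: { type: "string" } },
            required: ["city"],
          },
        },
      },
    ],
  });

  // With Ollama (at the time of this thread) this prints a plain text
  // answer, often with made-up weather; message.tool_calls stays undefined.
  console.log(completion.choices[0].message);
}

main();
```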
-
Ollama currently doesn't support function calling. Refs: #1364 (comment)
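To make concrete what is missing: a sketch (again assuming the OpenAI Node SDK; the handler map and `get_weather` handler are hypothetical) of the dispatch loop a client would run once a backend does return OpenAI-style `tool_calls`. Against Ollama as described in this thread, `message.tool_calls` is absent, so only the fallback branch runs.

```ts
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://127.0.0.1:11434/v1",
  apiKey: "ollama",
});

// Hypothetical local handlers standing in for plugin endpoints.
const handlers: Record<string, (args: any) => Promise<string>> = {
  get_weather: async ({ city }) => `weather payload for ${city}`,
};

async function run() {
  const messages: OpenAI.ChatCompletionMessageParam[] = [
    { role: "user", content: "What is the weather in Berlin?" },
  ];

  const first = await client.chat.completions.create({
    model: "llama2", // placeholder model name
    messages,
    tools: [
      {
        type: "function",
        function: {
          name: "get_weather",
          description: "Get the current weather for a city",
          parameters: {
            type: "object",
            properties: { city: { type: "string" } },
            required: ["city"],
          },
        },
      },
    ],
  });

  const message = first.choices[0].message;
  if (!message.tool_calls?.length) {
    // This is what happens with Ollama here: no structured call, just text.
    console.log("No tool calls returned:", message.content);
    return;
  }

  // Execute each requested function and feed the results back to the model.
  messages.push(message);
  for (const call of message.tool_calls) {
    const result = await handlers[call.function.name](
      JSON.parse(call.function.arguments),
    );
    messages.push({ role: "tool", tool_call_id: call.id, content: result });
  }

  const second = await client.chat.completions.create({
    model: "llama2",
    messages,
  });
  console.log(second.choices[0].message.content);
}

run();
```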