Conversation
Lovely, can you get the lint issue addressed?
Fixed. By the way, the ollamaclient in langchaingo is no longer up to date (deprecated options, old single-embedding endpoint); I may propose a PR soon to remove the internal ollama package from langchaingo and replace it with https://pkg.go.dev/github.com/ollama/ollama/api. Side note: this seems to conflict with at least one other existing example PR: #951
Is it ready for use now?
Just bumping this :) Thanks for your contributions.
RE: "llama3.1 don't properly use message history with tool results and I don't really know why": curious. What style of tool call orchestration is supported? Can the LLM invoke more than one tool call in a response? Can it dictate that specific tools be called concurrently for any reason, or is everything synchronous?
So after a few tests, it looks to me that with this PR the model can correctly see the tools and can correctly query them, but does not see the tool call responses. I might not be using it correctly, though.
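For reference, a minimal sketch of the message ordering the Ollama chat endpoint expects in order for the model to "see" a tool result: the assistant message carrying the tool call must be followed by a `role: "tool"` message with the tool's output. The struct names here (`Message`, `ToolCall`, `buildHistory`) are stand-ins for illustration, not the actual langchaingo or ollama/api types.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// ToolCall and Message are minimal stand-ins; field names follow the
// Ollama /api/chat JSON format, but the exact Go types under discussion
// in this PR may differ.
type ToolCall struct {
	Name      string         `json:"name"`
	Arguments map[string]any `json:"arguments"`
}

type Message struct {
	Role      string     `json:"role"`
	Content   string     `json:"content,omitempty"`
	ToolCalls []ToolCall `json:"tool_calls,omitempty"`
}

// buildHistory returns the three-message sequence the model needs:
// the user question, the assistant's tool call, and the tool result.
func buildHistory() []Message {
	return []Message{
		{Role: "user", Content: "What is the weather in Paris?"},
		{Role: "assistant", ToolCalls: []ToolCall{
			{Name: "get_weather", Arguments: map[string]any{"city": "Paris"}},
		}},
		// If this role:"tool" message is dropped or mis-labelled, the
		// model never sees the tool's answer -- which would match the
		// behavior described above.
		{Role: "tool", Content: `{"temp_c": 21, "sky": "clear"}`},
	}
}

func main() {
	b, _ := json.MarshalIndent(buildHistory(), "", "  ")
	fmt.Println(string(b))
}
```

If the history being replayed to the model omits or re-roles that third message, the symptom would be exactly what is described here: tool calls work, but their responses are invisible.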
In this PR I tried to:
Notes:
except for the Parameters field, where I set the any type (https://github.com/ollama/ollama/blob/main/api/types.go#L168) for easier integration
The model properly answers with the JSON tool call format, but at the end it seems that llama3.1 doesn't properly use message history with tool results, and I don't really know why. It would be interesting to test with other models.
May close #965
PR Checklist
- PR title is prefixed with the primarily affected package (e.g. `memory: add interfaces for X, Y` or `util: add whizzbang helpers`).
- Connected issues are referenced (e.g. `Fixes #123`).
- `golangci-lint` checks pass.