
ollama: add tools support#1022

Open
treywelsh wants to merge 3 commits into tmc:main from treywelsh:ollama_tool_support

Conversation

@treywelsh
Contributor

@treywelsh treywelsh commented Sep 13, 2024

In this PR I tried to:

  • integrate Ollama tools support in langchaingo
  • add an ollama test
  • update the ollama-functions-example code, now heavily inspired by openai-function-call-example

Notes:

The model properly answers with the JSON tool call format, but at the end it seems that llama3.1 doesn't properly use the message history with tool results, and I don't really know why. It would be interesting to test with other models.

May close #965

PR Checklist

  • Read the Contributing documentation.
  • Read the Code of conduct documentation.
  • Name your Pull Request title clearly, concisely, and prefixed with the name of the primarily affected package you changed according to Good commit messages (such as memory: add interfaces for X, Y or util: add whizzbang helpers).
  • Check that there isn't already a PR that solves the problem the same way to avoid creating a duplicate.
  • Provide a description in this PR that addresses what the PR is solving, or reference the issue that it solves (e.g. Fixes #123).
  • Describes the source of new concepts.
  • References existing implementations as appropriate.
  • Contains test coverage for new functions.
  • Passes all golangci-lint checks.

@treywelsh treywelsh changed the title Ollama tool support ollama: add tools support Sep 13, 2024
@treywelsh treywelsh force-pushed the ollama_tool_support branch 2 times, most recently from ead7477 to 61eeb2c Compare September 13, 2024 16:30
@tmc
Owner

tmc commented Sep 13, 2024

Lovely, can you get the lint issue addressed?

  Error: llms/ollama/ollamallm.go:49:1: cognitive complexity 36 of func `(*LLM).GenerateContent` is high (> 30) (gocognit)
  func (o *LLM) GenerateContent(ctx context.Context, messages []llms.MessageContent, options ...llms.CallOption) (*llms.ContentResponse, error) { // nolint: lll, cyclop, funlen

@treywelsh treywelsh force-pushed the ollama_tool_support branch 2 times, most recently from 78f9357 to 621e71f Compare September 14, 2024 19:32
@treywelsh
Contributor Author

treywelsh commented Sep 14, 2024

Fixed.

By the way, the ollamaclient in langchaingo is no longer up to date (deprecated options, old single-embedding endpoint). I may propose a PR soon to remove the ollama internal package from langchaingo and replace it with: https://pkg.go.dev/github.com/ollama/ollama/api

Side note: this seems to conflict with at least one other existing example PR: #951
EDIT: It also conflicts with my new draft PR #1036, which should probably be merged first.

@pdxrlj

pdxrlj commented Sep 26, 2024

Is it ready for use now?

@treywelsh treywelsh mentioned this pull request Sep 26, 2024
9 tasks
@atljoseph

Just bumping this :) Thanks for your contributions.

RE: llama3.1 don't properly use message history with tool results and I don't really know why

  • Tool call IDs might help the LLM connect the dots more.

Curious. What style of tool call orchestration is supported? Can the LLM invoke more than one tool call in a response? Can it dictate that specific tools be called concurrently for any reason, or is everything synchronous?

@sami-sweng

So after a few tests, it looks to me that with this PR the model can correctly see the tools and correctly query them, but does not see the tool call responses. I might not be using it correctly, though.
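The symptom described here usually means the executed tool's result was never fed back to the model: after running a tool, its output must be appended to the history as a "tool"-role message and the whole history resent on the next turn. A minimal sketch with an illustrative message type (an assumption for illustration; langchaingo wraps messages differently):

```go
package main

import "fmt"

// Illustrative message shape loosely mirroring Ollama's /api/chat
// messages (assumed fields, for illustration only).
type Message struct {
	Role    string // "user", "assistant", or "tool"
	Content string
}

// appendToolResult records an executed tool's output as a "tool"-role
// message. If this message is missing from the next request, the model
// never sees the result, even though the tool was actually called.
func appendToolResult(history []Message, result string) []Message {
	return append(history, Message{Role: "tool", Content: result})
}

func main() {
	history := []Message{
		{Role: "user", Content: "What is the weather in Paris?"},
		{Role: "assistant", Content: `tool_calls: getCurrentWeather({"location":"Paris"})`},
	}
	history = appendToolResult(history, `{"temperature": 18, "unit": "celsius"}`)
	for _, m := range history {
		fmt.Printf("%s: %s\n", m.Role, m.Content)
	}
	// The full 3-message history is what the next generate call must send.
}
```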


Development

Successfully merging this pull request may close these issues.

Langchainjs formally supports Ollama functions. Can you support that too?

5 participants