using local ollama LLM example is broken https://github.com/evalstate/fast-agent/tree/main/examples/mcp_basic_ollama_agent #8

@suvarchal

Description

Error: (screenshot attached in the original issue)

I guess the core issue is that the example tries to use two kinds of OpenAI-API-compatible models: https://github.com/evalstate/fast-agent/tree/main/examples/mcp_basic_ollama_agent

The default augmented OpenAI LLM is gpt-4o, while the config specifies only one base URL:
https://github.com/evalstate/fast-agent/blob/8891d00e5d069eaddebadf8bbf1d775426e3ef1c/examples/mcp_basic_ollama_agent/mcp_agent.config.yaml#L23-24
so the example effectively needs two OpenAI-compatible APIs and keys.
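As a sketch of one possible workaround (untested; the exact keys fast-agent accepts may differ, and the model name is just an example of something pulled locally), the openai section of mcp_agent.config.yaml could point entirely at Ollama's OpenAI-compatible endpoint instead of mixing a hosted model with a local one:

```yaml
# Hypothetical mcp_agent.config.yaml fragment -- not the repo's actual file.
openai:
  # Ollama serves an OpenAI-compatible API at this address by default.
  base_url: http://localhost:11434/v1
  # Ollama ignores the key, but OpenAI clients usually require a non-empty one.
  api_key: ollama
  # Any model already fetched with `ollama pull`, e.g. llama3.2.
  default_model: llama3.2
```

With every request routed to one local endpoint, no OpenAI API key or second provider would be needed.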

fast-agent is wonderful; I think having at least one purely local LLM example with MCP tools is worth the effort.

Labels

defect (Something isn't working) · documentation (Improvements or additions to documentation) · feature (Feature Request)
