
MCP support and responses API #6380

@mudler

Description

Is your feature request related to a problem? Please describe.
I would like to use LocalAI with an MCP server. Currently, doing so would require either using https://github.com/mudler/LocalAGI or crafting my own solution.

Describe the solution you'd like
Recently I've been working on isolating the LocalAGI and LocalOperator (yet to be released) code into https://github.com/mudler/cogito . At this point, it would be trivial to add a Responses API endpoint to LocalAI that uses cogito and allows the user to specify a set of MCP servers to connect to, giving the LLM access to their tools. Alternatively, we could have a special chat completion endpoint, for instance "/mcp/v1/completions", that automatically enables this behavior.
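To make the idea concrete, a request to the proposed "/mcp/v1/completions" endpoint could carry a standard chat-completion body plus a list of MCP servers to dial. This is only a sketch of what such a payload might look like; the field names ("mcp_servers", "transport", "command", "url") are hypothetical assumptions, not an existing LocalAI schema.

```python
import json

# Hypothetical request body for the proposed "/mcp/v1/completions" endpoint.
# All MCP-related field names below are illustrative assumptions, not an
# implemented LocalAI API.
payload = {
    "model": "my-agent-model",
    "messages": [
        {"role": "user", "content": "List the files in my workspace"}
    ],
    # MCP servers whose tools the LLM should get access to
    "mcp_servers": [
        {"transport": "stdio", "command": "mcp-filesystem", "args": ["/workspace"]},
        {"transport": "sse", "url": "http://localhost:3001/sse"},
    ],
}

body = json.dumps(payload)
print(body)
```

On the server side, the endpoint would translate each entry of "mcp_servers" into cogito Tools before running the completion, as outlined in the implementation steps below.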

Describe alternatives you've considered
Keep things as is

Implementation steps

  • Create a Completion API compatible endpoint that uses https://github.com/mudler/cogito and connects to the LocalAI API server (to the chat completion endpoint)
  • Add MCP configurations to LocalAI, wire the configuration as needed, and create Tools as expected by cogito from the user configuration and by dialing the MCP servers. Some code for this already exists in LocalAGI
  • Configure MCP and agent settings via the model config file
  • Optionally (or just for later), optimize cogito to connect directly without having to dial the chat completion endpoint
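For the model-config step above, the per-model settings could look something like the fragment below. Every key under "mcp" is a hypothetical sketch of what the wiring might expose, not an implemented LocalAI option; only the top-level model config layout follows LocalAI's existing YAML convention.

```yaml
# Hypothetical model config fragment -- the "mcp" section and its key
# names are illustrative assumptions only.
name: my-agent-model
parameters:
  model: local-model.gguf
mcp:
  servers:
    - name: filesystem
      transport: stdio
      command: mcp-filesystem
      args: ["/workspace"]
    - name: remote-tools
      transport: sse
      url: http://localhost:3001/sse
  agent:
    max_iterations: 5
```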
