Add example for running a local/on-prem LLM with MCP #62

@stex2005

Description

It would be really nice to have an example that shows how to connect MCP to a locally hosted LLM (e.g., vLLM, Ollama, LM Studio, or a custom API).

Right now, the workflow is clear when using an existing MCP-enabled client (such as Claude Desktop), but some users are asking how to:

  • Point MCP at a self-hosted LLM endpoint
  • Run the LLM service alongside ROS and the MCP server (see the sketch after this list)
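As a starting point, here is a minimal, untested sketch of what such an example could look like. It assumes the official `mcp` Python SDK and the `openai` client pointed at a local OpenAI-compatible endpoint (vLLM serves one at http://localhost:8000/v1 by default, Ollama at http://localhost:11434/v1); the server command `my_mcp_server.py`, the model name, and the prompt are placeholders, not part of this repo.

```python
# Hypothetical sketch: bridge an MCP server's tools to a local
# OpenAI-compatible endpoint. Server command, endpoint URL, and
# model name are placeholders.
import asyncio
import json

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from openai import OpenAI

# Local endpoint: vLLM defaults to http://localhost:8000/v1,
# Ollama to http://localhost:11434/v1. The API key is unused locally.
llm = OpenAI(base_url="http://localhost:11434/v1", api_key="not-needed")

# Placeholder: launch your MCP server over stdio.
server = StdioServerParameters(command="python", args=["my_mcp_server.py"])

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Expose the MCP tools to the LLM in OpenAI function-call format.
            tools = [
                {
                    "type": "function",
                    "function": {
                        "name": t.name,
                        "description": t.description or "",
                        "parameters": t.inputSchema,
                    },
                }
                for t in (await session.list_tools()).tools
            ]

            resp = llm.chat.completions.create(
                model="llama3.1",  # any local model with tool-call support
                messages=[{"role": "user", "content": "List the ROS topics."}],
                tools=tools,
            )

            # If the model requested a tool, forward the call to the MCP server.
            for call in resp.choices[0].message.tool_calls or []:
                result = await session.call_tool(
                    call.function.name, json.loads(call.function.arguments)
                )
                print(result)

asyncio.run(main())
```

The same loop should work for any backend that speaks the OpenAI chat-completions API, so one example could cover vLLM, Ollama, and LM Studio just by changing `base_url` and `model`.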

Suggestion:

  • Provide a sample config or script for connecting to a local LLM
  • Document the minimum requirements (e.g., JSON tool/function-call support in the LLM API); a quick way to check for this is sketched below
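For the second point, a possible probe script (again a hypothetical sketch, with placeholder endpoint URL and model name): it sends one trivial tool definition to the local endpoint and checks whether the model answers with a structured tool call, which is the minimum capability the bridge above relies on.

```python
# Hypothetical probe: check that a local OpenAI-compatible endpoint
# supports JSON tool/function calling. URL and model are placeholders.
from openai import OpenAI

llm = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

probe_tool = {
    "type": "function",
    "function": {
        "name": "echo",
        "description": "Echo the given text back to the caller.",
        "parameters": {
            "type": "object",
            "properties": {"text": {"type": "string"}},
            "required": ["text"],
        },
    },
}

resp = llm.chat.completions.create(
    model="llama3.1",
    messages=[{"role": "user", "content": "Use the echo tool to say hi."}],
    tools=[probe_tool],
)

calls = resp.choices[0].message.tool_calls
print("tool calling supported" if calls else "no structured tool call returned")
```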

Labels

help wanted
