The ROS MCP Client is a reference implementation of a Model Context Protocol (MCP) client, designed to connect directly with ros-mcp-server.
Instead of relying on a desktop LLM client, it acts as a bridge that embeds an LLM directly, enabling natural-language interaction with any ROS or ROS2 robot.
`ros-mcp-client` implements the LLM side of the MCP protocol. It can:

- Connect to a `ros-mcp-server` over MCP (stdio or HTTP).
- Send natural-language queries or structured requests to the robot without requiring a desktop LLM client.
- Stream back feedback, sensor data, or responses from the server.
- Integrate with an LLM backend (Gemini, Ollama, NVIDIA NeMo).
In short, it lets you run an MCP-compatible client that speaks to robots via the MCP interface — useful for testing, local reasoning, or autonomous AI controllers.
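Under the hood, MCP frames every request as JSON-RPC 2.0, so a tool invocation from the client is just a `tools/call` message. The sketch below shows that framing; the tool name and arguments are hypothetical examples, not the actual tools exposed by `ros-mcp-server`.

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build an MCP tools/call request as a JSON-RPC 2.0 string.

    The tool name and arguments are placeholders for illustration;
    consult the connected server's tool list for real names.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical example: ask the server to act on a ROS topic.
msg = make_tool_call(1, "publish_topic", {"topic": "/cmd_vel"})
```

Over stdio, messages like this are written to the server process's stdin and responses are read from its stdout; over HTTP, the same payload travels in the request body.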
- Implements the MCP client specification — plug-and-play with the ROS MCP server.
- ROS-aware LLM interface — specialized prompts and handlers for robotics tasks.
- Supports bidirectional streaming — send commands, receive real-time topic feedback.
- LLM integration ready — use Gemini, Anthropic, or Ollama APIs as reasoning engines.
- Offline-capable — works entirely within local or LAN environments.
The MCP client is version-agnostic (ROS1 or ROS2).
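The version-agnosticism comes from `rosbridge`, which exposes both ROS1 and ROS2 over the same JSON websocket protocol. A message that ultimately reaches the robot looks like the sketch below; the topic and velocity values are illustrative, and the exact type string depends on your ROS version.

```python
import json

def rosbridge_publish(topic, msg):
    """Build a rosbridge-protocol 'publish' operation as JSON.

    rosbridge frames every operation as a JSON object with an "op"
    field; this is identical for ROS1 and ROS2 targets.
    """
    return json.dumps({
        "op": "publish",
        "topic": topic,
        "msg": msg,
    })

# Illustrative example: drive forward while turning slowly
# (a geometry_msgs Twist-shaped payload).
twist = rosbridge_publish(
    "/cmd_vel",
    {
        "linear": {"x": 0.5, "y": 0.0, "z": 0.0},
        "angular": {"x": 0.0, "y": 0.0, "z": 0.2},
    },
)
```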
- ROS or ROS2 running with `rosbridge`
- An active `ros-mcp-server` instance
- Clone the repository:

```bash
git clone https://github.com/robotmcp/ros-mcp-client.git
cd ros-mcp-client
```

- Install dependencies:

```bash
uv sync  # or: pip install -e .
```
- Follow the setup guide for the Gemini Live client:
  - Gemini Live Client — Google Gemini integration
- Start `rosbridge` on the target robot:

```bash
ros2 launch rosbridge_server rosbridge_websocket_launch.xml
```
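Before pointing the client at the robot, it can help to confirm that rosbridge's websocket port is accepting connections. The snippet below is a minimal sketch of such a check, assuming rosbridge's default port of 9090; adjust the host and port to match your launch configuration.

```python
import socket

def rosbridge_reachable(host="127.0.0.1", port=9090, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds.

    9090 is rosbridge_server's default websocket port; this only
    verifies the socket is open, not the websocket handshake.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```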
```
ros-mcp-client/
├── clients/
│   ├── gemini_live/                # Full-featured Gemini client
│   │   ├── gemini_client.py        # Main client script
│   │   ├── mcp.json                # MCP server configuration
│   │   ├── setup_gemini_client.sh  # Automated setup
│   │   └── README.md               # Detailed setup guide
├── config/                         # Shared configuration
├── scripts/                        # Utility scripts
├── pyproject.toml                  # Python dependencies
└── README.md                       # This file
```
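The `mcp.json` file tells the client how to reach the MCP server. The exact schema is defined by the file in `clients/gemini_live/`; the `mcpServers` layout sketched below is the common MCP client convention and is an assumption here, not this project's documented format.

```python
import json

# Hypothetical mcp.json contents following the common "mcpServers"
# convention -- check clients/gemini_live/mcp.json for the real schema.
example = json.loads("""
{
  "mcpServers": {
    "ros-mcp-server": {
      "command": "uv",
      "args": ["run", "server.py"]
    }
  }
}
""")

server = example["mcpServers"]["ros-mcp-server"]
command_line = [server["command"], *server["args"]]
```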
The project includes a comprehensive LLM client implementation:
- Full-featured Google Gemini integration
- Text-only mode optimized for WSL
- Real-time interaction with ROS robots
- Automated setup with `setup_gemini_client.sh`
```bash
# Try the Gemini Live client
cd clients/gemini_live
./setup_gemini_client.sh
uv run gemini_client.py
```
We welcome community PRs with new client implementations and integrations, and contributions of all kinds:
- Bug fixes and documentation updates
- New features (e.g., Action support, permissions)
- Additional examples and tutorials
Check out the contributing guidelines and see issues tagged good first issue to get started.
This project is licensed under the Apache License 2.0.