A minimal, production-ready multi-channel AI gateway written in Rust, inspired by OpenClaw.
- Telegram Integration: Chat with your AI assistant via Telegram
- Multiple LLM Providers: Support for OpenAI and Ollama
- MCP Integration: Connect to Model Context Protocol servers for extended tool capabilities
- Skills System: Progressive disclosure architecture for modular AI capabilities (2025-2026 best practices)
- SQLite Persistence: Local-first conversation storage
- Structured Logging: journald/syslog support
- Microservices Architecture: Clean, maintainable codebase
- Production Ready: No "while true" loops - proper structured concurrency with streams
- Rust 1.75 or higher
- SQLite3
- Telegram Bot Token (from @BotFather)
- OpenAI API Key or Ollama running locally
macOS/Linux:

```bash
curl -fsSL https://raw.githubusercontent.com/unizhu/rustclaw/main/install.sh | bash
```

Windows (PowerShell):

```powershell
iex (irm https://raw.githubusercontent.com/unizhu/rustclaw/main/install.ps1)
```

Download the latest release for your platform:
- macOS Intel: `rustclaw-x86_64-apple-darwin.tar.gz`
- macOS Apple Silicon: `rustclaw-aarch64-apple-darwin.tar.gz`
- Linux x86_64: `rustclaw-x86_64-unknown-linux-gnu.tar.gz`
- Linux ARM64: `rustclaw-aarch64-unknown-linux-gnu.tar.gz`
- Windows x86_64: `rustclaw-x86_64-pc-windows-msvc.zip`
Then extract and install:
macOS/Linux:

```bash
tar -xzf rustclaw-<target>.tar.gz
chmod +x rustclaw-gateway
sudo mv rustclaw-gateway /usr/local/bin/
```

Windows:

Extract the zip and add `rustclaw-gateway.exe` to your PATH.
```bash
# Clone the repository
git clone https://github.com/unizhu/rustclaw.git
cd rustclaw

# Build
cargo build --release

# Run
./target/release/rustclaw-gateway
```

```bash
# Install and run Ollama
ollama serve

# Pull a model
ollama pull llama2
```

```toml
# Update config to use Ollama in rustclaw.toml
[providers]
default = "ollama"
```

Configuration uses a layered approach with the following priority (highest to lowest):

- Environment variables - convenience vars like `TELEGRAM_BOT_TOKEN`, `OPENAI_API_KEY`
- Local config - `./rustclaw.toml` in the current working directory (for project-specific overrides)
- Global config - `~/.rustclaw/rustclaw.toml` (auto-created on first run if missing)
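The precedence above amounts to a "first value found wins" lookup. A minimal sketch of that resolution order (illustrative only; `resolve` and its arguments are not RustClaw's actual API, and `local`/`global` stand in for values already parsed from the two TOML files):

```rust
use std::env;

/// Resolve one setting with the documented precedence:
/// env var > local ./rustclaw.toml > global ~/.rustclaw/rustclaw.toml.
fn resolve(env_key: &str, local: Option<&str>, global: Option<&str>) -> Option<String> {
    env::var(env_key)
        .ok()
        .filter(|v| !v.is_empty())            // 1. environment variable
        .or_else(|| local.map(String::from))  // 2. ./rustclaw.toml
        .or_else(|| global.map(String::from)) // 3. ~/.rustclaw/rustclaw.toml
}

fn main() {
    // With the env var unset, the local (project) value wins over the global one.
    let model = resolve("RUSTCLAW_DOC_EXAMPLE_UNSET", Some("codellama"), Some("llama3"));
    println!("{model:?}");
}
```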
This is the primary configuration file, automatically created on first run:
```toml
[telegram]
bot_token = "" # Set via TELEGRAM_BOT_TOKEN env var

[providers]
default = "openai" # or "ollama"

[providers.openai]
api_key = "" # Set via OPENAI_API_KEY env var
model = "gpt-4o-mini"
base_url = "" # Optional: Set via OPENAI_BASE_URL env var

[providers.ollama]
base_url = "http://localhost:11434"
model = "llama3"

[database]
path = "rustclaw.db"

[logging]
level = "info" # trace, debug, info, warn, error
```

Place a rustclaw.toml in your working directory to override specific settings for that project. For example:

```toml
[providers]
default = "ollama" # Use Ollama for this project

[providers.ollama]
model = "codellama" # Use a different model
```

| Variable | Description | Config Path |
|---|---|---|
| `TELEGRAM_BOT_TOKEN` | Telegram bot token | `telegram.bot_token` |
| `OPENAI_API_KEY` | OpenAI API key | `providers.openai.api_key` |
| `OPENAI_BASE_URL` | OpenAI base URL | `providers.openai.base_url` |
| `OLLAMA_BASE_URL` | Ollama base URL | `providers.ollama.base_url` |
| `RUSTCLAW__*` | Any config value | Uses `__` as separator |
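The `RUSTCLAW__*` form maps a variable name onto a config path by splitting on the double underscore, so a name like `RUSTCLAW__PROVIDERS__OPENAI__MODEL` targets `providers.openai.model`. A sketch of that mapping (illustrative; the real loader's normalization may differ):

```rust
/// Map a RUSTCLAW__-prefixed variable name to a dotted config path.
/// Illustrative sketch; not RustClaw's actual implementation.
fn env_to_path(var: &str) -> Option<String> {
    let rest = var.strip_prefix("RUSTCLAW__")?;
    Some(
        rest.split("__")
            .map(str::to_ascii_lowercase)
            .collect::<Vec<_>>()
            .join("."),
    )
}

fn main() {
    println!("{:?}", env_to_path("RUSTCLAW__PROVIDERS__OPENAI__MODEL"));
    // Some("providers.openai.model")
}
```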
RustClaw uses a service-oriented architecture with Tokio channels for communication:
```
┌─────────────────────────────────────────────────────────┐
│                    Gateway Service                      │
│  (Orchestrator - manages lifecycle, routing, shutdown)  │
└────────┬─────────────────────────────────┬──────────────┘
         │                                 │
    ┌────▼────┐                       ┌────▼─────┐
    │ Channel │                       │ Provider │
    │ Service │                       │ Service  │
    │(Telegram)│                      │(OpenAI+  │
    └────┬────┘                       │ Ollama)  │
         │                            └────┬─────┘
         │                                 │
    ┌────▼────────────────────────────────▼─────┐
    │        Persistence Service (SQLite)        │
    │        Logging Service (journald)          │
    └────────────────────────────────────────────┘
```
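The wiring between services can be sketched with plain channels. The real gateway uses Tokio tasks and channels; the sketch below uses std threads and `mpsc` for a self-contained illustration, and all names (`InboundMessage`, `run_pipeline`) are hypothetical:

```rust
use std::sync::mpsc;
use std::thread;

/// A message flowing from the channel service (e.g. Telegram) to the provider.
struct InboundMessage {
    chat_id: i64,
    text: String,
}

/// Wire a "channel service" to a "provider service" over an mpsc channel.
/// No `while true`: the consumer loop ends when all senders are dropped.
fn run_pipeline(msgs: Vec<InboundMessage>) -> Vec<String> {
    let (tx, rx) = mpsc::channel::<InboundMessage>();

    // Provider service: drains the channel until it closes.
    let provider = thread::spawn(move || {
        rx.into_iter()
            .map(|m| format!("chat {}: {}", m.chat_id, m.text))
            .collect::<Vec<_>>()
    });

    // Channel service: forwards inbound updates, then shuts down cleanly.
    for m in msgs {
        tx.send(m).expect("provider stopped");
    }
    drop(tx); // closing the channel triggers shutdown downstream

    provider.join().expect("provider panicked")
}

fn main() {
    let out = run_pipeline(vec![InboundMessage { chat_id: 42, text: "hello".into() }]);
    println!("{out:?}"); // ["chat 42: hello"]
}
```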
RustClaw supports OpenAI-compatible function calling (tool calling). You can register custom tools that the model can call:
```rust
use rustclaw_provider::{ToolFunction, ToolRegistry, ProviderService};
use rustclaw_types::Tool;
use anyhow::Result;

// Define a custom tool
pub struct WeatherTool;

impl ToolFunction for WeatherTool {
    fn definition(&self) -> Tool {
        Tool::function(
            "get_weather",
            "Get current weather for a location",
            serde_json::json!({
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "City and country, e.g. 'Paris, France'"
                    }
                },
                "required": ["location"],
                "additionalProperties": false
            }),
        )
    }

    fn execute(&self, args: serde_json::Value) -> Result<serde_json::Value> {
        let location = args["location"]
            .as_str()
            .ok_or_else(|| anyhow::anyhow!("missing 'location' argument"))?;
        // Your weather API logic here
        Ok(serde_json::json!({
            "location": location,
            "temperature": "22°C",
            "condition": "sunny"
        }))
    }
}

// Register tools
let mut registry = ToolRegistry::new();
registry.register(Box::new(WeatherTool));

// Create provider with tools
let service = ProviderService::with_tools(provider, registry);

// Use agentic loop for automatic tool execution
let response = service
    .complete_agentic(&messages, "What's the weather in Paris?", 5)
    .await?;
```

Built-in tools:

- `EchoTool` - Simple echo for testing
- `CurrentTimeTool` - Get current date/time
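The agentic loop behind `complete_agentic` can be sketched roughly as follows. This is an illustration of the pattern, not RustClaw's actual internals; `agentic_loop`, `ModelOutput`, and the mock model are all hypothetical:

```rust
/// The model either requests a tool call or returns a final answer;
/// tool results are fed back until an answer or the budget runs out.
#[derive(Debug, PartialEq)]
enum ModelOutput {
    ToolCall { name: String, args: String },
    Final(String),
}

fn agentic_loop(
    mut transcript: Vec<String>,
    model: impl Fn(&[String]) -> ModelOutput,
    run_tool: impl Fn(&str, &str) -> String,
    max_iters: usize,
) -> Option<String> {
    for _ in 0..max_iters {
        match model(&transcript) {
            ModelOutput::Final(text) => return Some(text),
            ModelOutput::ToolCall { name, args } => {
                // Execute the tool and feed its result back to the model.
                let result = run_tool(&name, &args);
                transcript.push(format!("tool:{name} -> {result}"));
            }
        }
    }
    None // iteration budget exhausted without a final answer
}

fn main() {
    // Mock model: asks for a weather tool once, then answers from its result.
    let model = |t: &[String]| {
        if t.iter().any(|m| m.starts_with("tool:get_weather")) {
            ModelOutput::Final("It is sunny in Paris.".into())
        } else {
            ModelOutput::ToolCall { name: "get_weather".into(), args: "Paris".into() }
        }
    };
    let run_tool = |_name: &str, args: &str| format!("{args}: sunny, 22C");

    let answer = agentic_loop(vec!["user: weather in Paris?".into()], model, run_tool, 5);
    println!("{answer:?}"); // Some("It is sunny in Paris.")
}
```

The `max_iters` cap (the `5` passed to `complete_agentic` above) bounds how many tool rounds a single request may trigger.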
RustClaw supports the Model Context Protocol (MCP) for extending AI capabilities with external tools:
MCP is an open protocol that enables AI models to interact with external tools and services. It provides:
- Tool Discovery: Automatic tool registration
- Secure Execution: Sandboxed tool invocation
- Multiple Transports: stdio (local) and HTTP (remote)
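Over the stdio transport, client and server exchange JSON-RPC 2.0 messages; the MCP specification defines methods such as `tools/list` and `tools/call`. A minimal sketch of the request framing, hand-formatted for illustration (real clients use a JSON serializer, and `jsonrpc_request` is a hypothetical helper):

```rust
/// Build a JSON-RPC 2.0 request as sent to a stdio MCP server.
/// `params` must already be valid JSON.
fn jsonrpc_request(id: u64, method: &str, params: &str) -> String {
    format!(r#"{{"jsonrpc":"2.0","id":{id},"method":"{method}","params":{params}}}"#)
}

fn main() {
    // Ask the server which tools it offers...
    println!("{}", jsonrpc_request(1, "tools/list", "{}"));
    // ...then invoke one of them.
    println!(
        "{}",
        jsonrpc_request(2, "tools/call", r#"{"name":"get_weather","arguments":{"location":"Paris"}}"#)
    );
}
```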
In rustclaw.toml:

```toml
[mcp]
startup_timeout = 10 # seconds

[mcp.servers]
# Filesystem access (stdio transport)
filesystem = "npx -y @modelcontextprotocol/server-filesystem /tmp"

# GitHub API (requires GITHUB_TOKEN env var)
github = "mcp-server-github"

# Remote server with authentication (HTTP transport)
[mcp.servers.web-search]
url = "https://api.example.com/mcp"
headers = { Authorization = "Bearer your_api_key" }

# With explicit command and environment
[mcp.servers.custom]
command = "mcp-server-custom"
args = ["--port", "3000"]
env = { API_KEY = "your_key" }
```

Supported transports:

- stdio - Local MCP servers (npm packages, Python scripts)
- HTTP - Remote MCP servers with Bearer token auth
- Streamable HTTP - Modern HTTP transport with streaming support
- `@modelcontextprotocol/server-filesystem` - File system operations
- `mcp-server-github` - GitHub API integration
- `mcp-server-postgres` - PostgreSQL database access
- `@z_ai/mcp-server` - Zhipu AI tools (web search, code execution)
RustClaw features a progressive disclosure skills system based on 2025-2026 AI agent research:
Skills are modular capabilities that extend your AI's expertise without bloating context:
- Phase 1 (Discovery): Load skill metadata (name + description) at startup (~50-100 tokens/skill)
- Phase 2 (Activation): Load full skill content on-demand when task matches
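The two phases amount to lazy loading keyed on the SKILL.md frontmatter (the `name` and `description` fields). A self-contained sketch, with all names hypothetical and an in-memory string standing in for the file on disk:

```rust
/// Illustrative sketch of two-phase skill loading; the real loader may differ.
struct Skill {
    name: String,         // Phase 1: parsed at startup
    description: String,  // Phase 1: parsed at startup
    body: Option<String>, // Phase 2: loaded on demand
    raw: String,          // stands in for the SKILL.md file on disk
}

impl Skill {
    /// Discovery: read only the frontmatter's name and description.
    fn discover(raw: &str) -> Option<Skill> {
        let mut lines = raw.lines();
        if lines.next()? != "---" {
            return None;
        }
        let (mut name, mut description) = (String::new(), String::new());
        for line in lines {
            if line == "---" {
                break;
            }
            if let Some(v) = line.strip_prefix("name:") {
                name = v.trim().to_string();
            } else if let Some(v) = line.strip_prefix("description:") {
                description = v.trim().to_string();
            }
        }
        Some(Skill { name, description, body: None, raw: raw.to_string() })
    }

    /// Activation: load the full instructions only when the task matches.
    fn activate(&mut self) -> &str {
        if self.body.is_none() {
            let body = self.raw.splitn(3, "---").nth(2).unwrap_or("").trim().to_string();
            self.body = Some(body);
        }
        self.body.as_deref().unwrap()
    }
}

fn main() {
    let raw = "---\nname: code-reviewer\ndescription: Reviews code.\n---\n# Code Reviewer";
    let mut skill = Skill::discover(raw).expect("valid SKILL.md");
    println!("{}: {}", skill.name, skill.description); // cheap metadata
    println!("{}", skill.activate());                  // full content on demand
}
```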
```bash
mkdir -p ~/.rustclaw/skills/code-reviewer
```

Create ~/.rustclaw/skills/code-reviewer/SKILL.md:

```markdown
---
name: code-reviewer
description: Reviews code for best practices. Use when reviewing code.
---

# Code Reviewer

## Instructions

When reviewing code:
1. Check for security vulnerabilities
2. Verify best practices
3. Suggest improvements

## Output Format

[Expected output structure]
```

In rustclaw.toml:

```toml
[skills]
directories = [
    "~/.rustclaw/skills",   # Personal skills
    "./.rustclaw/skills",   # Project skills (shared)
    "./examples/skills"     # Example skills
]
```

See examples/skills/ for:
- `code-reviewer` - Code review with best practices
- `generating-commit-messages` - Conventional commit messages
- `brainstorming` - Requirements exploration before implementation
For detailed documentation, see crates/rustclaw-skills/README.md.
```bash
# Run tests
cargo test

# Run clippy
cargo clippy

# Format code
cargo fmt

# Run in development mode
cargo run
```

Example systemd unit:

```ini
[Unit]
Description=RustClaw AI Assistant
After=network.target

[Service]
Type=simple
User=rustclaw
WorkingDirectory=/opt/rustclaw
ExecStart=/opt/rustclaw/rustclaw-gateway
Restart=on-failure
Environment="TELEGRAM_BOT_TOKEN=your_token"
Environment="OPENAI_API_KEY=your_key"

[Install]
WantedBy=multi-user.target
```

Docker:

```bash
# Build image
docker build -t rustclaw .

# Run container
docker run -d \
  -e TELEGRAM_BOT_TOKEN="your_token" \
  -e OPENAI_API_KEY="your_key" \
  -v rustclaw-data:/data \
  rustclaw
```

- OpenAI-compatible tool calling support
- MCP (Model Context Protocol) client support
- Skills system with progressive disclosure architecture
- Additional channels (Slack, Discord)
- Web UI for management
- Conversation export/import
- Multi-tenancy support
- Metrics and monitoring
- Hot configuration reload
- Dynamic skill loading from plugins
MIT License - see LICENSE for details.
- Inspired by OpenClaw
- Built with Rust and Tokio