Quick Start · Comparison · Features · Architecture · Configuration
5MB binary · ~3MB RAM · <10ms cold start · Zero CVEs · Zero dependencies
MarsClaw is a multi-agent AI runtime written in Rust. It connects to Claude, GPT, Gemini, and local models to help you code, automate tasks, and orchestrate multi-agent workflows — all from a single binary with no dependencies.
```
$ marsclaw "add error handling to main.rs"

> read_file
✓ read_file
> edit_file
✓ edit_file

Added error wrapping with anyhow to all three return paths in main().

── claude-sonnet-4 │ 1.2K in / 523 out │ $0.012 session ──
```
```sh
# Install
cargo install marsclaw

# Or download a binary
curl -sSfL https://marsclaw.dev/install.sh | sh

# Set your API key (pick one)
export ANTHROPIC_API_KEY="sk-ant-..."  # Anthropic
export GEMINI_API_KEY="..."            # Google Gemini
export OPENAI_API_KEY="sk-..."         # OpenAI

# Or use Ollama for free, fully offline — no key needed
```
```sh
# Interactive mode
marsclaw

# Single prompt
marsclaw chat "explain this codebase"

# Use with different providers
marsclaw -m claude-sonnet-4-20250514 "review this PR"
marsclaw -m gemini-2.5-flash "explain this code"
marsclaw -m gpt-4o "write tests for auth.rs"
marsclaw -m llama3.1 "refactor this function"  # Ollama, free

# Web UI
marsclaw serve --addr :8080

# Chat bots
marsclaw telegram  # Telegram bot
marsclaw discord   # Discord bot
marsclaw slack     # Slack bot

# Setup wizard
marsclaw init
```

| | Claude Code | Aider | Goose | Cursor | OpenHands | MarsClaw |
|---|---|---|---|---|---|---|
| Language | TypeScript | Python | Python | Electron | Python | Rust |
| Install size | ~200 MB | ~150 MB | ~120 MB | ~400 MB | ~2 GB | 5 MB |
| Memory (idle) | ~150 MB | ~120 MB | ~100 MB | ~500 MB | ~1 GB | ~3 MB |
| Cold start | ~3s | ~2s | ~2s | ~5s | ~10s | <10ms |
| Runtime deps | Node.js | Python + pip | Python + pip | Chromium | Docker | 0 |
| Single binary | No | No | No | No | No | Yes |
| Known CVEs | Inherits npm | Inherits pip | Inherits pip | Chromium | Docker | 0 |
| | Claude Code | Aider | Goose | Cursor | OpenHands | MarsClaw |
|---|---|---|---|---|---|---|
| Anthropic (native) | Yes | Yes | Yes | Yes | Yes | Yes |
| OpenAI | No | Yes | Yes | Yes | Yes | Yes |
| Gemini | No | Yes | Yes | No | Yes | Yes |
| Ollama (offline) | No | Yes | Yes | No | Yes | Yes |
| OpenAI-compatible | No | Yes | Yes | No | Yes | Yes |
MarsClaw supports 4 native providers + any OpenAI-compatible endpoint (Groq, Together, DeepSeek, Azure, vLLM, LM Studio).
| | Claude Code | Aider | Goose | Cursor | OpenHands | MarsClaw |
|---|---|---|---|---|---|---|
| Pipeline (sequential) | No | No | No | No | Yes | Yes |
| Parallel (fan-out) | No | No | No | No | No | Yes |
| Debate (adversarial) | No | No | No | No | No | Yes |
| Supervisor (coordinator) | No | No | Yes | No | Yes | Yes |
| Sub-agent delegation | No | No | No | No | No | Yes |
| | Claude Code | Aider | Goose | Cursor | OpenHands | MarsClaw |
|---|---|---|---|---|---|---|
| Telegram | No | No | No | No | No | Yes |
| Discord | No | No | No | No | No | Yes |
| Slack | No | No | No | No | No | Yes |
| WhatsApp | No | No | No | No | No | Yes |
| Instagram | No | No | No | No | No | Yes |
| | Claude Code | Aider | Goose | Cursor | OpenHands | MarsClaw |
|---|---|---|---|---|---|---|
| Web dashboard | No | No | No | Yes | Yes | Yes (embedded) |
| MCP client | Yes | No | Yes | No | No | Yes |
| Persistent memory | No | No | No | No | No | Yes |
| Skills / prompt packs | No | No | No | No | No | Yes (5 + installable) |
| Scheduled tasks | No | No | No | No | No | Yes (cron) |
| Cost tracking | Yes | Yes | No | No | No | Yes (daily/monthly) |
| Credential scanning | No | No | No | No | No | Yes |
| Tool approval | Yes | No | No | No | No | Yes (per danger level) |
| Session persistence | No | Yes | No | No | Yes | Yes (SQLite) |
| Hook system | No | No | No | No | No | Yes |
| Offline mode | No | Yes | Yes | No | No | Yes (Ollama) |
| Self-hosted | No | Yes | Yes | No | Yes | Yes |
| Open source | No | Yes | Yes | No | Yes | Yes (Apache-2.0) |
Connect to any major LLM provider — switch with a single flag:
- Anthropic — Claude Opus, Sonnet, Haiku (native Messages API with streaming)
- OpenAI — GPT-4o, GPT-4, o1 (+ any OpenAI-compatible endpoint)
- Google Gemini — Gemini 2.5 Flash, Pro
- Ollama — Llama 3, Mistral, CodeLlama, any local model (free, fully offline)
- Any OpenAI-compatible API — Groq, Together, DeepSeek, Azure, vLLM, LM Studio
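As a sketch of how an OpenAI-compatible endpoint could be wired into the provider config (the `base_url` field name and the Groq values below are assumptions for illustration; run `marsclaw init` to generate the real layout):

```yaml
providers:
  openai:
    api_key_env: GROQ_API_KEY                 # reuse the OpenAI provider with a different key
    base_url: https://api.groq.com/openai/v1  # hypothetical field name
    default_model: llama-3.1-70b-versatile
```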
Four built-in patterns for complex workflows:
- Pipeline — chain agents sequentially, output of one feeds the next
- Parallel — fan-out tasks to multiple agents, aggregate results
- Debate — adversarial multi-round discussion between agents with a judge
- Supervisor — coordinator agent delegates subtasks to specialist agents
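The first two patterns can be sketched in plain Rust. The `Agent` closure type below is illustrative, not MarsClaw's actual API:

```rust
// Sketch of the pipeline and fan-out ideas using closures and std threads.
// A real agent would call an LLM; here each agent is just a function.
type Agent = Box<dyn Fn(String) -> String + Send + Sync>;

/// Pipeline: each agent's output becomes the next agent's input.
fn pipeline(agents: &[Agent], input: String) -> String {
    agents.iter().fold(input, |acc, agent| agent(acc))
}

/// Parallel: fan the same input out to every agent, then collect results.
fn parallel(agents: Vec<Agent>, input: &str) -> Vec<String> {
    let handles: Vec<_> = agents
        .into_iter()
        .map(|agent| {
            let input = input.to_string();
            std::thread::spawn(move || agent(input))
        })
        .collect();
    handles.into_iter().map(|h| h.join().unwrap()).collect()
}

fn main() {
    let summarize: Agent = Box::new(|s| format!("summary({s})"));
    let review: Agent = Box::new(|s| format!("review({s})"));
    // Sequential: review runs on summarize's output.
    println!("{}", pipeline(&[summarize, review], "draft".into()));

    let a: Agent = Box::new(|s| format!("A:{s}"));
    let b: Agent = Box::new(|s| format!("B:{s}"));
    // Parallel: both agents see the same input.
    println!("{:?}", parallel(vec![a, b], "task"));
}
```

Debate and supervisor build on the same idea: debate loops agents against each other with a judge agent aggregating, and supervisor is a fan-out where a coordinator decides which specialists to invoke.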
| Tool | Description |
|---|---|
| read_file | Read files with line ranges |
| write_file | Create or overwrite files |
| edit_file | Surgical find-and-replace edits |
| shell | Execute shell commands with timeout |
| list_files | Recursive directory listing with glob |
| search | Content search across files (regex) |
| git | Read-only git operations (log, diff, status) |
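To illustrate what "surgical" means for `edit_file`, here is a sketch of one common semantic for such tools: the edit applies only when the search string matches exactly once, so ambiguous replacements are rejected. Whether MarsClaw enforces this exact rule is an assumption:

```rust
// Illustrative find-and-replace with a uniqueness check, not MarsClaw's
// implementation: zero or multiple matches are errors, one match is applied.
fn surgical_replace(content: &str, find: &str, replace: &str) -> Result<String, String> {
    match content.matches(find).count() {
        0 => Err(format!("no match for {find:?}")),
        1 => Ok(content.replacen(find, replace, 1)),
        n => Err(format!("{n} matches for {find:?}; be more specific")),
    }
}

fn main() {
    let src = "fn main() {\n    run();\n}\n";
    // Exactly one occurrence of "run();", so the edit succeeds.
    println!("{}", surgical_replace(src, "run();", "run()?;").unwrap());
}
```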
Deploy your agent to any messaging platform:
- Telegram — long-polling bot with /start, /clear, /help commands
- Discord — Gateway WebSocket with real-time messaging
- Slack — Socket Mode with event-driven responses
- WhatsApp — Cloud API webhook (auto-mounts on serve)
- Instagram — Messenger API integration
- Web dashboard — embedded single-page UI, zero frontend build needed
- Skills system — 5 built-in prompt packs (coder, devops, writer, analyst, compliance) + installable
- Scheduler — cron expressions and interval-based task automation
- MCP support — JSON-RPC 2.0 client for Zapier, n8n, filesystem, custom MCP servers
- Persistent memory — episodic, semantic, and procedural memory backed by SQLite
- Hook system — before/after tool calls, LLM calls, and error events
- Security — credential scanning, path traversal guards, per-danger-level tool approval
- Cost tracking — per-model pricing with daily and monthly budget limits
- SQLite persistence — conversation history and sessions with zero config
- Config — YAML file + `MARSCLAW_*` env var overrides
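Per-model cost tracking reduces to a small calculation over token counts. The sketch below uses Anthropic's published Sonnet rates ($3/M input, $15/M output) as an assumed pricing table, which roughly reproduces the $0.012 session line in the example at the top:

```rust
// Illustrative per-turn cost calculation; the pricing values are assumptions,
// not MarsClaw's built-in pricing table.
struct ModelPricing {
    input_per_mtok: f64,  // USD per million input tokens
    output_per_mtok: f64, // USD per million output tokens
}

fn turn_cost(p: &ModelPricing, input_tokens: u64, output_tokens: u64) -> f64 {
    (input_tokens as f64 * p.input_per_mtok
        + output_tokens as f64 * p.output_per_mtok)
        / 1_000_000.0
}

fn main() {
    let sonnet = ModelPricing { input_per_mtok: 3.0, output_per_mtok: 15.0 };
    // 1.2K in / 523 out, as in the session footer above (≈ $0.011).
    let cost = turn_cost(&sonnet, 1200, 523);
    println!("${cost:.3}");
}
```

Daily and monthly budget limits would then just accumulate these per-turn costs and refuse new turns once the configured ceiling is reached.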
```text
marsclaw [OPTIONS] [COMMAND]

Commands:
  chat             Chat interactively or run a single prompt
  serve            Start HTTP server + Web UI
  telegram         Run as Telegram bot
  discord          Run as Discord bot
  slack            Run as Slack bot
  whatsapp         Run WhatsApp webhook bot
  channels add     Connect a messaging channel
  channels list    Show configured channels
  channels remove  Remove a channel
  skills list      Show available skills
  skills install   Install a skill from URL
  skills use       Set the active skill
  init             Interactive setup wizard

Options:
  -c, --config   Config file path
  -m, --model    Override model (e.g., claude-sonnet-4-20250514)
  -v, --verbose  Debug logging
  -h, --help     Print help
  -V, --version  Print version
```
```text
src/          10,500+ lines of Rust
  main.rs     CLI entry point (clap)
  agent/      Agent loop, context builder, sub-agent orchestrator
  bots/       Telegram, Discord, Slack, WhatsApp + channel management
  config/     YAML + env var configuration + setup wizard
  llm/        4 providers (Anthropic, OpenAI, Gemini, Ollama) + cost + retry
  platform/   Memory, hooks, MCP, scheduler, skills, security
  server/     HTTP server (axum) + embedded Web UI + terminal REPL
  store/      SQLite persistence (rusqlite)
  tool/       7 built-in tools (read, write, edit, shell, list, search, git)
  types/      Shared types and traits
```
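The hook system in `platform/` fires around tool calls, LLM calls, and errors. A minimal sketch of the before/after-tool shape, with hypothetical trait and method names (the real interface may differ):

```rust
// Hypothetical hook interface; names are assumptions for illustration.
trait Hook {
    fn before_tool(&self, tool: &str) {
        let _ = tool; // default: do nothing
    }
    fn after_tool(&self, tool: &str, output: &str) {
        let _ = (tool, output); // default: do nothing
    }
}

/// A hook that logs tool activity, like the "> tool / ✓ tool" session output.
struct Logger;
impl Hook for Logger {
    fn before_tool(&self, tool: &str) {
        println!("> {tool}");
    }
    fn after_tool(&self, tool: &str, _output: &str) {
        println!("✓ {tool}");
    }
}

/// Run a tool with every registered hook invoked before and after.
fn run_tool(hooks: &[Box<dyn Hook>], tool: &str) -> String {
    for h in hooks {
        h.before_tool(tool);
    }
    let output = format!("{tool} ran"); // stand-in for real tool execution
    for h in hooks {
        h.after_tool(tool, &output);
    }
    output
}

fn main() {
    let hooks: Vec<Box<dyn Hook>> = vec![Box::new(Logger)];
    run_tool(&hooks, "read_file");
}
```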
Config lives at `~/.marsclaw/config.yaml`:

```yaml
providers:
  default: anthropic
  anthropic:
    api_key_env: ANTHROPIC_API_KEY
    default_model: claude-sonnet-4-20250514
  gemini:
    api_key_env: GEMINI_API_KEY
    default_model: gemini-2.5-flash
  openai:
    api_key_env: OPENAI_API_KEY
    default_model: gpt-4o
  ollama:
    default_model: llama3.1

agent:
  max_turns: 25
  enable_streaming: true
  temperature: 0.0

cost:
  daily_budget: 10.0
  monthly_budget: 100.0

security:
  scan_credentials: true
  path_traversal_guard: true

# MCP servers
mcp:
  - name: n8n
    command: npx
    args: ["-y", "@anthropic/mcp-n8n", "--webhook-url", "http://localhost:5678"]

# WhatsApp webhook (auto-mounts on serve)
whatsapp:
  phone_number_id: "123456789"
  access_token: "EAAx..."
  verify_token: "marsclaw_verify"

# Scheduled tasks
scheduler:
  tasks:
    - id: daily-report
      name: "Daily Summary"
      schedule: "0 9 * * *"
      prompt: "Generate a daily summary of recent changes"
      channel: log
```

Environment variables override config: `MARSCLAW_PROVIDER=ollama`, `MARSCLAW_MODEL=llama3.1`, etc.
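The precedence rule (env var beats YAML default) can be sketched like this; `effective_model` is an illustrative helper, not part of MarsClaw's codebase:

```rust
use std::env;

// Sketch of env-over-config precedence: when MARSCLAW_MODEL is set it wins,
// otherwise the config file's default applies.
fn effective_model(env_value: Option<String>, config_default: &str) -> String {
    env_value.unwrap_or_else(|| config_default.to_string())
}

fn main() {
    let model = effective_model(env::var("MARSCLAW_MODEL").ok(), "claude-sonnet-4-20250514");
    println!("{model}");
}
```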
```sh
git clone https://github.com/Marsstein/marsclaw-rs.git
cd marsclaw-rs
cargo build            # debug build
cargo test             # run 43 tests
cargo clippy           # lint
cargo build --release  # optimized 5MB binary
```

Apache-2.0