A beehive where hundreds of drones program in parallel.
DevClaw's open-source kernel: an autonomous AI coding agent that takes a task description, picks tools, edits files, runs commands, and reports back. Single Go binary. Zero dependencies. Cross-platform.
DevClaw is an open-source autonomous coding agent in the spirit of Claude Code and Aider, with three differentiators:
- Swarm-native: built-in `Agent` and `Parallel` tools let one drone spawn sub-drones (up to 5 parallel), forming a task tree.
- Role + permission system: five roles (`dev`, `test`, `ops`, `sense`, `scout`) × three permission tiers (`readonly`, `workspace_write`, `full_access`) enforced as hard constraints.
- Project knowledge built in: `DRONE.md` (like `CLAUDE.md`) and `.drone/skills/*.md` are auto-injected into the system prompt.
Everything here is Apache-2.0 (free forever, no asterisk):
- Runtime: agent loop, context compression, trajectory logging
- 13 built-in tools: `bash`, `file_read`/`file_write`, `multi_edit`, `agent`, `parallel`, `undo`, `bash_approval`, …
- 5 roles: `dev`, `test`, `ops`, `sense`, `scout` (each with its own default tools + permissions)
- MCP client: connect to any Model Context Protocol server over stdio
- Provider: any OpenAI-compatible LLM (Ollama, OpenAI, StarAI, DeepSeek, …) with retry + streaming
- Git worktree isolation: automatic per-task isolation so drones never step on each other
- CLI: `drone run`, `drone roles`, `drone version` (one Go binary, zero deps, cross-platform)
The StarClaw team maintains some closed-source tooling that wraps this kernel for internal use. It is not open-source and not for sale:
- Forge: issue tracking
- Pheromone: event bus
- Overlord: fleet orchestration
- Abathur: skill distillation
You don't need any of them. The kernel in this repo is fully functional standalone. If you want similar functionality, fork it; Apache-2.0 lets you build anything on top.
```shell
# From source (requires Go 1.24+)
go install github.com/yinhe/devclaw/cmd/drone@latest

# Or build manually
git clone https://github.com/yinhe/devclaw.git
cd devclaw
go build -o drone ./cmd/drone
```

```shell
# Option A: OpenAI / StarAI / any OpenAI-compatible endpoint
export DRONE_API_KEY=sk-xxx
export DRONE_BASE_URL=https://api.openai.com/v1  # or your provider
export DRONE_MODEL=gpt-4o

# Option B: Local Ollama (zero cost, offline)
ollama pull qwen3-coder
# DRONE_API_KEY and DRONE_BASE_URL auto-default to Ollama if unset
```

```shell
drone run --task "Add an English README and update the docs link"
drone run --task-file task.md --role dev --worktree
drone run "refactor this function to use generics" --quiet
```

```shell
$ drone roles
Available roles:
  dev    Software development - architecture, coding, debugging, documentation (permission: workspace_write)
  test   Testing - test creation, regression testing, coverage analysis (permission: readonly)
  ops    Operations - deployment, health checks, infrastructure management (permission: full_access)
  sense  Sensing - feedback collection, anomaly detection, insight generation (permission: readonly)
  scout  Scouting - data collection, competitor analysis, external research (permission: readonly)
```

Each role auto-injects its own system-prompt section. Permission tiers gate which tools the agent can call (e.g., `readonly` cannot use `Write`/`Edit`).
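The tier gating described above can be sketched as an ordered enum plus a lookup; the names and maps below are illustrative, not DevClaw's actual API:

```go
package main

import "fmt"

// Permission tiers, ordered from least to most privileged, so a
// simple >= comparison expresses "at least this tier".
type Permission int

const (
	ReadOnly Permission = iota
	WorkspaceWrite
	FullAccess
)

// Default tier per role, mirroring the `drone roles` output above.
var rolePermission = map[string]Permission{
	"dev":   WorkspaceWrite,
	"test":  ReadOnly,
	"ops":   FullAccess,
	"sense": ReadOnly,
	"scout": ReadOnly,
}

// Minimum tier a few tools require (a subset, for illustration).
var toolMinPermission = map[string]Permission{
	"Read":  ReadOnly,
	"Write": WorkspaceWrite,
	"Edit":  WorkspaceWrite,
}

// canCall enforces the tier as a hard constraint: a role may call a
// tool only if its tier meets the tool's minimum.
func canCall(role, tool string) bool {
	return rolePermission[role] >= toolMinPermission[tool]
}

func main() {
	fmt.Println(canCall("dev", "Write"))  // true
	fmt.Println(canCall("test", "Write")) // false
}
```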
| Tool | Description | Min permission |
|---|---|---|
| `Read` | Read file with line numbers, offset, limit | `readonly` |
| `ListDir` | List directory contents | `readonly` |
| `Glob` | Pattern-based file search | `readonly` |
| `Grep` | Content search using ripgrep semantics | `readonly` |
| `Bash` | Run shell command (timeout + output truncation) | `readonly`+ |
| `Write` | Create/overwrite a file | `workspace_write` |
| `Edit` | Exact string replace (unique match required) | `workspace_write` |
| `MultiEdit` | Multiple edits to one file (atomic) | `workspace_write` |
| `Patch` | Before/after block replace | `workspace_write` |
| `Undo` | Revert the most recent file modification | `workspace_write` |
| `Agent` | Spawn a sub-drone for a focused subtask | `workspace_write` |
| `Parallel` | Spawn up to 5 sub-drones concurrently | `workspace_write` |
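The "unique match required" rule on `Edit` is worth spelling out: zero matches and multiple matches are both rejected, so the model can never silently edit the wrong location. A minimal sketch (hypothetical helper, not DevClaw's actual implementation):

```go
package main

import (
	"errors"
	"fmt"
	"strings"
)

// editExact replaces oldStr with newStr, but only when oldStr occurs
// exactly once in content; otherwise the edit is refused.
func editExact(content, oldStr, newStr string) (string, error) {
	switch n := strings.Count(content, oldStr); n {
	case 0:
		return "", errors.New("edit: string not found")
	case 1:
		return strings.Replace(content, oldStr, newStr, 1), nil
	default:
		return "", fmt.Errorf("edit: %d matches, expected exactly one", n)
	}
}

func main() {
	src := "a := 1\nb := 1\n"

	// ":= 1" occurs twice, so this edit is rejected as ambiguous.
	if _, err := editExact(src, ":= 1", ":= 2"); err != nil {
		fmt.Println(err)
	}

	// "a := 1" is unique, so this edit succeeds.
	out, _ := editExact(src, "a := 1", "a := 2")
	fmt.Print(out)
}
```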
External MCP tools are loaded from `.drone/mcp.json` (stdio protocol).
Drop a `DRONE.md` (or `.drone/DRONE.md`) at your project root and DevClaw will auto-inject it into every task's system prompt. Example:

```markdown
# Project conventions
- Go 1.24, module path: github.com/myorg/myapp
- Test command: `go test ./...`
- Style: gofmt, no global state, table-driven tests
- Deploy: `git push origin main` triggers CI

# Architecture
- `cmd/server`: HTTP entry point
- `internal/`: business logic
- `pkg/`: exported helpers
```

Drop domain-specific skills into `.drone/skills/*.md`; each Markdown file is appended to the system prompt as a discrete skill block.
Create `.drone/mcp.json`:

```json
{
  "servers": [
    {"name": "fs", "command": "npx", "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]},
    {"name": "github", "command": "npx", "args": ["-y", "@modelcontextprotocol/server-github"]}
  ]
}
```

DevClaw will spawn each server over stdio and register all of its tools under namespaced names (`fs__read_file`, `github__create_issue`, …).
Inside a task, the model can call `Agent` (one sub-drone) or `Parallel` (up to five concurrent sub-drones). Sub-drones inherit the parent's tool registry but get their own scratch context, role, and 20-turn cap.
```
parent drone (dev)
├── sub-drone (dev)    "implement login API"
├── sub-drone (test)   "write tests for login"
└── sub-drone (scout)  "research bcrypt vs argon2"
```
Combine with `--worktree` and each sub-drone runs in its own git branch, so you can discard a failed tree with no cleanup needed.
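The fan-out with its hard cap of five can be sketched with a counting semaphore; this is illustrative only (DevClaw's real `Parallel` tool lives in `internal/tool`):

```go
package main

import (
	"fmt"
	"sync"
)

const maxParallel = 5 // hard cap on concurrent sub-drones

// runParallel runs each subtask in its own goroutine, never more than
// maxParallel at once, and collects results in input order.
func runParallel(tasks []string, run func(string) string) []string {
	results := make([]string, len(tasks))
	sem := make(chan struct{}, maxParallel)
	var wg sync.WaitGroup
	for i, t := range tasks {
		wg.Add(1)
		sem <- struct{}{} // acquire a slot (blocks when 5 are running)
		go func(i int, t string) {
			defer wg.Done()
			defer func() { <-sem }() // release the slot
			results[i] = run(t)      // each goroutine writes its own index
		}(i, t)
	}
	wg.Wait()
	return results
}

func main() {
	tasks := []string{
		"implement login API",
		"write tests for login",
		"research bcrypt vs argon2",
	}
	out := runParallel(tasks, func(t string) string { return "done: " + t })
	fmt.Println(out)
}
```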
```
.
├── cmd/drone/       CLI entry point
├── internal/
│   ├── runtime/     LLM <-> tool agent loop, context compression, git ctx
│   ├── tool/        13 built-in tools + Agent/Parallel
│   ├── role/        5 roles with system-prompt fragments
│   ├── mcp/         Model Context Protocol stdio client
│   ├── provider/    OpenAI-compatible LLM provider + retry
│   ├── config/      Config + DRONE.md/skills loading
│   └── worktree/    Git worktree isolation
└── go.mod
```
Done (shipped in v0.1.x):
- Runtime with 13 built-in tools + Agent/Parallel fan-out
- 5 roles × 3 permission tiers
- MCP stdio client
- Git worktree isolation
- Trajectory logging
- Cross-platform release binaries (Linux / macOS / Windows × amd64 / arm64)
Next up (OSS only; contributions welcome):
- More providers: Anthropic-native, Gemini-native, llama.cpp
- More tools: `web_fetch`, `browser`, `screenshot`, `sql_query`
- More roles: `frontend`, `backend`, `security-reviewer`
- Homebrew tap + Scoop bucket for one-command install
- Snapshot releases on every `main` commit
The StarClaw team works on additional (closed-source) tooling on top of this kernel internally. Those plans are not part of this repo's roadmap and are not committed to any public release schedule.
Apache-2.0. See LICENSE.
DevClaw is part of the StarClaw ecosystem.