A complete Rust rewrite of OpenCode — the open-source AI coding agent.
We're rebuilding OpenCode from the ground up in 100% Rust for better performance, lower memory footprint, and a truly native experience.
Note: This is the open-source backend SDK derived from OpenCode. Orbit (our Unified Development Environment) is a separate, proprietary product that uses this SDK.
| Aspect | TypeScript/Bun | Rust |
|---|---|---|
| Memory | ~200-500MB runtime | ~20-50MB |
| Startup | 1-3 seconds | <100ms |
| Binary Size | Node/Bun runtime required | Single static binary |
| Native Integration | FFI overhead | Zero-cost Tauri integration |
| Concurrency | Event loop limitations | True parallelism with async/await |
By rewriting in Rust, we get:
- Instant startup — no runtime initialization
- Minimal memory — critical for running alongside heavy IDEs
- Native Tauri integration — seamless desktop app without IPC overhead
- Single binary distribution — no dependencies, no node_modules
- True concurrency — parallel tool execution without blocking
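To make the concurrency point concrete, here is a minimal sketch (using only the standard library, with made-up stand-ins for two tools) of the kind of parallel tool execution Rust makes straightforward — two independent operations run on separate OS threads and complete in roughly the time of the slower one, not their sum:

```rust
use std::thread;
use std::time::{Duration, Instant};

// Hypothetical stand-ins for two independent agent tools;
// each simulates ~100ms of work.
fn run_glob() -> usize {
    thread::sleep(Duration::from_millis(100));
    42
}

fn run_grep() -> usize {
    thread::sleep(Duration::from_millis(100));
    7
}

fn main() {
    let start = Instant::now();
    // Spawn both tools on real OS threads — no single event loop to block.
    let glob = thread::spawn(run_glob);
    let grep = thread::spawn(run_grep);
    let (files, lines) = (glob.join().unwrap(), grep.join().unwrap());
    // Wall time is ~100ms (parallel), not ~200ms (sequential).
    println!(
        "glob: {files} files, grep: {lines} lines, elapsed: {:?}",
        start.elapsed()
    );
}
```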
Orbit Rust SDK is an open-source AI agent backend that handles:
- LLM provider communication — Anthropic, OpenAI, Google, AWS Bedrock, local models
- Tool execution — bash, file operations, code analysis
- Session management — conversation state and context
- Real-time streaming — event handling and SSE
- OAuth authentication — Claude Pro, ChatGPT Plus, GitHub Copilot
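On the real-time streaming point: server-sent events have a simple text wire format (`event:` and `data:` fields, terminated by a blank line). The sketch below shows standard SSE framing; the event name used is illustrative and not necessarily the SDK's actual event vocabulary:

```rust
// Frame a payload as a server-sent event per the SSE wire format:
// an "event:" line, a "data:" line, and a blank line terminator.
fn sse_frame(event: &str, data: &str) -> String {
    format!("event: {event}\ndata: {data}\n\n")
}

fn main() {
    // Illustrative event name; not confirmed as the SDK's wire format.
    let frame = sse_frame("message.delta", r#"{"text":"Hello"}"#);
    assert!(frame.ends_with("\n\n"));
    print!("{frame}");
}
```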
| Feature | Orbit SDK | Claude Agent SDK | Codex CLI | OpenCode |
|---|---|---|---|---|
| Language | Rust | Python/TypeScript | TypeScript | TypeScript |
| Memory Usage | ~20-50MB | ~200-400MB | ~150-300MB | ~200-500MB |
| Startup Time | <100ms | 1-2s | 1-2s | 1-3s |
| Binary Distribution | Single static binary | Runtime required | Runtime required | Runtime required |
| Provider Agnostic | ✅ All major providers | ❌ Anthropic only | ❌ OpenAI only | ✅ All major providers |
| Open Source | ✅ MIT | ❌ Proprietary | ✅ Apache 2.0 | ✅ MIT |
| Tool Execution | ✅ Parallel | ✅ Sequential | ✅ Sequential | ✅ Sequential |
Once released, install via cargo:
```shell
cargo install orbit-sdk
```

Or add to your `Cargo.toml`:

```toml
[dependencies]
orbit-sdk = "0.1"
```

```rust
use orbit_sdk::{Agent, Provider};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Initialize with your preferred provider
    let agent = Agent::builder()
        .provider(Provider::Anthropic)
        .model("claude-sonnet-4-20250514")
        .build()?;

    // Send a message
    let response = agent
        .chat("Help me refactor this function")
        .await?;

    println!("{}", response);
    Ok(())
}
```

| Provider | Status | Models |
|---|---|---|
| Anthropic | ✅ | Claude 4, Claude 3.5, Claude 3 |
| OpenAI | ✅ | GPT-4o, GPT-4, o1, o3 |
| Google | ✅ | Gemini 2.0, Gemini 1.5 |
| AWS Bedrock | ✅ | Claude, Titan, Llama |
| Ollama | ✅ | Any local model |
| OpenRouter | ✅ | 100+ models |
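Provider-agnostic design typically reduces to dispatching on a provider enum. This is a self-contained sketch of that idea — the enum and helper below are illustrative, not the actual `orbit-provider` API, though the endpoints shown are each provider's well-known defaults:

```rust
// Illustrative provider enum; the SDK's real type may differ.
#[derive(Debug)]
enum Provider {
    Anthropic,
    OpenAi,
    Google,
    Bedrock,
    Ollama,
    OpenRouter,
}

// Map each provider to its default API endpoint.
fn base_url(p: &Provider) -> &'static str {
    match p {
        Provider::Anthropic => "https://api.anthropic.com",
        Provider::OpenAi => "https://api.openai.com",
        Provider::Google => "https://generativelanguage.googleapis.com",
        Provider::Bedrock => "https://bedrock-runtime.us-east-1.amazonaws.com",
        Provider::Ollama => "http://localhost:11434",
        Provider::OpenRouter => "https://openrouter.ai/api",
    }
}

fn main() {
    // The rest of the agent never needs to know which branch it hit.
    println!("{}", base_url(&Provider::Anthropic));
}
```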
```
orbit-sdk/
├── orbit-core/      # Foundation utilities (IDs, locks, filesystem)
├── orbit-bus/       # Event pub/sub system
├── orbit-storage/   # JSON file storage
├── orbit-config/    # Configuration management
├── orbit-provider/  # AI provider adapters
├── orbit-session/   # Session management
├── orbit-tools/     # Agent tools (bash, file ops, LSP)
├── orbit-mcp/       # Model Context Protocol client
└── orbit-server/    # HTTP API with SSE
```
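The `orbit-bus` crate's role — decoupling event producers (sessions, tools) from consumers (the SSE layer) — can be sketched with a standard-library channel. The event variants here are hypothetical; the real event set is not part of this README:

```rust
use std::sync::mpsc;

// Hypothetical event type; the real orbit-bus event set is an assumption.
#[derive(Debug, Clone, PartialEq)]
enum Event {
    SessionStarted(String),
    ToolFinished { tool: String, exit: i32 },
}

fn main() {
    let (tx, rx) = mpsc::channel::<Event>();

    // A publisher (e.g. the tools layer) emits events...
    tx.send(Event::SessionStarted("s-01".into())).unwrap();
    tx.send(Event::ToolFinished { tool: "bash".into(), exit: 0 }).unwrap();
    drop(tx); // close the channel so the subscriber's iterator ends

    // ...and a subscriber (e.g. the server's SSE layer) drains them in order.
    let events: Vec<Event> = rx.iter().collect();
    assert_eq!(events.len(), 2);
    assert_eq!(events[0], Event::SessionStarted("s-01".into()));
    println!("{events:?}");
}
```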
| Tool | Description |
|---|---|
| `bash` | Execute shell commands with streaming output |
| `read` | Read files with line number support |
| `write` | Create new files |
| `edit` | Apply precise edits to existing files |
| `glob` | Find files by pattern |
| `grep` | Search file contents with regex |
| `lsp` | Language Server Protocol integration |
| `mcp` | Model Context Protocol servers |
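Tools like these usually share one interface so the agent can invoke them uniformly. The trait and `glob` stub below are a hedged sketch of that pattern — the names and signatures are assumptions, not the actual `orbit-tools` API:

```rust
// Hypothetical common interface; the real orbit-tools trait may differ.
trait Tool {
    fn name(&self) -> &'static str;
    fn run(&self, input: &str) -> Result<String, String>;
}

struct Glob;

impl Tool for Glob {
    fn name(&self) -> &'static str {
        "glob"
    }

    fn run(&self, pattern: &str) -> Result<String, String> {
        if pattern.is_empty() {
            return Err("empty pattern".into());
        }
        // A real implementation would walk the filesystem;
        // this stub just echoes the pattern for illustration.
        Ok(format!("matching files for `{pattern}`"))
    }
}

fn main() {
    // The agent can hold every tool behind the same trait object.
    let tools: Vec<Box<dyn Tool>> = vec![Box::new(Glob)];
    for tool in &tools {
        let out = tool.run("src/**/*.rs").unwrap();
        println!("{}: {}", tool.name(), out);
    }
}
```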
Create `orbit.toml` in your project root:

```toml
[provider]
default = "anthropic"
model = "claude-sonnet-4-20250514"

[tools]
bash = { enabled = true, timeout = 120 }
read = { enabled = true }
write = { enabled = true }
edit = { enabled = true }

[permissions]
auto_approve = ["read", "glob", "grep"]
require_approval = ["bash", "write", "edit"]
```

Or use environment variables:
```shell
export ANTHROPIC_API_KEY="sk-ant-..."
export OPENAI_API_KEY="sk-..."
export ORBIT_MODEL="claude-sonnet-4-20250514"
```

We welcome contributions! Whether you're:
- Porting TypeScript to Rust — Pick a module and start converting
- Adding features — New tools, providers, or improvements
- Fixing bugs — In either the TS or Rust codebase
- Improving docs — Better examples, guides, and explanations
This SDK is free and open source. Use it, fork it, build on it.
Special thanks to OpenCode and Anomaly (SST) for building the original open-source AI coding agent.
OpenCode proved that a fully open-source, provider-agnostic AI agent could rival proprietary solutions. We're taking their excellent work and rebuilding it in Rust to push performance even further.
MIT License
Copyright (c) 2025 Recursive Labs
Copyright (c) 2024-2025 SST/OpenCode (Original Work)
See LICENSE for details.
Website • Documentation • GitHub • Discord