Harbor is an AI-first Rust framework for building reusable AI solutions with first-class MCP support.
It is inspired by modern AI application platforms and documentation patterns like those in Coldbrew Cloud: a single developer surface for:
- model providers
- tools and integrations
- memory/session state
- workflows and agents
- MCP server creation
- MCP client integration
- CLI scaffolding for new AI projects
## Crates

### harbor-core
- tool traits
- tool registry
- application blueprint/builder
- shared framework errors
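To make the shape of these pieces concrete, here is a minimal sketch of how a tool trait and registry could compose. All names here (`Tool`, `ToolRegistry`, `Shout`) are illustrative, not Harbor's actual API:

```rust
use std::collections::HashMap;

// Illustrative tool trait: a named capability mapping a string input to a
// string output. A real trait would likely add typed schemas, async
// execution, and richer errors.
trait Tool {
    fn name(&self) -> &str;
    fn call(&self, input: &str) -> Result<String, String>;
}

// Illustrative registry: tools are stored by name and dispatched at runtime.
#[derive(Default)]
struct ToolRegistry {
    tools: HashMap<String, Box<dyn Tool>>,
}

impl ToolRegistry {
    fn register(&mut self, tool: Box<dyn Tool>) {
        self.tools.insert(tool.name().to_string(), tool);
    }

    fn call(&self, name: &str, input: &str) -> Result<String, String> {
        match self.tools.get(name) {
            Some(tool) => tool.call(input),
            None => Err(format!("unknown tool: {name}")),
        }
    }
}

// Example tool: echoes its input back in upper case.
struct Shout;

impl Tool for Shout {
    fn name(&self) -> &str { "shout" }
    fn call(&self, input: &str) -> Result<String, String> {
        Ok(input.to_uppercase())
    }
}

fn main() {
    let mut registry = ToolRegistry::default();
    registry.register(Box::new(Shout));
    println!("{}", registry.call("shout", "hello").unwrap()); // prints "HELLO"
}
```

Lookup-by-name dispatch through a registry is the core idea; the blueprint/builder layers configuration on top of it.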
### harbor-ai
- provider abstraction
- chat/message types
- structured completion request/response model
- structured event streaming envelopes with run/stream IDs, sequencing, and delta offsets
- mock provider for local development and tests
- OpenAI-compatible provider client
- Anthropic provider client
- Ollama provider client
- provider retry / backoff / timeout / fallback helpers
- provider circuit-breaker suppression helper
- structured output helpers
- outbound trace-context injection on provider HTTP calls
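As an example of what a retry/backoff helper can look like, here is a self-contained sketch. `retry_with_backoff` is a hypothetical name; Harbor's real helpers are presumably async and provider-aware:

```rust
use std::thread::sleep;
use std::time::Duration;

// Illustrative retry helper: retries a fallible operation with exponential
// backoff, returning the last error once attempts are exhausted.
fn retry_with_backoff<T, E>(
    max_attempts: u32,
    base_delay: Duration,
    mut op: impl FnMut() -> Result<T, E>,
) -> Result<T, E> {
    let mut attempt = 0;
    loop {
        match op() {
            Ok(value) => return Ok(value),
            Err(err) => {
                attempt += 1;
                if attempt >= max_attempts {
                    return Err(err);
                }
                // Exponential backoff: base, 2x base, 4x base, ...
                sleep(base_delay * 2u32.pow(attempt - 1));
            }
        }
    }
}

fn main() {
    let mut calls = 0;
    // Fails twice, then succeeds; with 3 attempts this returns Ok.
    let result = retry_with_backoff(3, Duration::from_millis(1), || {
        calls += 1;
        if calls < 3 { Err("transient") } else { Ok("response") }
    });
    println!("{result:?} after {calls} calls"); // prints: Ok("response") after 3 calls
}
```

A circuit breaker would sit above this loop, short-circuiting calls entirely while a provider is known to be unhealthy.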
### harbor-memory
- session memory trait
- in-memory implementation
- file-backed persistent session memory
- versioned persistence manifest + bootstrap migration seam
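A minimal sketch of the session memory idea, assuming a hypothetical `SessionMemory` trait; the in-memory variant is shown, and a file-backed variant would persist the same history:

```rust
use std::collections::HashMap;

// Illustrative session memory trait: append-only message history keyed
// by session ID. Names are assumptions, not Harbor's real API.
trait SessionMemory {
    fn append(&mut self, session_id: &str, message: String);
    fn history(&self, session_id: &str) -> Vec<String>;
}

// In-memory implementation backed by a HashMap.
#[derive(Default)]
struct InMemorySessionMemory {
    sessions: HashMap<String, Vec<String>>,
}

impl SessionMemory for InMemorySessionMemory {
    fn append(&mut self, session_id: &str, message: String) {
        self.sessions
            .entry(session_id.to_string())
            .or_default()
            .push(message);
    }

    fn history(&self, session_id: &str) -> Vec<String> {
        self.sessions.get(session_id).cloned().unwrap_or_default()
    }
}

fn main() {
    let mut mem = InMemorySessionMemory::default();
    mem.append("s1", "user: hi".to_string());
    mem.append("s1", "assistant: hello".to_string());
    println!("{} message(s)", mem.history("s1").len()); // prints "2 message(s)"
}
```

Keeping the trait small like this is what lets in-memory and file-backed stores swap behind the same interface.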
### harbor-rag
- document store abstraction
- in-memory + file-backed document stores
- document chunking helpers
- lexical retrieval
- prompt injection helpers for retrieved context
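Chunking with overlap is one common approach to preparing documents for retrieval; the following is a sketch under that assumption (`chunk_text` is illustrative, not Harbor's API):

```rust
// Illustrative chunking helper: split text into fixed-size chunks with a
// small overlap so retrieval does not lose context at chunk boundaries.
fn chunk_text(text: &str, chunk_size: usize, overlap: usize) -> Vec<String> {
    assert!(overlap < chunk_size, "overlap must be smaller than chunk_size");
    let chars: Vec<char> = text.chars().collect();
    let mut chunks = Vec::new();
    let mut start = 0;
    while start < chars.len() {
        let end = (start + chunk_size).min(chars.len());
        chunks.push(chars[start..end].iter().collect());
        if end == chars.len() {
            break;
        }
        // Step back by `overlap` so adjacent chunks share a margin.
        start = end - overlap;
    }
    chunks
}

fn main() {
    for chunk in chunk_text("abcdefghij", 4, 1) {
        println!("{chunk}"); // prints "abcd", "defg", "ghij"
    }
}
```

Lexical retrieval then scores these chunks against a query, and the prompt injection helpers splice the winners into the model context.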
### harbor-runtime
- agent runtime
- streaming turn API with deterministic final-response reconstruction
- retrieval-aware turn execution
- lifecycle task primitives
- idempotent task enqueue / dedupe foundations
- in-memory + file-backed task stores
- versioned task store manifest + legacy migration seam
- lease-based background task runner
- workflow engine
- reusable execution context
- shared `HarborApp` bootstrap entrypoint
- signal-driven shutdown wiring
- app-level observability integration
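The idempotent enqueue/dedupe foundation can be sketched like this, with a hypothetical `TaskQueue` type keyed by an idempotency key:

```rust
use std::collections::HashSet;

// Illustrative task queue: re-enqueueing a task with an already-seen
// idempotency key is a no-op, so retries and duplicate submissions are safe.
#[derive(Default)]
struct TaskQueue {
    seen: HashSet<String>,
    pending: Vec<String>,
}

impl TaskQueue {
    // Returns true if the task was newly enqueued, false if deduped.
    fn enqueue(&mut self, idempotency_key: &str) -> bool {
        if self.seen.insert(idempotency_key.to_string()) {
            self.pending.push(idempotency_key.to_string());
            true
        } else {
            false
        }
    }
}

fn main() {
    let mut queue = TaskQueue::default();
    queue.enqueue("send-welcome-email:user-42");
    queue.enqueue("send-welcome-email:user-42"); // duplicate, deduped
    println!("{} task(s) pending", queue.pending.len()); // prints "1 task(s) pending"
}
```

A lease-based runner would then claim pending tasks with an expiring lease so a crashed worker's tasks become claimable again.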
### harbor-mcp
- JSON-RPC + MCP-inspired protocol types
- stdio framing (`Content-Length`)
- MCP server builder
- spawned stdio client transport
- HTTP transport for remote MCP integration
- local + HTTP integration clients
- resource + prompt endpoints
- capability reporting
- outbound trace-context injection for MCP HTTP calls
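The `Content-Length` framing is the LSP-style convention: a byte-length header, a blank line, then the JSON body. A minimal sketch of writing and reading one frame (function names are illustrative, not Harbor's API):

```rust
use std::io::{BufRead, BufReader, Cursor, Read, Write};

// Write one frame: "Content-Length: N\r\n\r\n" followed by N bytes of body.
fn write_frame(out: &mut impl Write, body: &str) -> std::io::Result<()> {
    write!(out, "Content-Length: {}\r\n\r\n{}", body.len(), body)
}

// Read one frame: scan headers until the blank line, then read exactly
// the announced number of body bytes.
fn read_frame(input: &mut impl BufRead) -> std::io::Result<String> {
    let mut len = 0usize;
    loop {
        let mut line = String::new();
        input.read_line(&mut line)?;
        let line = line.trim_end();
        if line.is_empty() {
            break; // blank line ends the header section
        }
        if let Some(value) = line.strip_prefix("Content-Length:") {
            len = value.trim().parse().map_err(|e| {
                std::io::Error::new(std::io::ErrorKind::InvalidData, e)
            })?;
        }
    }
    let mut body = vec![0u8; len];
    input.read_exact(&mut body)?;
    String::from_utf8(body)
        .map_err(|e| std::io::Error::new(std::io::ErrorKind::InvalidData, e))
}

fn main() -> std::io::Result<()> {
    let mut buf = Vec::new();
    write_frame(&mut buf, r#"{"jsonrpc":"2.0","id":1,"method":"ping"}"#)?;
    let mut reader = BufReader::new(Cursor::new(buf));
    println!("{}", read_frame(&mut reader)?);
    Ok(())
}
```

Note that the length counts bytes, not characters, which matters once bodies contain non-ASCII text.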
### harbor-http
- Axum-based HTTP ops surface: `/healthcheck`, `/readycheck`, and `/metrics`
- request ID propagation via `x-request-id`
- request logging middleware
- configurable timeout/auth/rate-limit middleware for app routes
- env-driven HTTP config
- graceful shutdown hook support
### harbor-observability
- tracing/log bootstrap
- Prometheus recorder setup
- OTEL trace exporter bootstrap
- metrics renderer for the HTTP surface
### harbor-cli
- `new` command to scaffold a new AI solution
- `doctor` command to explain workspace capabilities
- production-readiness visibility for persistence/task foundations
## Repository layout

```text
harbor/
  crates/
    harbor-core/
    harbor-ai/
    harbor-memory/
    harbor-rag/
    harbor-runtime/
    harbor-mcp/
    harbor-http/
    harbor-observability/
    harbor-cli/
  docs/
    ARCHITECTURE.md
    ROADMAP.md
```
## Requirements

- Rust 1.86+
- `rust-toolchain.toml` is included so contributors land on a compatible toolchain by default
## Examples

Run the hello agent:

```shell
cargo run -p harbor-runtime --example hello_agent
```

Run the streaming agent:

```shell
cargo run -p harbor-runtime --example streaming_agent
```

Streaming events carry stable `run_id` / `stream_id` metadata plus monotonic sequence numbers.
Delta events also expose byte offsets, and finished events always include the deterministically reconstructed final response text.
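One way such deterministic reconstruction can work, assuming hypothetical event fields (`seq`, `offset`, `text`) rather than Harbor's actual schema:

```rust
// Illustrative delta event: a monotonic sequence number plus the byte
// offset of this fragment within the final response text.
struct DeltaEvent {
    seq: u64,
    offset: usize,
    text: String,
}

fn reconstruct(mut deltas: Vec<DeltaEvent>) -> String {
    // Ordering by sequence number makes reconstruction deterministic
    // even if events are observed out of order.
    deltas.sort_by_key(|d| d.seq);
    let mut out = String::new();
    for d in deltas {
        // With contiguous deltas each offset equals the current length;
        // truncating first guards against replayed or overlapping deltas.
        out.truncate(d.offset);
        out.push_str(&d.text);
    }
    out
}

fn main() {
    // Events arrive out of order, yet the result is stable.
    let deltas = vec![
        DeltaEvent { seq: 2, offset: 5, text: " world".to_string() },
        DeltaEvent { seq: 1, offset: 0, text: "Hello".to_string() },
    ];
    println!("{}", reconstruct(deltas)); // prints "Hello world"
}
```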
Further examples:

```shell
cargo run -p harbor-runtime --example retrieval_agent
cargo run -p harbor-mcp --example echo_stdio_server
cargo run -p harbor-mcp --example http_server
cargo run -p harbor-http --example minimal_server
cargo run -p harbor-runtime --example bootstrap_http
```

The `bootstrap_http` example boots Harbor with:
- `/healthcheck`, `/readycheck`, and `/metrics`
- request ID propagation via `x-request-id`
- incoming `traceparent` extraction when OTEL is enabled
- request logging middleware
- optional timeout/auth/rate-limit middleware for app routes
- signal-driven shutdown
- env-driven tracing/logging + metrics bootstrap
Optional OTEL envs:

```shell
HARBOR_OTEL_ENABLED=true
HARBOR_OTEL_ENDPOINT=http://127.0.0.1:4317
```
## Scaffolding

```shell
cargo run -p harbor-cli -- new my-ai-app --with-mcp-server
```

## Design principles

- AI-first: providers, prompts, tools, memory, and workflows are first-class.
- MCP-native: easy to create MCP servers and integrate external MCP servers.
- Reusable: common building blocks for real AI products, not one-off demos.
- Rust-first: strongly typed, composable, and suitable for production-grade services.
## CI

GitHub Actions now runs:

```shell
cargo check --workspace --all-targets
cargo test --workspace --no-run
```
## Roadmap

- typed tool schemas via derive macros
- richer observability hooks
- Anthropic native streaming adapter
- Redis / Postgres state backends
- vector retrieval backends
- deployment templates for containerized AI services