
# Spectra

Minimal, ultra-fast, multi-language AI agent framework with a Rust core.



A construction kit, not a pre-built house: Spectra ships only the primitives developers need to build anything beyond the core without fighting the framework.

All SDKs (Rust, TypeScript, Python) are thin bindings over the same Rust core with identical behavior across languages.

## ✨ Key Features

- **Rust Core**: Zero-cost abstractions, memory safety, native performance. No `unsafe` in core logic (FFI boundaries only).
- **Streaming-First**: Every LLM provider streams via SSE by default. Event-driven architecture with real-time updates.
- **Multi-Language SDKs**: Rust, TypeScript (via napi-rs), and Python (via PyO3). Same API surface, same behavior.
- **Provider Abstraction**: A single `LlmClient` trait. Built-in support for Anthropic, OpenAI, OpenRouter, and Groq.
- **Tool System**: Trait-based with concurrent dispatch. A fluent `ToolBuilder` for ergonomic construction.
- **Agent Loop**: Multi-turn with automatic tool dispatch, delta accumulation, and event streaming.
- **Extension Hooks**: Before/after tool calls plus agent/turn lifecycle hooks. Composable middleware pattern.
- **No OpenSSL**: Pure Rust TLS via rustls. No C dependencies.
- **Typed Errors**: miette diagnostics with helpful messages across all error variants.
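
To make the streaming-first, delta-accumulation idea concrete, here is a minimal Python sketch of consuming a stream of content deltas. The event names are hypothetical stand-ins for the SDK's `StreamEvent`/`ContentDelta` types, not the actual API:

```python
import asyncio
from dataclasses import dataclass

# Hypothetical event types echoing StreamEvent; the real SDK's event
# classes may differ in name and shape.
@dataclass
class MessageUpdate:
    delta: str

@dataclass
class MessageEnd:
    pass

async def fake_stream():
    # Stands in for an SSE-backed LLM stream: deltas arrive incrementally.
    for chunk in ["Hel", "lo", "!"]:
        yield MessageUpdate(delta=chunk)
    yield MessageEnd()

async def main():
    text = ""
    async for event in fake_stream():
        if isinstance(event, MessageUpdate):
            text += event.delta  # accumulate deltas as they stream in
    return text

print(asyncio.run(main()))  # Hello!
```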

πŸ› οΈ Technology Stack

Component Technologies
Core Rust Tokio
HTTP Reqwest rustls Β· SSE streaming
TypeScript napi-rs Zod
Python PyO3 Pydantic maturin
Tooling Turborepo pnpm Β· cargo-nextest

πŸ—οΈ Project Structure

spectra/
β”œβ”€β”€ packages/
β”‚   └── core/                  # spectra-core β€” Rust core library
β”‚       β”œβ”€β”€ src/agent.rs       # Agent orchestrator (multi-turn loop)
β”‚       β”œβ”€β”€ src/llm.rs         # LlmClient trait, Model, Provider
β”‚       β”œβ”€β”€ src/tool.rs        # Tool trait, ToolRegistry, ToolBuilder
β”‚       β”œβ”€β”€ src/event.rs       # StreamEvent, ContentDelta, EventChannel
β”‚       β”œβ”€β”€ src/messages.rs    # Message types (User/Assistant/ToolResult)
β”‚       └── src/error.rs       # SpectraError with miette diagnostics
β”œβ”€β”€ crates/
β”‚   β”œβ”€β”€ spectra-http/          # HTTP LLM provider clients
β”‚   β”‚   β”œβ”€β”€ src/anthropic.rs   # Anthropic Messages API + SSE streaming
β”‚   β”‚   └── src/openai.rs      # OpenAI Chat Completions + SSE streaming
β”‚   β”œβ”€β”€ spectra-rs/            # Rust SDK (re-exports + builder)
β”‚   β”‚   β”œβ”€β”€ src/extension.rs   # Extension hooks (before/after lifecycle)
β”‚   β”‚   └── models.toml        # Built-in model definitions
β”‚   β”œβ”€β”€ spectra-napi/          # TypeScript bindings (napi-rs)
β”‚   └── spectra-pyo3/          # Python bindings (PyO3)
β”œβ”€β”€ packages/
β”‚   β”œβ”€β”€ spectra-ts/            # TypeScript SDK
β”‚   └── spectra-py/            # Python SDK
└── .github/workflows/        # CI/CD (Rust, TS, Python, Release)

## 🏁 Getting Started

### Prerequisites

- Rust 1.85+ (edition 2024)
- Node.js 18+ and pnpm 9+ (for the TypeScript SDK)
- Python 3.11+ (for the Python SDK)

### Rust

```toml
[dependencies]
spectra-rs = "0.2"
```

```rust
use spectra_rs::prelude::*;
use spectra_http::OpenAIClient;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = OpenAIClient::from_env()?;

    let agent = AgentBuilder::new()
        .model(Model::openai("gpt-4o"))
        .system_prompt("You are a helpful assistant.")
        .build(client);

    let (mut rx, _channel) = agent.run("Hello!".to_string()).await?;

    while let Some(event) = rx.recv().await {
        println!("{:?}", event?);
    }

    Ok(())
}
```

### TypeScript

```sh
pnpm add @spectra/sdk
```

```ts
import { Agent, anthropic, defineTool } from "@spectra/sdk";
import { z } from "zod";

const searchTool = defineTool({
  name: "search",
  description: "Search the web",
  schema: z.object({ query: z.string() }),
});

const agent = new Agent({
  model: anthropic("claude-sonnet-4-5"),
  systemPrompt: "You are a helpful assistant.",
  tools: [searchTool],
});

for await (const event of agent.prompt("What is Rust?")) {
  if (event.type === "message_update") {
    process.stdout.write(event.delta.delta ?? "");
  }
}
```

### Python

```sh
pip install spectra-sdk
```

```python
import asyncio

from spectra import Agent

agent = Agent({
    "model": {"provider": "openai", "id": "gpt-4o"},
    "system_prompt": "You are a helpful assistant.",
    "tools": [],
})

async def main():
    async for event in agent.prompt("Hello!"):
        print(event)

asyncio.run(main())
```
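
The TypeScript example above defines a tool with `defineTool`; the Python tool API isn't shown here, so the following is a hedged, stdlib-only sketch of what a tool with a JSON-Schema argument description could look like. The class shape and `execute` signature are assumptions, not the real SDK surface:

```python
import asyncio
import json

# Hypothetical tool sketch; the actual spectra Python API may expose a
# different shape for tools.
class SearchTool:
    name = "search"
    description = "Search the web"
    # JSON Schema for the tool's arguments, analogous to the Zod schema
    # used in the TypeScript example.
    schema = {
        "type": "object",
        "properties": {"query": {"type": "string"}},
        "required": ["query"],
    }

    async def execute(self, args: dict) -> str:
        # A real tool would call a search backend here.
        return json.dumps({"results": [f"result for {args['query']}"]})

async def main():
    tool = SearchTool()
    return await tool.execute({"query": "rust"})

print(asyncio.run(main()))
```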

## 🔌 Supported Providers

| Provider   | Class / Model         | Streaming | Tool Use            | Custom Base URL |
| ---------- | --------------------- | --------- | ------------------- | --------------- |
| Anthropic  | `AnthropicClient`     | ✅ SSE    | ✅                  | ✅              |
| OpenAI     | `OpenAIClient`        | ✅ SSE    | ✅ Function calling | ✅              |
| OpenRouter | `OpenAIClient`        | ✅ SSE    | ✅                  | ✅ (default)    |
| Groq       | `OpenAIClient`        | ✅ SSE    | ✅                  | ✅              |
| Custom     | Implement `LlmClient` | ✅        | ✅                  | n/a             |
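
The "Custom" row means any backend can be plugged in by implementing `LlmClient`. As a hedged illustration in Python terms (the real trait lives in Rust, with `complete` and `stream` methods), a mock provider might look like this:

```python
import asyncio
from typing import AsyncIterator, Protocol

# Sketch of the LlmClient abstraction as a Python Protocol; names follow
# the table above, but this is not the real binding API.
class LlmClient(Protocol):
    async def complete(self, prompt: str) -> str: ...
    def stream(self, prompt: str) -> AsyncIterator[str]: ...

class MockClient:
    """A stand-in provider, handy for tests instead of a real API."""

    async def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"

    async def stream(self, prompt: str) -> AsyncIterator[str]:
        for token in ("echo:", " ", prompt):
            yield token

async def main():
    client: LlmClient = MockClient()
    full = await client.complete("hi")
    streamed = "".join([t async for t in client.stream("hi")])
    assert full == streamed  # streaming and completion agree
    return full

print(asyncio.run(main()))
```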

## 🎯 Architecture

```
┌──────────────┐  ┌─────────────┐  ┌─────────────┐
│  TypeScript  │  │   Python    │  │    Rust     │
│ @spectra/sdk │  │ spectra-sdk │  │ spectra-rs  │
└──────┬───────┘  └──────┬──────┘  └──────┬──────┘
       │ napi-rs         │ PyO3           │ native
       └──────────┬──────┴────────────────┘
                  │
         ┌────────┴────────┐
         │  spectra-core   │  Agent · LlmClient · Tool · Event
         └────────┬────────┘
                  │
         ┌────────┴────────┐
         │  spectra-http   │  AnthropicClient · OpenAIClient
         └─────────────────┘
```

Every SDK is a thin binding over the same Rust core. The Agent loop, Tool dispatch, StreamEvent emission, and error handling are identical regardless of language.
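
As an illustration of that shared loop, here is a hedged Python sketch of a multi-turn agent loop with tool dispatch and event emission. All names here are invented for the sketch; the real loop lives in Rust in spectra-core:

```python
import asyncio
from typing import Any

# Illustrative agent loop: run LLM turns, dispatch any requested tool,
# and emit lifecycle events along the way.
async def agent_loop(llm, tools: dict, prompt: str, max_turns: int = 8):
    messages: list[dict[str, Any]] = [{"role": "user", "content": prompt}]
    for _ in range(max_turns):
        yield {"type": "turn_start"}
        reply = await llm(messages)
        messages.append({"role": "assistant", **reply})
        if "tool_call" not in reply:
            yield {"type": "turn_end"}
            break  # no tool requested: the agent is done
        call = reply["tool_call"]
        yield {"type": "tool_execution_start", "name": call["name"]}
        result = await tools[call["name"]](call["args"])
        messages.append({"role": "tool_result", "content": result})
        yield {"type": "tool_execution_end", "name": call["name"]}
        yield {"type": "turn_end"}

# A scripted fake LLM: the first turn requests a tool, the second answers.
async def fake_llm(messages):
    if not any(m["role"] == "tool_result" for m in messages):
        return {"content": "", "tool_call": {"name": "add", "args": (2, 3)}}
    return {"content": "The sum is 5."}

async def add(args):
    return str(args[0] + args[1])

async def main():
    events = [e async for e in agent_loop(fake_llm, {"add": add}, "2+3?")]
    return [e["type"] for e in events]

print(asyncio.run(main()))
```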

## 📖 API Surface

### Core Traits

| Trait       | Purpose                                                             |
| ----------- | ------------------------------------------------------------------- |
| `LlmClient` | LLM provider abstraction (`complete`, `stream`)                     |
| `Tool`      | Tool definition + execution (`definition`, `execute`)               |
| `Extension` | Lifecycle hooks (`on_before_tool_call`, `on_after_tool_call`, ...)  |
| `EventSink` | Event consumption                                                   |
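
To show how the `Extension` hooks compose as middleware around tool execution, here is a hedged Python sketch. The hook names echo the trait above, but the dispatch function and class shape are assumptions for illustration:

```python
import asyncio

# Sketch of a logging extension; hook names mirror the Extension trait.
class LoggingExtension:
    def __init__(self):
        self.log = []

    async def on_before_tool_call(self, name, args):
        self.log.append(f"before:{name}")

    async def on_after_tool_call(self, name, result):
        self.log.append(f"after:{name}")

async def dispatch(tool_name, tool_fn, args, extensions):
    # Hooks wrap the actual tool execution, middleware-style.
    for ext in extensions:
        await ext.on_before_tool_call(tool_name, args)
    result = await tool_fn(args)
    for ext in extensions:
        await ext.on_after_tool_call(tool_name, result)
    return result

async def echo(args):
    return args

async def main():
    ext = LoggingExtension()
    out = await dispatch("echo", echo, "hi", [ext])
    return out, ext.log

print(asyncio.run(main()))
```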

### Agent Events

| Event                               | When                                    |
| ----------------------------------- | --------------------------------------- |
| `AgentStart`                        | Agent begins processing                 |
| `TurnStart`                         | New LLM turn begins                     |
| `MessageStart`                      | LLM response starts                     |
| `MessageUpdate`                     | Content delta (text or tool call)       |
| `MessageEnd`                        | LLM response complete                   |
| `TurnEnd`                           | Turn complete (may include tool results)|
| `ToolExecutionStart`/`Update`/`End` | Tool dispatch lifecycle                 |
| `AgentEnd`                          | Agent processing complete               |
| `Error`                             | Something went wrong                    |

## ⚠️ Constraints

- **Zero `unsafe` policy**: No `unsafe` in core logic; FFI boundaries only.
- **No OpenSSL**: rustls only. No C dependencies.
- **Release profile**: opt-level 3, thin LTO, codegen-units 1, panic=abort.
- **Edition 2024**: Requires Rust 1.85+.
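
The release-profile constraint corresponds to a Cargo profile roughly like the following. This is a sketch derived from the bullet above; the repository's actual `Cargo.toml` may differ in detail:

```toml
[profile.release]
opt-level = 3
lto = "thin"
codegen-units = 1
panic = "abort"
```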

## 🧪 Testing

```sh
# Rust tests
cargo test --workspace

# TypeScript tests
cd packages/spectra-ts && pnpm test

# Integration tests (wiremock)
cargo test -p spectra-http
```

## 📦 Building from Source

```sh
# Clone
git clone https://github.com/codex-mohan/spectra.git
cd spectra

# Build Rust core
cargo build --release

# Build TypeScript SDK
cd packages/spectra-ts
cargo build --release --package spectra-napi
pnpm install
pnpm build

# Build Python SDK
cd ../spectra-py
maturin develop --release
```

If you found this helpful, please consider giving it a ⭐
