An async-first Rust framework for building agentic AI systems.
- Provider-agnostic — swap OpenAI, Anthropic, or any custom LLM without changing agent logic
- Type-safe tools — implement `Tool` with typed `Input`/`Output`, or use the `#[tool]` macro
- ReAct loop — built-in reason-act-observe agent with configurable guardrails
- Streaming — SSE streaming support for both OpenAI and Anthropic
- Composable — agents, tools, and memory backends are all trait-based and swappable
- Async-first — built on Tokio with zero blocking I/O
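The type-safe tool design above can be illustrated in isolation. The sketch below is a simplified, self-contained version of the pattern (an associated-type trait), not relay's actual `Tool` API; the trait shape and `Calculator` type here are illustrative assumptions.

```rust
// Simplified sketch of a trait-based tool with typed input/output
// (illustrative only, not relay's real `Tool` trait): each tool declares
// its own Input and Output types, so mismatched calls fail at compile time.
trait Tool {
    type Input;
    type Output;
    fn name(&self) -> &'static str;
    fn call(&self, input: Self::Input) -> Self::Output;
}

// A toy calculator tool: input is a pair of operands, output is their product.
struct Calculator;

impl Tool for Calculator {
    type Input = (f64, f64);
    type Output = f64;
    fn name(&self) -> &'static str {
        "calculator"
    }
    fn call(&self, (a, b): Self::Input) -> Self::Output {
        a * b
    }
}

fn main() {
    let calc = Calculator;
    // The compiler enforces that we pass a (f64, f64) and get back an f64.
    println!("{} -> {}", calc.name(), calc.call((15.0, 23.0)));
}
```

Because `Input` and `Output` are associated types rather than `serde_json::Value` blobs, a wrong argument shape is a compile error instead of a runtime failure.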
| Crate | Description |
|---|---|
| `relay-core` | Traits, types, error types — no I/O |
| `relay-providers` | OpenAI and Anthropic adapters + mock |
| `relay-agents` | ReAct reasoning loop and `AgentBuilder` |
| `relay-tools` | Built-in tools: calculator, datetime, HTTP |
| `relay-memory` | In-memory key-value and episodic memory |
| `relay-macros` | `#[tool]` proc macro |
| `relay-workflow` | Multi-agent orchestration (WIP) |
| `relay` | Unified re-export crate |
```toml
[dependencies]
relay = { path = "relay", features = ["openai"] }
```

```rust
use relay::prelude::*;
use relay::AgentBuilder;
use relay_providers::openai::OpenAiProvider;
use relay_tools::CalculatorTool;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    dotenvy::dotenv().ok();

    let agent = AgentBuilder::new()
        .name("assistant")
        .system_prompt("You are a helpful assistant.")
        .provider(OpenAiProvider::from_env()?)
        .tool(CalculatorTool)
        .build()?;

    let output = agent.run("What is 15 * 23?").await?;
    println!("{}", output.answer.unwrap_or_default());
    Ok(())
}
```

Add your API key to a `.env` file at the workspace root:
```dotenv
OPENAI_API_KEY=sk-...
# ANTHROPIC_API_KEY=sk-ant-...
```

```sh
# openai (default)
cargo run --example live_agent --all-features

# anthropic
cargo run --example live_agent --all-features -- --provider anthropic

# custom prompt
cargo run --example live_agent --all-features -- "What day is it today?"

# mock provider (no key needed)
cargo run --example basic_agent
```

Run the test suite with `cargo test --all-features`.

Licensed under MIT OR Apache-2.0.