Build production-ready AI agents without rewriting your stack.
Composable. Deterministic. Inspectable.
Most agent frameworks are:
- Easy to prototype, hard to maintain
- Powerful, but structurally chaotic
- Difficult to debug
- Opaque in production
As systems grow, orchestration becomes fragile.
Cogent gives you:
- ✅ Structured agent orchestration
- ✅ Deterministic execution model
- ✅ First-class trace & audit
- ✅ Composable multi-agent workflows
- ✅ Built-in LiteLLM support (100+ models)
You can start simple and scale without rewriting everything.
## Installation

```bash
uv venv
source .venv/bin/activate
uv pip install -e ".[dev]"
```

## Quick Start

```python
from cogent.agents.react import ReactAgent
from cogent.kernel import ToolPort, ToolCall, Result

class CalculatorTools(ToolPort):
    async def call(self, state, call: ToolCall) -> Result:
        if call.name == "add":
            return Result(state, value=str(sum(call.args.values())))
        return Result(state, value=f"Unknown tool: {call.name}")

agent = ReactAgent(model="anthropic/claude-sonnet-4.6", tools=CalculatorTools())
result = await agent.run("What's 2 + 2?")
print(result.value)
```

## Streaming

```python
agent = ReactAgent(model="anthropic/claude-sonnet-4.6")
async for chunk in agent.stream("Explain quantum computing simply"):
    print(chunk, end="")
```

## Multi-Agent Workflows

Cogent supports:
- Agent routing
- Handoff
- Parallel execution
- Concurrent orchestration
See `examples/multi_agent.py` for a complete multi-agent workflow with handoff, routing, and concurrent execution.
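The concurrent-execution pattern can be sketched in plain `asyncio`. This is an illustration of the fan-out/fan-in idea only, not Cogent's API: `research_agent` and `summary_agent` below are hypothetical stand-ins for what would be `ReactAgent` instances in a real workflow.

```python
import asyncio

# Hypothetical stand-in agents; in Cogent these would be ReactAgent instances
# whose .run() coroutines are awaited the same way.
async def research_agent(query: str) -> str:
    await asyncio.sleep(0.01)  # simulate a model call
    return f"research: {query}"

async def summary_agent(query: str) -> str:
    await asyncio.sleep(0.01)
    return f"summary: {query}"

async def main() -> list[str]:
    # Concurrent orchestration: both agents run at once,
    # and gather() returns their results in call order.
    return await asyncio.gather(
        research_agent("market data"),
        summary_agent("market data"),
    )

results = asyncio.run(main())
print(results)  # ['research: market data', 'summary: market data']
```

Routing and handoff build on the same primitive: pick which coroutine(s) to await based on the task, then pass one agent's result as the next agent's input.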
## Structured Output

```python
from dataclasses import dataclass

from cogent import Agent
from cogent.structured import CallableSchema, PydanticSchema

@dataclass
class UserProfile:
    name: str
    email: str

def parse_profile(data: dict) -> UserProfile:
    return UserProfile(name=data["name"], email=data["email"])

agent = Agent.start('{"name": "Alice", "email": "alice@example.com"}')
agent = agent.cast(CallableSchema(parse_profile))
result = await agent.run("state", env)
assert isinstance(result.value, UserProfile)
```

## Tracing

```python
from cogent.agents.react import ReactAgent, ReActState

agent = ReactAgent(model="anthropic/claude-sonnet-4.6", trace=True)
result = await agent.run("Analyze this market data")

# Inspect the trace
if result.trace:
    for event in result.trace._events:
        print(f"{event.action}: {event.info}")
```

## Composing Workflows

```python
from cogent import Agent
from cogent.kernel import Result, Control

async def step1(state, value, env):
    return Result(state, value=value + " processed", control=Control.Continue())

async def step2(state, value, env):
    return Result(state, value=value + " finalized", control=Control.Continue())

workflow = Agent.start("initial").then(step1).then(step2)
result = await workflow.run("state", env)
print(result.value)  # "initial processed finalized"
```

## Model Support

Cogent has built-in LiteLLM integration supporting 100+ models (OpenAI, Anthropic, OpenRouter, etc.). Set any supported API key:

```bash
export OPENROUTER_API_KEY="your-key"  # OpenRouter
export ANTHROPIC_API_KEY="your-key"   # Anthropic
export OPENAI_API_KEY="your-key"      # OpenAI
```

## Custom Providers

Cogent is designed to be extensible. Implement your own provider by subclassing `FormatterBase`:

```python
from cogent.providers.base import FormatterBase
from cogent.model import Message

class CustomProvider(FormatterBase):
    support_tools_api = True
    support_vision = False

    async def format(self, messages: list[Message]) -> list[dict]:
        # Convert Cogent messages to your provider's format
        return [{"role": msg.role, "content": msg.content} for msg in messages]
```

## Development

- Type checking: `pyright`
- Linting: `ruff`
- Tests: `pytest`

```bash
pyright
ruff check src tests
pytest
```

For a complete understanding of the architectural design and principles, read DESIGN.md.