axio


Minimal, streaming-first, protocol-driven foundation for LLM-powered agents.

One dependency (pydantic). Three protocols. An agent loop that just works.

Features

  • Streaming agent loop — run_stream() yields typed events as they arrive; no buffering, no polling
  • Three clean protocols — CompletionTransport, ContextStore, PermissionGuard; swap any piece without touching the rest
  • Concurrent tool dispatch — all tool calls in a turn run via asyncio.gather automatically
  • Context compaction — compact_context() summarises old history to stay within token limits
  • Testing helpers — StubTransport, make_tool_use_response(), make_echo_tool() ship in axio.testing
  • Plugin-ready — entry-point groups (axio.tools, axio.transport, axio.guards) for drop-in extensions

Installation

pip install axio

Quick start

import asyncio
from axio import Agent
from axio.context import MemoryContextStore
from axio.tool import Tool, ToolHandler

# 1. Define a tool
class Greet(ToolHandler):
    """Return a greeting for the given name."""
    name: str

    async def __call__(self) -> str:
        return f"Hello, {self.name}!"

greet_tool = Tool(name="greet", description="Greet someone by name", handler=Greet)

# 2. Wire up the agent (transport comes from an axio-transport-* package)
from axio_transport_openai import OpenAITransport

transport = OpenAITransport(api_key="sk-...", model="gpt-4o-mini")
agent = Agent(system="You are helpful.", tools=[greet_tool], transport=transport)

# 3. Run
async def main() -> None:
    ctx = MemoryContextStore()
    async for event in agent.run_stream("Please greet Alice", ctx):
        print(event)

asyncio.run(main())

Architecture

  User message
       │
       ▼
  ┌─────────┐   stream()    ┌─────────────────────┐
  │  Agent  │ ────────────▶ │ CompletionTransport  │
  │  loop   │ ◀──────────── │ (OpenAI, Nebius, …)  │
  └─────────┘  StreamEvent  └─────────────────────┘
       │
       │ tool_use?
       ▼
  ┌──────────┐   check()   ┌─────────────────┐
  │   Tool   │ ──────────▶ │ PermissionGuard │
  │ handler  │             │ (path, LLM, …)  │
  └──────────┘             └─────────────────┘
       │
       ▼
  ┌──────────────┐
  │ ContextStore │  append() / get_history() / fork() / compact()
  └──────────────┘
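The flow above can be sketched as a simplified loop. This is a hedged illustration of the shape, not axio's actual implementation; Turn, ToolCall, and run_tool are stand-ins invented here:

```python
import asyncio
from dataclasses import dataclass, field

# Stand-in types (illustrative only; the real axio events are richer).
@dataclass
class ToolCall:
    name: str
    args: dict

@dataclass
class Turn:
    text: str = ""
    tool_calls: list = field(default_factory=list)

async def run_tool(call: ToolCall) -> str:
    # A real loop would dispatch to the registered handler; here we just echo.
    return f"{call.name}({call.args})"

async def agent_loop(turns: list[Turn]) -> list[str]:
    """One iteration per transport round: collect text, then run tools concurrently."""
    history: list[str] = []
    for turn in turns:  # each Turn plays the role of one transport stream() round
        history.append(turn.text)
        if turn.tool_calls:
            # All tool calls in a turn run concurrently, as axio does via asyncio.gather.
            results = await asyncio.gather(*(run_tool(c) for c in turn.tool_calls))
            history.extend(results)
    return history

history = asyncio.run(agent_loop([
    Turn(text="thinking...", tool_calls=[ToolCall("greet", {"name": "Alice"})]),
    Turn(text="Done"),
]))
```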

Protocols

CompletionTransport

from typing import Protocol, runtime_checkable
from collections.abc import AsyncIterator
from axio.events import StreamEvent
from axio.messages import Message
from axio.tool import Tool

@runtime_checkable
class CompletionTransport(Protocol):
    def stream(
        self, messages: list[Message], tools: list[Tool], system: str
    ) -> AsyncIterator[StreamEvent]: ...
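Because this is a structural protocol, any object with a matching stream() method satisfies it. A minimal self-contained sketch, using local stand-ins for Message and StreamEvent (the real types live in axio):

```python
import asyncio
from collections.abc import AsyncIterator
from typing import Protocol, runtime_checkable

# Local stand-ins for axio.messages.Message / axio.events.StreamEvent.
Message = dict
StreamEvent = str

@runtime_checkable
class CompletionTransport(Protocol):
    def stream(
        self, messages: list[Message], tools: list, system: str
    ) -> AsyncIterator[StreamEvent]: ...

class CannedTransport:
    """Replays fixed chunks — the shape a new backend (or a test double) takes."""
    def __init__(self, chunks: list[str]) -> None:
        self.chunks = chunks

    async def stream(self, messages, tools, system) -> AsyncIterator[StreamEvent]:
        for chunk in self.chunks:
            yield chunk

async def main() -> list[str]:
    transport = CannedTransport(["Hel", "lo"])
    assert isinstance(transport, CompletionTransport)  # structural check via @runtime_checkable
    return [event async for event in transport.stream([], [], "")]

events = asyncio.run(main())
```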

ContextStore

from __future__ import annotations
from abc import ABC
from axio.messages import Message

class ContextStore(ABC):
    async def append(self, message: Message) -> None: ...
    async def get_history(self) -> list[Message]: ...
    async def fork(self) -> ContextStore: ...   # branch conversation
    async def clear(self) -> None: ...
    async def close(self) -> None: ...

PermissionGuard

from typing import Protocol, runtime_checkable
from axio.tool import ToolHandler

@runtime_checkable
class PermissionGuard(Protocol):
    async def check(self, handler: ToolHandler) -> ToolHandler: ...
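Since check() receives the handler and returns one, a guard can pass it through, rewrite its arguments, or raise to deny the call. A hedged sketch with a stand-in handler class (not axio's ToolHandler):

```python
import asyncio

class EchoHandler:
    """Stand-in for a ToolHandler instance carrying validated arguments."""
    def __init__(self, text: str) -> None:
        self.text = text

class UppercaseGuard:
    """Toy guard: denies one pattern, otherwise rewrites the argument."""
    async def check(self, handler):
        if "rm -rf" in handler.text:          # toy deny rule
            raise PermissionError("blocked")
        handler.text = handler.text.upper()   # guards may return a modified handler
        return handler

checked = asyncio.run(UppercaseGuard().check(EchoHandler("hi")))
```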

Stream events

Event            Description
TextDelta        Incremental assistant text chunk
ToolUseStart     Tool call begins (name + id)
ToolInputDelta   Streaming JSON fragment for tool arguments
ToolResult       Tool execution result
IterationEnd     One LLM round complete — carries Usage + StopReason
Error            Transport or tool exception
SessionEndEvent  Agent loop finished — carries total Usage

Tools

from axio.tool import Tool, ToolHandler

class Summarise(ToolHandler):
    """Summarise the given text in one sentence."""
    text: str
    max_words: int = 20

    async def __call__(self) -> str:
        # your implementation
        return "..."

tool = Tool(
    name="summarise",
    description="Summarise text",   # overrides docstring if set
    handler=Summarise,
    concurrency=4,                   # max parallel executions
)
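Tool arguments arrive as streamed JSON; once complete, the payload is decoded into a handler instance, which is then awaited. A stand-in sketch of that shape (the real ToolHandler is pydantic-based; this uses a plain dataclass to stay self-contained):

```python
import asyncio
import json
from dataclasses import dataclass

@dataclass
class Summarise:
    """Dataclass stand-in for the pydantic-backed ToolHandler above."""
    text: str
    max_words: int = 20

    async def __call__(self) -> str:
        # Toy implementation: truncate instead of actually summarising.
        return " ".join(self.text.split()[: self.max_words])

# Arguments stream in as JSON fragments; when complete, build and await the handler.
payload = json.loads('{"text": "a quick brown fox jumps", "max_words": 3}')
result = asyncio.run(Summarise(**payload)())
```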

Testing

from axio.testing import (
    StubTransport,
    make_tool_use_response,
    make_text_response,
    make_ephemeral_context,
    make_echo_tool,
)

async def test_agent_calls_tool():
    transport = StubTransport([
        make_tool_use_response("echo", tool_input={"msg": "hi"}),
        make_text_response("Done"),
    ])
    agent = Agent(system="", tools=[make_echo_tool()], transport=transport)
    result = await agent.run("say hi", make_ephemeral_context())
    assert result == "Done"

Plugin entry points

[project.entry-points."axio.tools"]
my_tool = "my_package:MyHandler"

[project.entry-points."axio.transport"]
my_backend = "my_package:MyTransport"

[project.entry-points."axio.guards"]
my_guard = "my_package:MyGuard"

Ecosystem

Package                Purpose
axio-transport-openai  OpenAI-compatible transport
axio-transport-nebius  Nebius AI Studio transport
axio-transport-codex   ChatGPT OAuth transport
axio-tools-local       Shell, file, Python tools
axio-tools-mcp         MCP server bridge
axio-tools-docker      Docker sandbox tools
axio-tui               Textual TUI application
axio-tui-rag           RAG / semantic search plugin
axio-tui-guards        Permission guard plugins

License

MIT
