3 lines of code. Full observability for your A2A agents.
Your A2A agent processes 10 files. It crashes on file 6.
You see: 500 Internal Server Error.
What happened to files 1-5? Where exactly did it fail? What was the last successful state?
```python
from binex_trace import trace

@trace.task("review")
def review_file(path):
    trace.log("analyzing", path=path)
    result = llm.review(path)
    trace.checkpoint(result)
    return result
```

Now you get structured JSON on stderr:
{"type": "task_start", "name": "review", "args_repr": "('app.py',)"}
{"type": "log", "message": "analyzing", "path": "app.py"}
{"type": "checkpoint", "label": "checkpoint", "data_preview": "{...}"}
{"type": "task_end", "name": "review", "status": "ok", "duration_s": 3.1}
Orchestrators parse this into trace trees:

```
├─ review(app.py)   ok   3.1s
├─ review(main.py)  ok   4.2s
├─ review(auth.py)  err  timeout
│  └─ checkpoint: "analyzed 120 lines"
├─ review(db.py)    --   not reached
```
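A minimal sketch of how an orchestrator might fold these stderr events into per-task summaries. The event fields (`type`, `name`, `status`, `duration_s`) are taken from the examples above; the grouping logic itself is an illustration, not part of binex-trace:

```python
import json

def parse_trace(lines):
    """Group binex-trace JSON-lines events into per-task summaries."""
    tasks = []
    current = None
    for line in lines:
        try:
            event = json.loads(line)
        except json.JSONDecodeError:
            continue  # ignore ordinary (non-JSON) stderr output
        etype = event.get("type")
        if etype == "task_start":
            current = {"name": event["name"], "events": [], "status": None}
            tasks.append(current)
        elif etype in ("log", "checkpoint") and current is not None:
            current["events"].append(event)
        elif etype == "task_end" and current is not None:
            current["status"] = event.get("status")
            current["duration_s"] = event.get("duration_s")
            current = None
    return tasks

stderr_lines = [
    '{"type": "task_start", "name": "review", "args_repr": "(\'app.py\',)"}',
    '{"type": "log", "message": "analyzing", "path": "app.py"}',
    '{"type": "task_end", "name": "review", "status": "ok", "duration_s": 3.1}',
]
for task in parse_trace(stderr_lines):
    print(f"├─ {task['name']} {task['status']} {task['duration_s']}s")
```

A real renderer would also nest spans and attach checkpoints, as in the tree above; this only shows the flat grouping step.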
```bash
pip install binex-trace
```

```python
from binex_trace import trace

@trace.task("process")
def process(data):
    trace.log("started", size=len(data))
    result = do_work(data)
    trace.checkpoint(result)
    return result
```

```python
with trace.span("download") as s:
    data = download(url)
    s.log("downloaded", size=len(data))
    s.checkpoint(data)
```

```python
trace.metric("tokens_used", 1523)
trace.metric("latency_ms", 340.5, model="claude-3")
```

```python
@trace.task("fetch")
async def fetch(url):
    trace.log("fetching", url=url)
    return await client.get(url)

async with trace.span("batch") as s:
    results = await asyncio.gather(*tasks)
    s.log("done", count=len(results))
```

| Method | Description |
|---|---|
| `@trace.task(name)` | Decorator. Traces a sync/async function as a named task. |
| `trace.span(name)` | Context manager (sync + async). Groups operations. |
| `trace.log(msg, **kw)` | Emit a structured log event. |
| `trace.checkpoint(data)` | Save recoverable state. |
| `trace.metric(name, value)` | Emit a numeric metric. |
fastapi_a2a.py- A2A server with FastAPIlangchain_agent.py- LangChain agent tracingcrewai_crew.py- CrewAI multi-agent crewclaude_sdk.py- Anthropic SDK with token metricsbash_wrapper.sh- Shell script integration
Any Python code. Any A2A agent. Tested with:
- LangChain, CrewAI, AutoGen
- Anthropic SDK, OpenAI SDK
- FastAPI, Flask
- Binex orchestrator (automatic trace parsing)
- Plain scripts, bash
Emits structured JSON to stderr. Zero dependencies. No collector needed.
Orchestrators parse stderr lines where `_binex_trace` is `true` and build trace trees.
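Since everything arrives on stderr, separating trace events from ordinary log noise is a one-pass filter. A minimal sketch (the `_binex_trace` marker field is described above; the helper name is hypothetical):

```python
import json

def filter_trace_events(stderr_text):
    """Keep only stderr lines that parse as JSON and carry the marker."""
    events = []
    for line in stderr_text.splitlines():
        try:
            event = json.loads(line)
        except json.JSONDecodeError:
            continue  # plain-text log line, not a trace event
        if event.get("_binex_trace"):
            events.append(event)
    return events

mixed = "\n".join([
    "some ordinary log line",
    '{"_binex_trace": true, "type": "log", "message": "analyzing"}',
    '{"unrelated": "json"}',
])
print(filter_trace_events(mixed))
# -> [{'_binex_trace': True, 'type': 'log', 'message': 'analyzing'}]
```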
MIT