TraceRoot is an open-source observability platform for AI agents: capture traces and debug with AI that sees your source code and GitHub history.
| Feature | Description |
|---|---|
| Tracing | Capture LLM calls, agent actions, and tool usage via an OpenTelemetry-compatible SDK. Intelligently surfaces the traces that matter — noise filtered, signal prioritized. |
| Agentic Debugging | AI that sees all your traces, connects to a sandbox with your production source code, identifies the exact failing line, and correlates the failure with your GitHub commits, PRs, and issues. BYOK support for any model provider. |
-
Traces alone don't scale.
As AI agent systems grow more complex, manually sifting through every trace is unsustainable. TraceRoot selectively screens your traces — filtering noise and surfacing only the ones that actually need attention, so you spend time fixing problems, not hunting for them.
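The screening idea above can be sketched in a few lines. This is a toy illustration, not TraceRoot's actual filtering logic — the `Span`/`Trace` types, the error/latency heuristic, and the 2000 ms budget are all assumptions made up for the example:

```python
from dataclasses import dataclass, field

@dataclass
class Span:
    name: str
    duration_ms: float
    error: bool = False

@dataclass
class Trace:
    trace_id: str
    spans: list[Span] = field(default_factory=list)

def needs_attention(trace: Trace, latency_budget_ms: float = 2000.0) -> bool:
    """Flag a trace if any span errored or total latency blew the budget."""
    total = sum(s.duration_ms for s in trace.spans)
    return any(s.error for s in trace.spans) or total > latency_budget_ms

def screen(traces: list[Trace]) -> list[Trace]:
    """Keep only the traces that warrant a closer look."""
    return [t for t in traces if needs_attention(t)]

noisy = Trace("t1", [Span("llm_call", 180.0)])
failing = Trace("t2", [Span("tool_call", 90.0, error=True)])
slow = Trace("t3", [Span("llm_call", 2500.0)])
print([t.trace_id for t in screen([noisy, failing, slow])])  # ['t2', 't3']
```

A real system would score traces on richer signals (retries, hallucination detectors, anomaly models), but the shape is the same: most traces are dropped from view, and only the flagged ones reach you.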
-
Debugging AI agent systems is painful.
Root-causing failures across agent hallucinations, tool-call instabilities, and version changes is hard. TraceRoot's AI connects to a sandbox running your production source code, identifies the exact failing line, cross-references your GitHub history — commits, PRs, and open issues — and opens a PR to fix it.
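The cross-referencing step reduces to a simple question: which commit last touched the failing line? Here is a toy sketch of that lookup — the in-memory `history` list, file names, and shas are hypothetical stand-ins for what a real implementation would pull from `git log`/`git blame`:

```python
# Toy data: newest-first commit history, each with the (file, line) pairs it touched.
# All shas, messages, and files are invented for illustration.
history = [
    ("a1b2c3", "fix retry logic", {("agent.py", 42), ("agent.py", 43)}),
    ("d4e5f6", "add weather tool", {("tools.py", 10)}),
]

def last_commit_touching(file: str, line: int):
    """Walk history newest-first; return the first commit that touched the failing line."""
    for sha, message, touched in history:
        if (file, line) in touched:
            return sha, message
    return None

# If a trace pinpoints agent.py:42 as the failing line, the suspect commit is:
print(last_commit_touching("agent.py", 42))  # ('a1b2c3', 'fix retry logic')
```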
-
Fully open source, no vendor lock-in.
Both the observability platform and the AI debugging layer are open source. BYOK support for any model provider — OpenAI, Anthropic, Gemini, xAI, DeepSeek, OpenRouter, Kimi, GLM and more.
Full documentation available at traceroot.ai/docs.
The fastest way to get started: ample storage and LLM tokens for testing, no credit card needed. Sign up here!
-
Developer mode: Run TraceRoot locally to contribute.
```shell
# Get a copy of the latest repo
git clone https://github.com/traceroot-ai/traceroot.git
cd traceroot

# Host the infra in Docker and run the app itself locally
make dev
```
For more details, see CONTRIBUTING.md.
-
Local docker mode: Run TraceRoot locally to test.
```shell
# Get a copy of the latest repo
git clone https://github.com/traceroot-ai/traceroot.git
cd traceroot

# Host everything in Docker
make prod
```
-
Terraform (AWS): Run TraceRoot on Kubernetes with Helm and Terraform. This is for production hosting and is still experimental.
| Language | Repository |
|---|---|
| Python | traceroot-py |
Install the SDK:

```shell
pip install traceroot openai
```

Add these to the `.env` file in the root directory:

```shell
TRACEROOT_API_KEY="tr-0f29d..."
TRACEROOT_HOST_URL="https://app.traceroot.ai"  # cloud (default)
```

Then instrument your agent:

```python
import traceroot
from traceroot import Integration, observe
from openai import OpenAI

traceroot.initialize(integrations=[Integration.OPENAI])
client = OpenAI()

@observe(name="my_agent", type="agent")
def my_agent(query: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": query}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    my_agent("What's the weather in SF?")
```

Your data security and privacy are our top priorities. Learn more in our Security and Privacy documentation.
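To build intuition for what a decorator like `@observe` does, here is a minimal conceptual stand-in — this is not TraceRoot's implementation, just a sketch of the general pattern (wrap the function, record name/type/duration/errors, export the span):

```python
import functools
import time

# Collected spans; a real SDK would export these via OpenTelemetry instead.
SPANS: list[dict] = []

def observe(name: str, type: str = "function"):
    """Conceptual stand-in for an @observe decorator: records a span per call."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            span = {"name": name, "type": type, "error": None}
            try:
                return fn(*args, **kwargs)
            except Exception as exc:
                span["error"] = repr(exc)  # capture failures without swallowing them
                raise
            finally:
                span["duration_ms"] = (time.perf_counter() - start) * 1000
                SPANS.append(span)
        return wrapper
    return decorator

@observe(name="lookup", type="tool")
def lookup(city: str) -> str:
    return f"Sunny in {city}"

lookup("SF")
print(SPANS[0]["name"], SPANS[0]["type"])  # lookup tool
```

Because the wrapper records the span in a `finally` block, both successful and failing calls show up in the trace — which is exactly what makes failure traces debuggable after the fact.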
Special thanks to the pi-mono project, which powers the foundation of our agentic debugging runtime!
Contributing 🤝: If you're interested in contributing, check out our guide here. All kinds of help are appreciated :)
Support 💬: If you need any kind of support, we're typically most responsive on our Discord channel, but feel free to email us at founders@traceroot.ai too!
This project is licensed under Apache 2.0 with additional Enterprise features.
