
# Brokle – The open source platform for AI teams

DocsQuick StartDiscordIssuesWebsite

Debug, evaluate, and optimize your LLM applications with complete visibility. Open source. OpenTelemetry-native. Self-host anywhere.

## Quick Start

```shell
git clone https://github.com/brokle-ai/brokle.git
cd brokle
make setup && make dev
```

| Service   | URL                   |
| --------- | --------------------- |
| Dashboard | http://localhost:3000 |
| API       | http://localhost:8080 |

Prerequisites: Docker and Docker Compose.

📚 Full setup guide: docs/DEVELOPMENT.md

## SDK Integration

### Python

```shell
pip install brokle
```

```python
import openai

from brokle import Brokle

client = Brokle(api_key="bk_...")

with client.trace("my-agent") as trace:
    response = openai.chat.completions.create(...)
```

### JavaScript/TypeScript

```shell
npm install brokle
```

```typescript
import { Brokle } from 'brokle';

const client = new Brokle({ apiKey: 'bk_...' });

await client.trace('my-agent', async () => {
  const response = await openai.chat.completions.create(...);
});
```

### OpenTelemetry

Brokle accepts standard OTLP traffic, so any OpenTelemetry SDK can export to it directly:

```shell
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:8080
export OTEL_EXPORTER_OTLP_HEADERS="x-api-key=bk_..."
```

## Integrations

| Framework     | Status       | Docs  |
| ------------- | ------------ | ----- |
| OpenAI        | ✅ Native    | Guide |
| Anthropic     | ✅ Native    | Guide |
| LangChain     | ✅ Supported | Guide |
| LlamaIndex    | ✅ Supported | Guide |
| OpenTelemetry | ✅ Native    | Guide |

## Features

### 👁️ Observability

Complete traces of every AI call, with latency, token usage, and cost. Debug chains, agents, and complex pipelines step by step.

### 📊 Evaluation

Automated quality scoring with LLM-as-judge, custom evaluators, and experiments at scale. Define what quality means for your use case.

### 📝 Prompt Management

Version control for prompts with full history. A/B test variations against real traffic and roll back instantly.

## Why Brokle?

- **Open Source** – Transparent, extensible, and community-driven
- **OpenTelemetry Native** – Built on open standards, no vendor lock-in
- **Self-Host Anywhere** – Keep your data on your own infrastructure
- **Unified Platform** – Observe, evaluate, and manage in one tool

## Documentation

## Troubleshooting

**Port 8080 already in use**

```shell
lsof -ti:8080 | xargs kill -9
```

**Docker containers not starting**

```shell
docker-compose down -v
make setup
```

**Database migration errors**

```shell
make migrate-down
make migrate-up
```
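Before killing anything on port 8080, it can help to confirm the port is actually occupied. This small helper is not part of Brokle; it is a standard-library sketch for checking whether something is listening on a local port:

```python
import socket


def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        # connect_ex returns 0 on a successful connection, i.e. a listener exists.
        return s.connect_ex((host, port)) == 0


if __name__ == "__main__":
    print(f"port 8080 in use: {port_in_use(8080)}")
```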

Need help? Join Discord or open a GitHub Issue.

## Contributing

We welcome contributions! See our Contributing Guide to get started.

## License

MIT licensed, except for the ee/ folders. See LICENSE for details.

## Community


If Brokle helps you ship AI, give us a star!
