Doku is an open-source LLMOps tool that gives developers comprehensive capabilities to monitor, analyze, and optimize LLM applications. It provides valuable real-time data on LLM usage, performance, and costs. Through seamless integrations with leading LLM platforms, including OpenAI, Cohere, and Anthropic, Doku acts as a central command center for all your LLM needs. It effectively guides your efforts, ensuring that your LLM applications not only operate at peak efficiency but also scale successfully.
Get advanced monitoring and evaluation for your LLM applications with these key benefits:
- Granular Usage Insights of your LLM Applications: Assess your LLM's performance and costs with fine-grained control, breaking down metrics by environment (such as staging or production) or application, to optimize for efficiency and scalability.
- Real-Time Data Streaming: Unlike other platforms where you might wait minutes to see your data due to data being sent in batches, Doku is able to display data as it streams. This immediate insight enables quick decision-making and adjustments.
- Zero Added Latency: Doku's smart data handling ensures rapid data processing without impacting your application's performance, maintaining the responsiveness of your LLM applications.
- Connect to Observability Platforms: Doku seamlessly connects with leading observability platforms like Grafana Cloud and Datadog, among others, to automatically export data.
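The zero-added-latency claim above reflects a pattern common to monitoring SDKs: telemetry is handed off to a background worker so the application's request path never blocks on network I/O. Doku's internals may differ; the following is a toy Python sketch of that pattern, where `TelemetryQueue` and `record` are illustrative names, not part of Doku:

```python
import queue
import threading

class TelemetryQueue:
    """Toy illustration of non-blocking telemetry: the caller enqueues
    an event and returns immediately, while a background thread does
    the (potentially slow) export. Not Doku's actual implementation."""

    def __init__(self):
        self._events = queue.Queue()
        self._exported = []  # stand-in for a network send to an ingester
        worker = threading.Thread(target=self._drain, daemon=True)
        worker.start()

    def record(self, event):
        # O(1) enqueue: the application's request path is never blocked.
        self._events.put(event)

    def _drain(self):
        while True:
            event = self._events.get()
            self._exported.append(event)
            self._events.task_done()

    def flush(self):
        # Wait until every queued event has been exported.
        self._events.join()

tq = TelemetryQueue()
tq.record({"model": "gpt-3.5-turbo", "tokens": 42})
tq.flush()
print(tq._exported)  # [{'model': 'gpt-3.5-turbo', 'tokens': 42}]
```

The application thread only pays the cost of a queue insert; all export latency lands on the worker thread.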
Integrating the `dokumetry` SDK into LLM applications is straightforward, with SDKs designed for Python and NodeJS. Start monitoring your LLM application with just two lines of code:
For Python

```python
import dokumetry

dokumetry.init(llm=openai, doku_url="YOUR_DOKU_INGESTER_URL", api_key="YOUR_DOKU_TOKEN")
```
For NodeJS

```javascript
import DokuMetry from 'dokumetry';

DokuMetry.init({llm: openai, dokuUrl: "YOUR_DOKU_INGESTER_URL", apiKey: "YOUR_DOKU_TOKEN"})
```
Once the `dokumetry` SDKs are configured in your LLM application, monitoring data starts streaming to the Doku Ingester. It processes and safely stores your data in ClickHouse, keeping your LLM monitoring data secure and compliant in your environment.
You can choose to use a new ClickHouse database setup or connect to your existing one to work with Doku.
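When connecting to an existing ClickHouse instance, the ingester reads its connection details from the `DOKU_DB_*` environment variables used in the Docker Compose setup in this guide. A hedged sketch of such an override, where the hostname and credentials are placeholders you would replace with your own:

```yaml
# Illustrative override for an existing ClickHouse instance.
# Variable names match those consumed in the Docker Compose file
# in this guide; the values below are placeholders.
doku-ingester:
  environment:
    DOKU_DB_HOST: my-clickhouse.internal   # placeholder hostname
    DOKU_DB_PORT: 9000
    DOKU_DB_NAME: default
    DOKU_DB_USER: default
    DOKU_DB_PASSWORD: "********"
```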
With your LLM monitoring data processed and securely stored, you can now leverage the Doku UI for in-depth visualization and analysis. Doku UI allows you to explore LLM costs, token usage, performance metrics, and user interactions in an intuitive interface. This powerful tool enhances your ability to observe and optimize your LLM applications, ensuring you make data-driven decisions for improvement.
For those with a preferred observability platform, you can also integrate and visualize this data elsewhere with ease. This flexibility ensures optimal monitoring workflow integration, regardless of your platform choice. For more details on how to set up these connections, check out the Connections guide.
Jumpstart your journey with Doku by deploying it via our Helm chart, designed to simplify the installation process on any Kubernetes cluster.
To install Doku using Docker, follow these steps:
- Create a `docker-compose.yml` file:

```yaml
version: '3.8'

services:
  clickhouse:
    image: clickhouse/clickhouse-server:24.2.2
    container_name: clickhouse
    environment:
      CLICKHOUSE_PASSWORD: ${DOKU_DB_PASSWORD:-DOKU}
      CLICKHOUSE_USER: ${DOKU_DB_USER:-default}
    volumes:
      - clickhouse-data:/var/lib/clickhouse
    ports:
      - "9000:9000"
      - "8123:8123"
    restart: always

  doku-ingester:
    image: ghcr.io/dokulabs/doku-ingester:latest
    container_name: doku-ingester
    environment:
      DOKU_DB_HOST: clickhouse
      DOKU_DB_PORT: 9000
      DOKU_DB_NAME: ${DOKU_DB_NAME:-default}
      DOKU_DB_USER: ${DOKU_DB_USER:-default}
      DOKU_DB_PASSWORD: ${DOKU_DB_PASSWORD:-DOKU}
    ports:
      - "9044:9044"
    depends_on:
      - clickhouse
    restart: always

  doku-client:
    image: ghcr.io/dokulabs/doku-client:latest
    container_name: doku-client
    environment:
      INIT_DB_HOST: clickhouse
      INIT_DB_PORT: 8123
      INIT_DB_DATABASE: ${DOKU_DB_NAME:-default}
      INIT_DB_USERNAME: ${DOKU_DB_USER:-default}
      INIT_DB_PASSWORD: ${DOKU_DB_PASSWORD:-DOKU}
      SQLITE_DATABASE_URL: file:/app/client/data/data.db
    ports:
      - "3000:3000"
    depends_on:
      - clickhouse
    volumes:
      - doku-client-data:/app/client/data
    restart: always

volumes:
  clickhouse-data:
  doku-client-data:
```
- Start Docker Compose:

```shell
docker-compose up -d
```
To install the Doku Helm chart, follow these steps:
- Add the Doku Helm repository to your Helm setup:

```shell
helm repo add dokulabs https://dokulabs.github.io/helm/
```
- Update your Helm repositories to fetch the latest chart information:

```shell
helm repo update
```
- Install the Doku chart with the release name `doku`:

```shell
helm install doku dokulabs/doku
```
For a detailed list of configurable parameters for the Helm chart, refer to the `values.yaml` file in the Helm chart.
With Doku running, the next step is to access the Doku UI and generate an API key for secure communication between your applications and Doku.
- Open your browser and go to the Doku UI at `127.0.0.1:3000/login`
- Login using the default credentials:
  - Email: `user@dokulabs.com`
  - Password: `dokulabsuser`
- Once you have logged into the Doku UI, go to the API Keys page and create an API Key. Copy the generated API Key.
💡 Tip: Alternatively, you can use the HTTP API to create your Doku API Key. For further details, take a look at the API Reference section.
Choose the appropriate SDK for your LLM application's programming language and follow the steps to integrate monitoring with just two lines of code.
Install the `dokumetry` Python SDK using pip:

```shell
pip install dokumetry
```
Add the following two lines to your application code:

```python
import dokumetry

dokumetry.init(llm=client, doku_url="YOUR_DOKU_INGESTER_URL", api_key="YOUR_DOKU_TOKEN")
```
```python
from openai import OpenAI
import dokumetry

client = OpenAI(
    api_key="YOUR_OPENAI_KEY"
)

# Pass the above `client` object along with your Doku Ingester URL and API key
# so that all OpenAI calls are automatically tracked.
dokumetry.init(llm=client, doku_url="YOUR_DOKU_INGESTER_URL", api_key="YOUR_DOKU_TOKEN")

chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "What is LLM Observability",
        }
    ],
    model="gpt-3.5-turbo",
)
```
Refer to the `dokumetry` Python SDK repository for more advanced configurations and use cases.
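A two-line `init` like the one above typically works by wrapping the client's methods so every call is timed and reported automatically. That is a general instrumentation pattern, not necessarily how `dokumetry` is implemented; here is a minimal stdlib-only sketch, where `instrument`, `records`, and `FakeClient` are invented for illustration:

```python
import functools
import time

records = []  # stand-in for metrics sent to an ingester

def instrument(client, method_name):
    """Wrap a client method so each call's latency is recorded."""
    original = getattr(client, method_name)

    @functools.wraps(original)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = original(*args, **kwargs)  # the real call, unchanged
        records.append({
            "method": method_name,
            "duration_s": time.perf_counter() - start,
        })
        return result

    setattr(client, method_name, wrapper)

class FakeClient:  # invented stand-in for an LLM client
    def complete(self, prompt):
        return f"echo: {prompt}"

client = FakeClient()
instrument(client, "complete")
print(client.complete("hi"))  # echo: hi
print(records[0]["method"])   # complete
```

Because the wrapper returns the original result untouched, the application code calling the client does not change at all, which is what makes the "just two lines" integration possible.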
Install the `dokumetry` NodeJS SDK using npm:

```shell
npm install dokumetry
```
Add the following two lines to your application code:

```javascript
import DokuMetry from 'dokumetry';

DokuMetry.init({llm: openai, dokuUrl: "YOUR_DOKU_INGESTER_URL", apiKey: "YOUR_DOKU_TOKEN"})
```
```javascript
import OpenAI from 'openai';
import DokuMetry from 'dokumetry';

const openai = new OpenAI({
  apiKey: 'My API Key', // defaults to process.env["OPENAI_API_KEY"]
});

// Pass the above `openai` object along with your Doku Ingester URL and API key
// so that all OpenAI calls are automatically tracked.
DokuMetry.init({llm: openai, dokuUrl: "YOUR_DOKU_INGESTER_URL", apiKey: "YOUR_DOKU_TOKEN"})

async function main() {
  const chatCompletion = await openai.chat.completions.create({
    messages: [{ role: 'user', content: 'What are the keys to effective observability?' }],
    model: 'gpt-3.5-turbo',
  });
}

main();
```
Refer to the `dokumetry` NodeJS SDK repository for more advanced configurations and use cases.
Once you have the Doku Ingester and `dokumetry` SDKs set up, you can instantly get insights into how your LLM applications are performing in the Doku Client UI. Just head over to `127.0.0.1:3000` in your browser to start exploring.
With Doku, you get a simple, powerful view into important information like how much you're spending on LLMs, which parts of your app are using them the most, and how well they're performing. Find out which LLM models are favorites among your applications, and dive deep into performance details to make smart decisions. This setup is perfect for optimizing your app performance and keeping an eye on costs.
Doku uses a key-based authentication mechanism to ensure the security of your data, and since Doku is self-hosted, the data stays within your environment.
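Key-based authentication means every request to the ingester carries your Doku API key, which the server verifies before accepting data. As a general security illustration (not Doku's actual code), server-side key checks should use a constant-time comparison so response timing leaks nothing about the key:

```python
import hmac

def key_is_valid(presented_key: str, stored_key: str) -> bool:
    # hmac.compare_digest compares in constant time, so an attacker
    # cannot recover the key byte-by-byte from timing differences.
    return hmac.compare_digest(presented_key.encode(), stored_key.encode())

print(key_is_valid("doku-secret", "doku-secret"))  # True
print(key_is_valid("wrong-guess", "doku-secret"))  # False
```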
We welcome contributions to the Doku project. Please refer to CONTRIBUTING for detailed guidelines on how you can participate.
Doku is available under the Apache-2.0 license.
For support, issues, or feature requests, submit an issue through the GitHub issues associated with this repository.
Join us on this voyage to reshape the future of AI Observability. Share your thoughts, suggest features, and explore contributions. Engage with us on GitHub and be part of Doku's community-led innovation.