From 1f9c4cbf24354ab34c705af6000d4889310912b1 Mon Sep 17 00:00:00 2001
From: Vincent Koc
Date: Wed, 1 Oct 2025 17:45:13 -0700
Subject: [PATCH 1/2] Update opentelemetry.mdx

---
 server/utilities/opentelemetry.mdx | 35 ++++++++++++++++++++++++++++++
 1 file changed, 35 insertions(+)

diff --git a/server/utilities/opentelemetry.mdx b/server/utilities/opentelemetry.mdx
index 2779851..d5ccb07 100644
--- a/server/utilities/opentelemetry.mdx
+++ b/server/utilities/opentelemetry.mdx
@@ -151,6 +151,41 @@ exporter = OTLPSpanExporter(
 
 See our [Langfuse example](https://github.com/pipecat-ai/pipecat-examples/tree/main/open-telemetry/langfuse) for details on configuring this exporter.
 
+### Opik (LLM observability)
+
+[Opik](https://www.comet.com/opik/) is an observability, evaluation, and optimization platform for LLM and agent workloads. Pipecat's span hierarchy maps directly onto Opik's trace explorer, allowing you to replay conversations, inspect service-level metrics, and monitor latency and cost for every turn.
+
+1. Make sure the HTTP OTLP exporter is available (installed automatically with `pipecat-ai[tracing]`).
+2. Configure your environment variables for the Opik deployment you use:
+
+```bash wordWrap
+# Opik Cloud
+export OTEL_EXPORTER_OTLP_ENDPOINT=https://www.comet.com/opik/api/v1/private/otel
+export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=<your-workspace-name>,projectName=<your-project-name>'
+
+# Opik Enterprise
+export OTEL_EXPORTER_OTLP_ENDPOINT=https://<comet-deployment-url>/opik/api/v1/private/otel
+export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=<your-workspace-name>,projectName=<your-project-name>'
+
+# Self-hosted Opik
+export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:5173/api/v1/private/otel
+export OTEL_EXPORTER_OTLP_HEADERS='projectName=<your-project-name>'
+```
+
+3. Use the HTTP exporter without additional parameters—the headers and endpoint are picked up from the environment:
+
+```python
+from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
+from pipecat.utils.tracing.setup import setup_tracing
+
+setup_tracing(
+    service_name="pipecat-opik-demo",
+    exporter=OTLPSpanExporter(),
+)
+```
+
+Opik automatically groups spans by `conversation_id`, enriches them with per-service metrics (LLM token usage, TTS character counts, STT transcripts, TTFB), and records any errors raised by your pipeline. Visit the [Pipecat + Opik guide](https://www.comet.com/docs/opik/tracing/integrations/pipecat) for screenshots and a full walkthrough.
+
 ### Console Exporter (for debugging)
 
 The console exporter can be enabled alongside any other exporter by setting `console_export=True`:

From 7dc5b50863ae06d0abcfebc90c84a20c1d4e33fb Mon Sep 17 00:00:00 2001
From: Vincent Koc
Date: Fri, 3 Oct 2025 08:32:30 -0700
Subject: [PATCH 2/2] Update opentelemetry.mdx

---
 server/utilities/opentelemetry.mdx | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/server/utilities/opentelemetry.mdx b/server/utilities/opentelemetry.mdx
index d5ccb07..9943da6 100644
--- a/server/utilities/opentelemetry.mdx
+++ b/server/utilities/opentelemetry.mdx
@@ -153,7 +153,7 @@ See our [Langfuse example](https://github.com/pipecat-ai/pipecat-examples/tree/m
 
 ### Opik (LLM observability)
 
-[Opik](https://www.comet.com/opik/) is an observability, evaluation, and optimization platform for LLM and agent workloads. Pipecat's span hierarchy maps directly onto Opik's trace explorer, allowing you to replay conversations, inspect service-level metrics, and monitor latency and cost for every turn.
+[Opik](https://www.comet.com/opik/) is an observability, evaluation, and optimization platform for LLM and agent workloads. Pipecat's span hierarchy maps directly onto Opik's trace explorer, allowing you to replay conversations, inspect service-level metrics, and monitor latency and cost for every turn. See our [Opik example](https://github.com/pipecat-ai/pipecat-examples/tree/main/open-telemetry/opik) for a reference implementation.
 
 1. Make sure the HTTP OTLP exporter is available (installed automatically with `pipecat-ai[tracing]`).
 2. Configure your environment variables for the Opik deployment you use:
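
For reference, a minimal sketch of how the exporter configuration added by these patches plugs into a Pipecat app. The `console_export` flag comes from the console-exporter section referenced above; the `enable_tracing` and `conversation_id` keyword arguments on `PipelineTask` are assumptions based on Pipecat's tracing utilities and may differ between versions, and the empty `Pipeline([])` plus the `"customer-123"` id are placeholders, not a real processor chain.

```python
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

from pipecat.pipeline.pipeline import Pipeline
from pipecat.pipeline.task import PipelineParams, PipelineTask
from pipecat.utils.tracing.setup import setup_tracing

# Reads OTEL_EXPORTER_OTLP_ENDPOINT / OTEL_EXPORTER_OTLP_HEADERS from the
# environment (as configured for Opik above); console_export also mirrors
# spans to stdout for local debugging.
setup_tracing(
    service_name="pipecat-opik-demo",
    exporter=OTLPSpanExporter(),
    console_export=True,
)

# Stand-in pipeline: a real agent would list its transport, STT, LLM, and
# TTS processors here.
pipeline = Pipeline([])

# Assumed keyword arguments: `enable_tracing` turns span emission on for this
# task, and `conversation_id` is the value Opik groups spans under.
task = PipelineTask(
    pipeline,
    params=PipelineParams(enable_metrics=True),
    enable_tracing=True,
    conversation_id="customer-123",
)
```

With a setup along these lines, spans emitted while the task runs should appear in Opik grouped under the `conversation_id` passed to the task.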