feat: Add OpenTelemetry Analytics Provider #6117
rohan-patil2 wants to merge 1 commit into FlowiseAI:main from
Conversation
Code Review
This pull request introduces OpenTelemetry (OTEL) integration for analytics, enabling the export of telemetry data to OTLP-compliant backends. Key additions include a new credential type with vendor presets, an OTEL analytic node, and a robust backend implementation featuring a tracer provider pool with LRU eviction and a LangChain callback handler. The integration is supported by new UI configuration components and extensive unit and integration tests. Feedback identifies an issue in the analytic handler where tool outputs are double-encoded as JSON and MIME types are incorrectly hardcoded, which should be corrected to ensure accurate data representation.
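The "tracer provider pool with LRU eviction" mentioned above can be sketched roughly as follows. This is an illustrative sketch, not the PR's actual implementation: the names `TracerProviderPool`, `maxSize`, and the minimal `Provider` interface are hypothetical, and a real OTEL tracer provider would be created from the SDK rather than a factory callback.

```typescript
// Illustrative sketch of a per-chatflow provider pool with LRU eviction.
// `Provider` stands in for an OTEL tracer provider; shutdown() would flush
// and release the underlying exporter.
interface Provider {
    shutdown(): void
}

class TracerProviderPool {
    // Map preserves insertion order, so the first key is the least recently used.
    private pool = new Map<string, Provider>()

    constructor(private maxSize: number, private create: (id: string) => Provider) {}

    get(chatflowId: string): Provider {
        const existing = this.pool.get(chatflowId)
        if (existing) {
            // Delete and re-insert to mark this entry as most recently used.
            this.pool.delete(chatflowId)
            this.pool.set(chatflowId, existing)
            return existing
        }
        if (this.pool.size >= this.maxSize) {
            // Evict the least recently used provider and release its resources.
            const [oldestId, oldest] = this.pool.entries().next().value!
            oldest.shutdown()
            this.pool.delete(oldestId)
        }
        const provider = this.create(chatflowId)
        this.pool.set(chatflowId, provider)
        return provider
    }

    get size(): number {
        return this.pool.size
    }
}
```

Keeping the pool bounded matters because each provider holds an exporter with network resources; evicting by recency keeps hot chatflows cheap while idle ones are shut down cleanly.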
if (isRetrieval) {
    const outputStr = typeof output === 'string' ? output : JSON.stringify(output)
    let numResults: number | undefined
    try {
        const parsed = typeof output === 'string' ? JSON.parse(output) : output
        if (Array.isArray(parsed)) {
            numResults = parsed.length
        }
    } catch {
        // output is not parseable; leave numResults undefined
    }

    toolSpan.setAttribute('output.value', outputStr)
    toolSpan.setAttribute('output.mime_type', 'application/json')
    toolSpan.setAttribute('retrieval.documents', outputStr)
    if (numResults !== undefined) {
        toolSpan.setAttribute('retrieval.num_results', numResults)
    }
    if (startTime !== undefined) {
        toolSpan.setAttribute('retrieval.latency_ms', Date.now() - startTime)
    }
    delete this.handlers['openTelemetry'].retrievalSpanIds?.[spanId]
} else {
    toolSpan.setAttribute('output.value', JSON.stringify(output))
    toolSpan.setAttribute('output.mime_type', 'application/json')
    toolSpan.setAttribute('tool.output', JSON.stringify(output))
    if (startTime !== undefined) {
        toolSpan.setAttribute('tool.latency_ms', Date.now() - startTime)
    }
}
The current implementation for handling tool output applies JSON.stringify(output) even when output is already a string, which double-encodes JSON string outputs. Additionally, output.mime_type is hardcoded to application/json, which is incorrect for plain string outputs. This can be fixed by checking the type of output once and deriving outputStr and mimeType from it:
const isString = typeof output === 'string'
const outputStr = isString ? output : JSON.stringify(output)
const mimeType = isString ? 'text/plain' : 'application/json'
if (isRetrieval) {
let numResults: number | undefined
try {
const parsed = isString ? JSON.parse(output) : output
if (Array.isArray(parsed)) {
numResults = parsed.length
}
} catch {
// output is not a valid JSON string; leave numResults undefined
}
toolSpan.setAttribute('output.value', outputStr)
toolSpan.setAttribute('output.mime_type', mimeType)
toolSpan.setAttribute('retrieval.documents', outputStr)
if (numResults !== undefined) {
toolSpan.setAttribute('retrieval.num_results', numResults)
}
if (startTime !== undefined) {
toolSpan.setAttribute('retrieval.latency_ms', Date.now() - startTime)
}
delete this.handlers['openTelemetry'].retrievalSpanIds?.[spanId]
} else {
toolSpan.setAttribute('output.value', outputStr)
toolSpan.setAttribute('output.mime_type', mimeType)
toolSpan.setAttribute('tool.output', outputStr)
if (startTime !== undefined) {
toolSpan.setAttribute('tool.latency_ms', Date.now() - startTime)
}
}
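To see why the original branch double-encodes, note that JSON.stringify applied to a value that is already a JSON string wraps it in another layer of quotes. A minimal standalone illustration (the `toolOutput` value is made up, not from the PR):

```typescript
// A tool that already returns a serialized JSON document as a string.
const toolOutput = '{"answer": 42}'

// Original behavior: stringify unconditionally. The string content gets
// re-encoded, so the span attribute carries escaped quotes instead of the
// original document.
const doubleEncoded = JSON.stringify(toolOutput)
// doubleEncoded is now the string: "{\"answer\": 42}"

// Suggested behavior: pass strings through untouched, stringify only objects.
const passThrough = typeof toolOutput === 'string' ? toolOutput : JSON.stringify(toolOutput)
// passThrough is the original string: {"answer": 42}
```

A backend that treats `output.value` as JSON would parse `doubleEncoded` into a plain string rather than an object, which is why the pass-through form is the correct one.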
Summary
This PR adds OpenTelemetry (OTEL) analytics provider to Flowise, enabling users to export distributed traces from chatflow and agentflow executions to any OTLP-compatible observability backend (New Relic, Datadog, Grafana Cloud, or any custom endpoint).
The integration follows OpenInference semantic conventions for LLM observability and provides full lifecycle tracing, covering LLM calls, tool invocations, retriever operations, chain executions, and agent actions, along with token usage, latency metrics, and rich span attributes.
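As a concrete picture of the "rich span attributes" mentioned above, the attribute keys below are the ones visible in the reviewed diff, grouped by operation kind. The sample values are made up for illustration; a plain record is enough to show the shape:

```typescript
// Attribute keys set by the analytic handler for a retrieval tool call
// (keys taken from the diff; values are illustrative placeholders).
const retrievalAttributes: Record<string, string | number> = {
    'output.value': '[{"pageContent": "..."}]',   // serialized documents
    'output.mime_type': 'application/json',
    'retrieval.documents': '[{"pageContent": "..."}]',
    'retrieval.num_results': 1,
    'retrieval.latency_ms': 12
}

// Attribute keys for a plain (non-retrieval) tool call. Per the suggested
// fix, string outputs should be tagged text/plain rather than application/json.
const toolAttributes: Record<string, string | number> = {
    'output.value': 'tool result text',
    'output.mime_type': 'text/plain',
    'tool.output': 'tool result text',
    'tool.latency_ms': 8
}
```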
Motivation
Flowise already supports analytics providers like LangSmith, Langfuse, LunaryAI, and Opik, but lacked support for the vendor-neutral OpenTelemetry standard. Many enterprise teams already run OTLP-compatible backends (New Relic, Datadog, Grafana Tempo, etc.) and need a way to funnel LLM observability data into their existing infrastructure without adopting yet another proprietary tool.
This enhancement fills that gap by adding OpenTelemetry alongside the existing providers, spanning the full stack: credential management, UI configuration, backend handler lifecycle, and per-chatflow tracer pooling.
Provider Setup:

Credential Setup:
