diff --git a/observability/otel/otel-collector/otel-collector-exporters.adoc b/observability/otel/otel-collector/otel-collector-exporters.adoc
index fb7038048202..1a3fd2732d02 100644
--- a/observability/otel/otel-collector/otel-collector-exporters.adoc
+++ b/observability/otel/otel-collector/otel-collector-exporters.adoc
@@ -8,6 +8,184 @@ toc::[]
 
 Exporters send data to one or more back ends or destinations. An exporter can be push or pull based. By default, no exporters are configured. One or more exporters must be configured. Exporters can support one or more data sources. Exporters might be used with their default settings, but many exporters require configuration to specify at least the destination and security settings.
 
+[id="kafka-exporter_{context}"]
+== Kafka Exporter
+
+The Kafka Exporter exports logs, metrics, and traces to Kafka. This exporter uses a synchronous producer that blocks and does not batch messages. You must use it with batch and queued retry processors for higher throughput and resiliency.
+
+:FeatureName: The Kafka Exporter
+include::snippets/technology-preview.adoc[]
+
+.OpenTelemetry Collector custom resource with an enabled Kafka Exporter
+[source,yaml]
+----
+# ...
+  config: |
+    exporters:
+      kafka:
+        brokers: ["localhost:9092"]
+        protocol_version: 2.0.0
+        topic: otlp_spans
+        auth:
+          plain_text:
+            username: example
+            password: example
+          tls:
+            ca_file: ca.pem
+            cert_file: cert.pem
+            key_file: key.pem
+            insecure: false
+            server_name_override: kafka.example.corp
+        retry_on_failure:
+          enabled: true
+          initial_interval: 5s
+          max_interval: 30s
+          max_elapsed_time: 300s
+        sending_queue:
+          enabled: true
+          num_consumers: 10
+          queue_size: 1000
+        producer:
+          max_message_bytes: 1000000
+          required_acks: 1
+          compression: none
+          flush_max_messages: 0
+    service:
+      pipelines:
+        traces:
+          exporters: [kafka]
+# ...
+----
+
+.Parameters used by the Kafka Exporter
+[options="header"]
+[cols="a,a,a"]
+|===
+|Parameter |Description |Default
+
+|`brokers`
+|The list of Kafka brokers
+|`["localhost:9092"]`
+
+|`protocol_version`
+|The Kafka protocol version. This is a required field
+|`2.0.0`
+
+|`topic`
+|The name of the Kafka topic to export to
+|`otlp_spans` for traces, `otlp_metrics` for metrics, and `otlp_logs` for logs
+
+|`auth.plain_text`
+|The plain text authentication configuration. If omitted, plain text authentication is disabled
+|N/A
+
+|`auth.tls`
+|The client-side TLS configuration, defining paths to the TLS certificates. If omitted, TLS authentication is disabled
+|N/A
+
+|`auth.tls.insecure`
+|Disables verifying the server's certificate chain and host name
+|`false`
+
+|`auth.tls.server_name_override`
+|The name of the server requested by the client, used to support virtual hosting
+|N/A
+
+|`retry_on_failure.enabled`
+|Enables the retry mechanism
+|`true`
+
+|`retry_on_failure.initial_interval`
+|The time to wait after the first failure before retrying
+|`5s`
+
+|`retry_on_failure.max_interval`
+|The upper bound on the retry backoff interval
+|`30s`
+
+|`retry_on_failure.max_elapsed_time`
+|The maximum time spent trying to send a batch. A value of `0` means retries never stop
+|`300s`
+
+|`sending_queue.enabled`
+|Enables the sending queue
+|`true`
+
+|`sending_queue.num_consumers`
+|The number of consumers that dequeue batches
+|`10`
+
+|`sending_queue.queue_size`
+|The maximum number of batches kept in memory before data is dropped
+|`1000`
+
+|`producer.max_message_bytes`
+|The maximum allowed message size in bytes
+|`1000000`
+
+|`producer.required_acks`
+|Determines when a message is considered transmitted
+|`1`
+
+|`producer.compression`
+|The compression method used when producing messages to Kafka. Supported values are `none`, `gzip`, `snappy`, `lz4`, and `zstd`
+|`none`
+
+|`producer.flush_max_messages`
+|The maximum number of messages the producer sends in a single broker request
+|`0`
+|===
+
+=== Troubleshooting the Kafka Exporter
+
+If you encounter issues with the Kafka Exporter, the following steps can help you diagnose and resolve common problems.
+
+==== Messages fail to reach the Kafka topic
+
+If messages are not being exported to the Kafka topic, check the following:
+
+.Procedure
+
+- Verify the Kafka broker connection: ensure that the `brokers` parameter lists the correct Kafka broker addresses and that the brokers are reachable from the OpenTelemetry Collector instance.
+
+- Check the `topic`: ensure that the Kafka topic name specified in the `topic` field matches an existing topic on the Kafka server. The default topics are `otlp_spans`, `otlp_metrics`, and `otlp_logs`.
+
+- Ensure correct pipeline setup: confirm that the Kafka Exporter is listed as an exporter in the appropriate pipeline in the `service.pipelines` section of the OpenTelemetry Collector configuration.
+
+==== High latency or performance issues
+
+If the Kafka Exporter is experiencing high latency or performance issues, consider the following adjustments. A sample configuration follows the list.
+
+.Procedure
+
+- Review the `sending_queue.num_consumers` parameter value: increasing the number of consumers can improve throughput if the default value is insufficient for the telemetry data volume.
+
+- Check `queue_size`: ensure that the `sending_queue.queue_size` value is large enough to handle the volume of telemetry data without dropping messages.
+
+- Optimize retry settings: adjust the `retry_on_failure` parameters to tune the retry intervals (`initial_interval`, `max_interval`) and the total time spent retrying (`max_elapsed_time`). Long retry intervals can increase latency when retries are frequent.
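+
+For example, the following settings increase the queue capacity and the number of queue consumers, and shorten the retry window. These values are illustrative starting points rather than recommendations; adjust them to match your telemetry volume and the capacity of your Kafka cluster.
+
+.Example: tuning the sending queue and retry settings
+[source,yaml]
+----
+# ...
+  config: |
+    exporters:
+      kafka:
+        brokers: ["localhost:9092"]
+        protocol_version: 2.0.0
+        sending_queue:
+          enabled: true
+          num_consumers: 20 # more consumers than the default 10 to drain the queue faster
+          queue_size: 5000 # a larger buffer than the default 1000 to absorb bursts
+        retry_on_failure:
+          enabled: true
+          initial_interval: 5s
+          max_interval: 15s # a lower cap than the default 30s so retries happen sooner
+          max_elapsed_time: 120s # give up earlier than the default 300s to limit backlog
+# ...
+----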
+
+==== Authentication failures
+
+If authentication errors occur, verify the following:
+
+.Procedure
+
+- Plain text authentication configuration: if you use plain text authentication, ensure that the `auth.plain_text.username` and `auth.plain_text.password` fields contain valid credentials.
+
+- TLS configuration: if you use TLS, confirm that the paths to the TLS certificates (`auth.tls.ca_file`, `auth.tls.cert_file`, `auth.tls.key_file`) are correct and accessible. Verify that `auth.tls.insecure` is set to `false` if you want to enforce certificate validation.
+
+- Server name mismatch: ensure that `auth.tls.server_name_override` is set to the correct server name. A mismatch between the server name and the certificate can cause authentication errors.
+
+==== Message size exceeds limit
+
+If messages fail to send because they exceed the allowed size, check the following. A sample configuration follows the list.
+
+.Procedure
+
+- Increase `max_message_bytes`: the `producer.max_message_bytes` parameter controls the maximum size of messages sent to Kafka. Increase this value if your telemetry data requires larger messages. The default is 1 MB (`1000000` bytes).
+
+- Enable compression: consider setting `producer.compression` to a compression algorithm such as `gzip` or `lz4` to reduce the size of messages sent to Kafka.
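+
+For example, the following configuration doubles the producer message size limit and enables `gzip` compression. The `2000000` value is illustrative; size the limit to your largest expected batch. The Kafka broker also enforces its own maximum message size, such as the `message.max.bytes` broker setting, which might need to be raised to match.
+
+.Example: raising the message size limit and enabling compression
+[source,yaml]
+----
+# ...
+  config: |
+    exporters:
+      kafka:
+        brokers: ["localhost:9092"]
+        protocol_version: 2.0.0
+        producer:
+          max_message_bytes: 2000000 # 2 MB instead of the default 1 MB; illustrative value
+          compression: gzip # compress payloads to reduce message size on the wire
+# ...
+----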
+
 [id="otlp-exporter_{context}"]
 == OTLP Exporter
 
@@ -231,48 +409,6 @@ include::snippets/technology-preview.adoc[]
 * You must enable the `--web.enable-remote-write-receiver` feature flag on the remote Prometheus instance. Without it, pushing the metrics to the instance using this exporter fails.
 ====
 
-[id="kafka-exporter_{context}"]
-== Kafka Exporter
-
-The Kafka Exporter exports logs, metrics, and traces to Kafka. This exporter uses a synchronous producer that blocks and does not batch messages. You must use it with batch and queued retry processors for higher throughput and resiliency.
-
-:FeatureName: The Kafka Exporter
-include::snippets/technology-preview.adoc[]
-
-.OpenTelemetry Collector custom resource with an enabled Kafka Exporter
-[source,yaml]
-----
-# ...
-  config: |
-    exporters:
-      kafka:
-        brokers: ["localhost:9092"] # <1>
-        protocol_version: 2.0.0 # <2>
-        topic: otlp_spans # <3>
-        auth:
-          plain_text: # <4>
-            username: example
-            password: example
-          tls: # <5>
-            ca_file: ca.pem
-            cert_file: cert.pem
-            key_file: key.pem
-            insecure: false # <6>
-            server_name_override: kafka.example.corp # <7>
-    service:
-      pipelines:
-        traces:
-          exporters: [kafka]
-# ...
-----
-<1> The list of Kafka brokers. The default is `+localhost:9092+`.
-<2> The Kafka protocol version. For example, `+2.0.0+`. This is a required field.
-<3> The name of the Kafka topic to read from. The following are the defaults: `+otlp_spans+` for traces, `+otlp_metrics+` for metrics, `+otlp_logs+` for logs.
-<4> The plain text authentication configuration. If omitted, plain text authentication is disabled.
-<5> The client-side TLS configuration. Defines paths to the TLS certificates. If omitted, TLS authentication is disabled.
-<6> Disables verifying the server's certificate chain and host name. The default is `+false+`.
-<7> ServerName indicates the name of the server requested by the client to support virtual hosting.
-
 [role="_additional-resources"]
 [id="additional-resources_otel-collector-exporters_{context}"]
 == Additional resources