This error is happening because the load balancing exporter is attempting to use the `service.name` resource attribute of the metric to route the metrics to their specified destination (`service.name` is the default routing key for metrics when none is specified, as described in the README). However, your metrics appear to lack the `service.name` resource attribute, which triggers this error.
Do you have a preferred routing key? If not, you could use `metric`, which simply routes by metric name. If you want to keep the default behavior, we'd have to investigate why the `service.name` resource attribute isn't included with your metrics.
You can use a debug exporter in place of the load balancing exporter to print detailed information about the incoming metrics and confirm the resource attribute is missing.
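As an illustrative sketch (not the actual configuration from the attachments; hostnames are assumed), a backend pipeline that routes by metric name instead of `service.name`, with a debug exporter alongside to inspect incoming resource attributes, might look like:

```yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317

exporters:
  # Prints each metric with its resource attributes, so a missing
  # service.name is easy to spot.
  debug:
    verbosity: detailed
  loadbalancing:
    # Route by metric name instead of the default service.name
    # resource attribute, avoiding the "unable to get service name" error.
    routing_key: metric
    protocol:
      otlp:
        tls:
          insecure: true
    resolver:
      static:
        # Placeholder hostnames; substitute the three destination
        # collectors from the docker-compose setup.
        hostnames:
          - collector-1:4317
          - collector-2:4317
          - collector-3:4317

service:
  pipelines:
    metrics:
      receivers: [otlp]
      exporters: [debug, loadbalancing]
```

Alternatively, keeping the default `routing_key: service` requires every incoming metric to carry a `service.name` resource attribute, which would have to be set on the producer side.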
Component(s)
exporter/loadbalancing
What happened?
Description
I am trying the load balancing exporter to learn its behavior and to build an OTEL HA setup.
I have an OTEL Collector (data producer) running in a Kubernetes cluster with an OTLP exporter sending logs and metrics to a backend OTEL Collector.
The backend OTEL environment runs in docker-compose with an OTEL Collector (otel-backend) that receives traffic from
the data producer and forwards it through a load balancing OTLP exporter for both metrics and logs.
When checking the metric behavior, I get errors sending metrics from the OTEL data producer to the OTEL backend, and no metrics are delivered.
Logs, by contrast, are sent without any issues or errors.
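The producer side described above boils down to an OTLP exporter pointed at the backend. A minimal sketch (endpoint and hostname are assumed, not taken from the attached configuration):

```yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317

exporters:
  otlp:
    # Address of the docker-compose backend collector; hostname assumed.
    endpoint: otel-backend:4317
    tls:
      insecure: true

service:
  pipelines:
    metrics:
      receivers: [otlp]
      exporters: [otlp]
    logs:
      receivers: [otlp]
      exporters: [otlp]
```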
Steps to Reproduce
Send OTEL metrics to the OTEL Collector backend.
The backend has a load balancing exporter that sends metric resources to three OTEL Collectors.
Expected Result
Metrics are sent from the data producer to the backend OTEL Collector and distributed by the load balancing exporter.
Actual Result
Metric resources are not sent to the OTEL backend and the following error is printed:
{"kind": "exporter", "data_type": "metrics", "name": "debug/1"}
2024-03-28T14:43:47.632Z error exporterhelper/retry_sender.go:126 Exporting failed. The error is not retryable. Dropping data. {"kind": "exporter", "data_type": "metrics", "name": "otlp/4568d48e-9c24-414e-97ec-3cf30282d9d7", "error": "Permanent error: rpc error: code = Unknown desc = unable to get service name; unable to get service name; [… same message repeated many times …]", "dropped_items": 116}
go.opentelemetry.io/collector/exporter/exporterhelper.(*retrySender).send
go.opentelemetry.io/collector/exporter@v0.91.0/exporterhelper/retry_sender.go:126
go.opentelemetry.io/collector/exporter/exporterhelper.(*metricsSenderWithObservability).send
go.opentelemetry.io/collector/exporter@v0.91.0/exporterhelper/metrics.go:170
go.opentelemetry.io/collector/exporter/exporterhelper.(*queueSender).consume
go.opentelemetry.io/collector/exporter@v0.91.0/exporterhelper/queue_sender.go:115
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue[...]).Consume
go.opentelemetry.io/collector/exporter@v0.91.0/exporterhelper/internal/bounded_memory_queue.go:55
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*QueueConsumers[...]).Start.func1
go.opentelemetry.io/collector/exporter@v0.91.0/exporterhelper/internal/consumers.go:43
2024-03-28T14:43:49.415Z info MetricsExporter {"kind": "exporter", "data_type": "metrics", "name": "debug/1", "resource metrics": 75, "metrics": 75, "data points": 75}
Collector version
opentelemetry-collector-contrib:0.91.0
Environment information
DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=22.04
DISTRIB_CODENAME=jammy
DISTRIB_DESCRIPTION="Ubuntu 22.04.3 LTS"
OpenTelemetry Collector configuration
Log output
Additional context
Here is the backend OTEL Collector configuration
backend_otel_config.yaml.json
The docker-compose layout for the backend service runs one otel-collector as the backend and three OTEL Collectors as the destinations of the backend's static load balancer.
docker-compose.yaml.json
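The attached docker-compose file is not reproduced above; a minimal sketch of the described layout (one backend plus three destinations; service names and image tag are assumed) could be:

```yaml
services:
  otel-backend:
    # Receives OTLP traffic from the Kubernetes data producer and
    # load-balances it across the three collectors below.
    image: otel/opentelemetry-collector-contrib:0.91.0
    volumes:
      - ./backend_otel_config.yaml:/etc/otelcol-contrib/config.yaml
    ports:
      - "4317:4317"
  collector-1:
    image: otel/opentelemetry-collector-contrib:0.91.0
  collector-2:
    image: otel/opentelemetry-collector-contrib:0.91.0
  collector-3:
    image: otel/opentelemetry-collector-contrib:0.91.0
```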
Some notes:
When changing the OTEL Collector (data producer) to send metrics directly to one of the three destination OTEL Collectors (adomock containers), there are no errors exporting metrics.