
Exponential Histogram not exposed by the Collector #25146

Closed
fmhwong opened this issue Aug 10, 2023 · 6 comments
Labels
bug (Something isn't working), exporter/prometheus

Comments


fmhwong commented Aug 10, 2023

Component(s)

exporter/prometheus

What happened?

Description

Steps to Reproduce

        AutoConfiguredOpenTelemetrySdk sdk = AutoConfiguredOpenTelemetrySdk
                .builder()
                .build();

        // Obtain a Meter from the autoconfigured SDK (assumed; the original
        // snippet did not show where `meter` comes from)
        Meter meter = sdk.getOpenTelemetrySdk().getMeter("car-example");

        DoubleHistogram carSpeedSummary = meter
                .histogramBuilder("car.speedSummary")
                .setDescription("Summary of car speed")
                .setUnit("km/h")
                .build();

        // Record a sample value with an attribute, matching the data point
        // attributes seen in the Collector log below
        carSpeedSummary.record(100.0,
                Attributes.of(AttributeKey.stringKey("driver"), "Don"));

I can see the histogram output at http://localhost:8889/metrics if I use the default EXPLICIT_BUCKET_HISTOGRAM aggregation.

If I set the environment variable
OTEL_EXPORTER_OTLP_METRICS_DEFAULT_HISTOGRAM_AGGREGATION=BASE2_EXPONENTIAL_BUCKET_HISTOGRAM, I don't see any histogram output at http://localhost:8889/metrics.
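
For illustration, a hypothetical docker-compose fragment showing how the variable can be passed to the Java application in this Docker setup (the service name, image, and collector endpoint are placeholders, not from this report):

services:
  car-app:
    image: car-app:latest        # placeholder image
    environment:
      # Switch the OTLP metrics exporter's default histogram aggregation
      OTEL_EXPORTER_OTLP_METRICS_DEFAULT_HISTOGRAM_AGGREGATION: BASE2_EXPONENTIAL_BUCKET_HISTOGRAM
      # Point the SDK at the Collector's OTLP gRPC receiver
      OTEL_EXPORTER_OTLP_ENDPOINT: http://otel-collector:4317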

The collector-contrib log did show the exponential histogram:

Metric #1
Descriptor:
     -> Name: car.speedSummary
     -> Description: Summary of car speed
     -> Unit: km/h
     -> DataType: ExponentialHistogram
     -> AggregationTemporality: Cumulative
ExponentialHistogramDataPoints #0
Data point attributes:
     -> driver: Str(Don)
StartTimestamp: 2023-08-10 15:49:54.862805415 +0000 UTC
Timestamp: 2023-08-10 15:50:19.84214473 +0000 UTC
Count: 24
Sum: 1959.000000
Min: 0.000000
Max: 129.000000
Bucket [0, 0], Count: 5
Bucket (22.873813, 23.122893], Count: 1
Bucket (23.122893, 23.374685], Count: 0
Bucket (23.374685, 23.629218], Count: 0
Bucket (23.629218, 23.886524], Count: 0
Bucket (23.886524, 24.146631], Count: 0
Bucket (24.146631, 24.409570], Count: 0
Bucket (24.409570, 24.675373], Count: 0
Bucket (24.675373, 24.944070], Count: 0
Bucket (24.944070, 25.215694], Count: 0
Bucket (25.215694, 25.490274], Count: 0
Bucket (25.490274, 25.767845], Count: 0

Expected Result

Expected to see the exponential histogram in the Prometheus endpoint from the Collector.

Actual Result

The histogram was not exposed at the Collector's Prometheus endpoint.

Collector version

prometheusexporter@v0.82.0

Environment information

Environment

OS: Docker on Mac OS
Compiler (if manually compiled): (e.g., "go 14.2")

OpenTelemetry Collector configuration

receivers:
  otlp:
    protocols:
      grpc:

exporters:
  prometheus:
    endpoint: "0.0.0.0:8889"
  logging:
    loglevel: debug

processors:
  batch:

service:
  pipelines:
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [prometheus, logging]

Log output

StartTimestamp: 2023-08-10 15:49:54.704889957 +0000 UTC
Timestamp: 2023-08-10 15:50:04.716748635 +0000 UTC
Value: 0
	{"kind": "exporter", "data_type": "metrics", "name": "logging"}
2023-08-10T15:50:04.991Z	error	prometheusexporter@v0.82.0/accumulator.go:94	failed to translate metric	{"kind": "exporter", "data_type": "metrics", "name": "prometheus", "data_type": "\u0004", "metric_name": "car.speedSummary"}
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/prometheusexporter.(*lastValueAccumulator).addMetric
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/prometheusexporter@v0.82.0/accumulator.go:94
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/prometheusexporter.(*lastValueAccumulator).Accumulate
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/prometheusexporter@v0.82.0/accumulator.go:71
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/prometheusexporter.(*collector).processMetrics
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/prometheusexporter@v0.82.0/collector.go:92
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/prometheusexporter.(*prometheusExporter).ConsumeMetrics
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/prometheusexporter@v0.82.0/prometheus.go:85
go.opentelemetry.io/collector/exporter/exporterhelper.(*metricsRequest).Export
	go.opentelemetry.io/collector/exporter@v0.82.0/exporterhelper/metrics.go:54
go.opentelemetry.io/collector/exporter/exporterhelper.(*timeoutSender).send
	go.opentelemetry.io/collector/exporter@v0.82.0/exporterhelper/common.go:197
go.opentelemetry.io/collector/exporter/exporterhelper.(*retrySender).send
	go.opentelemetry.io/collector/exporter@v0.82.0/exporterhelper/queued_retry.go:355
go.opentelemetry.io/collector/exporter/exporterhelper.(*metricsSenderWithObservability).send
	go.opentelemetry.io/collector/exporter@v0.82.0/exporterhelper/metrics.go:125
go.opentelemetry.io/collector/exporter/exporterhelper.(*queuedRetrySender).send
	go.opentelemetry.io/collector/exporter@v0.82.0/exporterhelper/queued_retry.go:291
go.opentelemetry.io/collector/exporter/exporterhelper.NewMetricsExporter.func2
	go.opentelemetry.io/collector/exporter@v0.82.0/exporterhelper/metrics.go:105
go.opentelemetry.io/collector/consumer.ConsumeMetricsFunc.ConsumeMetrics
	go.opentelemetry.io/collector/consumer@v0.82.0/metrics.go:25
go.opentelemetry.io/collector/internal/fanoutconsumer.(*metricsConsumer).ConsumeMetrics
	go.opentelemetry.io/collector@v0.82.0/internal/fanoutconsumer/metrics.go:69
go.opentelemetry.io/collector/processor/batchprocessor.(*batchMetrics).export
	go.opentelemetry.io/collector/processor/batchprocessor@v0.82.0/batch_processor.go:442
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).sendItems
	go.opentelemetry.io/collector/processor/batchprocessor@v0.82.0/batch_processor.go:256
go.opentelemetry.io/collector/processor/batchprocessor.(*shard).start
	go.opentelemetry.io/collector/processor/batchprocessor@v0.82.0/batch_processor.go:218

Additional context

No response

fmhwong added the bug (Something isn't working) and needs triage (New item requiring triage) labels on Aug 10, 2023
github-actions (Contributor)

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

mx-psi (Member) commented Aug 11, 2023

In case it is useful, the PRW (Prometheus remote write) exporter has supported exponential histograms since #17370.
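
A minimal sketch of such a pipeline (not taken from this issue; the remote write endpoint is a placeholder, and additional Prometheus-side configuration may still be required to ingest exponential/native histograms):

exporters:
  prometheusremotewrite:
    endpoint: "http://prometheus:9090/api/v1/write"   # placeholder endpoint

service:
  pipelines:
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [prometheusremotewrite, logging]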

crobert-1 (Member)

@fmhwong Did the prometheus remote write exporter accomplish what you were looking for here, or is there something else needed?

fmhwong (Author) commented Sep 13, 2023

@crobert-1 The problem is in the Prometheus exporter. Will that be fixed too?

crobert-1 (Member)

You're right, this should be investigated further for this exporter, thanks!

crobert-1 (Member)

@fmhwong It looks like this is a duplicate of #13443. There's more discussion there, so I'm going to close this issue and comment there as well.

crobert-1 removed the needs triage (New item requiring triage) label on Oct 27, 2023
crobert-1 closed this as not planned on Oct 27, 2023