Prometheusremotewrite empty metrics error #10364

Closed
jjcollinge opened this issue May 26, 2022 · 5 comments · Fixed by #12489
Labels
bug (Something isn't working) · comp: exporter (Exporter) · comp:prometheus (Prometheus related issues) · priority:p3 (Lowest)

Comments

@jjcollinge

Describe the bug
When a receiver scrape returns empty metrics and the prometheusremotewrite exporter is enabled, the exporter returns an error instead of simply dropping the empty batch. With retries enabled, these errors block valid metrics from being exported and degrade our entire system. If I disable retries, the other valid metrics get through OK, but we then get the error Exporting failed: Try enabling retry_on_failure config option to retry on retryable errors from queued_retry.go.
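
For reference, this is roughly how we disable retries on the exporter (a minimal config sketch; the endpoint is a placeholder for our Cortex remote-write URL):

exporters:
  prometheusremotewrite:
    endpoint: "http://cortex:8080/api/v1/push"
    retry_on_failure:
      enabled: false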

The following receivers return empty metrics when there is nothing to scrape, which triggers this issue:

- podmanreceiver
- dockerstatsreceiver

Steps to reproduce
Use either podmanreceiver or dockerstatsreceiver in an environment where there are no containers.

What did you expect to see?
Empty metrics are dropped, and no error is reported that affects the export of valid metrics.

What did you see instead?
Errors from the prometheusremotewrite exporter that, because of the intensive retries, prevent other valid metrics from being exported correctly.

What version did you use?
v0.52

jjcollinge added the bug label on May 26, 2022
@jjcollinge
Author

I tried this with codegen-based receivers and got the same error in an environment where no metrics can be recorded.

dmitryax added the comp:prometheus and comp: exporter labels on May 27, 2022
@dmitryax
Member

It looks like the collector cannot reach the Prometheus server. Can you confirm the exporter is configured correctly and that the Prometheus server is accessible from the collector?

dmitryax added the priority:p3 label on May 27, 2022
@jjcollinge
Author

jjcollinge commented May 27, 2022

Thanks for the response @dmitryax.

So the error we get from the prometheusremotewrite exporter when running with no containers and without disabling the retries is:

2022-04-26T05:47:01.878Z	info	exporterhelper/queued_retry.go:215	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "name": "prometheusremotewrite", "error": "invalid tsMap: cannot be empty map", "interval": "188.256732ms"}

In the prometheusremotewrite exporter code you can see this error occurs in the batching step, before it attempts to send the data.

We also have metrics collected by other receivers (which do have data) exported via the same prometheusremotewrite exporter, so we know the connection to Cortex is working. The exporter also works perfectly with the same config when there are running containers it can collect metrics from. As a workaround we currently deploy a dummy container so that some metrics are always available; this avoids the error and everything works as expected, but it isn't ideal.
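
For context, the check that produces this error looks roughly like the following (a paraphrased sketch of the exporter's batching helper, not a verbatim copy; the exact file and signature may differ between versions):

package prometheusremotewriteexporter

import (
    "errors"

    "github.com/prometheus/prometheus/prompb"
)

// batchTimeSeries splits the collected time series into remote-write requests.
// Sketch: an empty map is treated as an error rather than a no-op, and that
// error is what queued_retry.go keeps retrying.
func batchTimeSeries(tsMap map[string]*prompb.TimeSeries, maxBatchByteSize int) ([]*prompb.WriteRequest, error) {
    if len(tsMap) == 0 {
        return nil, errors.New("invalid tsMap: cannot be empty map")
    }
    // ... batching of the non-empty map happens here ...
    return nil, nil
}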

@jjcollinge
Author

jjcollinge commented May 31, 2022

By the way, this can be reproduced easily using your latest distributions.

Using the v0.52.0 release binary and the following config (I'm excluding all my running Docker images to get 0 metrics):

extensions:
  health_check:

receivers:
  docker_stats:
    endpoint: unix:///var/run/docker.sock
    collection_interval: 2s
    timeout: 20s
    api_version: 1.24
    excluded_images:
      - prom/prometheus
      - grafana/grafana
      - openzipkin/zipkin
      - daprio/dapr:1.7.2
      - redis
    provide_per_core_cpu_metrics: true

processors:

exporters:
  prometheusremotewrite:
    endpoint: "http://127.0.0.1:8080/api/v1/push"

service:
  telemetry:
    logs:
      level: debug
  extensions: [health_check]
  pipelines:
    metrics:
      receivers: [docker_stats]
      processors: []
      exporters: [prometheusremotewrite]

This results in the following log:

./otelcol-contrib --config cfg/config.yaml                                 22:35:21
2022/05/31 22:35:35 proto: duplicate proto type registered: jaeger.api_v2.PostSpansRequest
2022/05/31 22:35:35 proto: duplicate proto type registered: jaeger.api_v2.PostSpansResponse
2022-05-31T22:35:35.426+0100	info	builder/exporters_builder.go:255	Exporter was built.	{"kind": "exporter", "name": "prometheusremotewrite"}
2022-05-31T22:35:35.427+0100	info	builder/pipelines_builder.go:224	Pipeline was built.	{"kind": "pipeline", "name": "metrics"}
2022-05-31T22:35:35.427+0100	info	builder/receivers_builder.go:225	Receiver was built.	{"kind": "receiver", "name": "docker_stats", "datatype": "metrics"}
2022-05-31T22:35:35.427+0100	info	service/telemetry.go:110	Setting up own telemetry...
2022-05-31T22:35:35.427+0100	info	service/telemetry.go:130	Serving Prometheus metrics	{"address": ":8888", "level": "basic", "service.instance.id": "12c5aca1-356a-4084-8951-8fa80f3c1e13", "service.version": "latest"}
2022-05-31T22:35:35.427+0100	info	service/service.go:74	Starting extensions...
2022-05-31T22:35:35.427+0100	info	extensions/extensions.go:38	Extension is starting...	{"kind": "extension", "name": "health_check"}
2022-05-31T22:35:35.427+0100	info	healthcheckextension@v0.52.0/healthcheckextension.go:44	Starting health_check extension	{"kind": "extension", "name": "health_check", "config": {"Port":0,"TCPAddr":{"Endpoint":"0.0.0.0:13133"},"Path":"/","CheckCollectorPipeline":{"Enabled":false,"Interval":"5m","ExporterFailureThreshold":5}}}
2022-05-31T22:35:35.428+0100	info	extensions/extensions.go:42	Extension started.	{"kind": "extension", "name": "health_check"}
2022-05-31T22:35:35.428+0100	info	service/service.go:79	Starting exporters...
2022-05-31T22:35:35.428+0100	info	builder/exporters_builder.go:40	Exporter is starting...	{"kind": "exporter", "name": "prometheusremotewrite"}
2022-05-31T22:35:35.429+0100	info	builder/exporters_builder.go:48	Exporter started.	{"kind": "exporter", "name": "prometheusremotewrite"}
2022-05-31T22:35:35.429+0100	info	service/service.go:84	Starting processors...
2022-05-31T22:35:35.429+0100	info	builder/pipelines_builder.go:54	Pipeline is starting...	{"kind": "pipeline", "name": "metrics"}
2022-05-31T22:35:35.429+0100	info	builder/pipelines_builder.go:65	Pipeline is started.	{"kind": "pipeline", "name": "metrics"}
2022-05-31T22:35:35.429+0100	info	service/service.go:89	Starting receivers...
2022-05-31T22:35:35.429+0100	info	builder/receivers_builder.go:67	Receiver is starting...	{"kind": "receiver", "name": "docker_stats"}
2022-05-31T22:35:35.440+0100	debug	docker@v0.52.0/docker.go:121	Not monitoring container per ExcludedImages	{"kind": "receiver", "name": "docker_stats", "image": "grafana/grafana", "id": "62c115dad4a38987c59f1762616b03c9d4bdb46c6c1dc57bb0a2f6833ea1dfd9"}
2022-05-31T22:35:35.440+0100	debug	docker@v0.52.0/docker.go:121	Not monitoring container per ExcludedImages	{"kind": "receiver", "name": "docker_stats", "image": "redis", "id": "bc631a9c8baf9e08cb65512f9662f7e06998f8862626cda0a9f0a0b389bc9cb6"}
2022-05-31T22:35:35.440+0100	debug	docker@v0.52.0/docker.go:121	Not monitoring container per ExcludedImages	{"kind": "receiver", "name": "docker_stats", "image": "daprio/dapr:1.7.2", "id": "c600f3fe8526577f36436a14d1b749b5ceff90c3769d0f49b6cbd75e4d6e814c"}
2022-05-31T22:35:35.440+0100	debug	docker@v0.52.0/docker.go:121	Not monitoring container per ExcludedImages	{"kind": "receiver", "name": "docker_stats", "image": "openzipkin/zipkin", "id": "1ad0d1ffaab04454bca7a94016054670848063d656e54132fdc10e6c66906964"}
2022-05-31T22:35:35.440+0100	debug	docker@v0.52.0/docker.go:121	Not monitoring container per ExcludedImages	{"kind": "receiver", "name": "docker_stats", "image": "prom/prometheus", "id": "9b5210b2a2f2af7aa7da7821f1cba1d40b0ea1091fe8d11c53669a5700927beb"}
2022-05-31T22:35:35.441+0100	info	builder/receivers_builder.go:72	Receiver started.	{"kind": "receiver", "name": "docker_stats"}
2022-05-31T22:35:35.441+0100	info	healthcheck/handler.go:129	Health Check state change	{"kind": "extension", "name": "health_check", "status": "ready"}
2022-05-31T22:35:35.441+0100	info	service/collector.go:248	Starting otelcol-contrib...	{"Version": "0.52.0", "NumCPU": 10}
2022-05-31T22:35:35.441+0100	info	service/collector.go:144	Everything is ready. Begin running and processing data.
2022-05-31T22:35:37.445+0100	info	exporterhelper/queued_retry.go:215	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "name": "prometheusremotewrite", "error": "invalid tsMap: cannot be empty map", "interval": "53.575355ms"}
2022-05-31T22:35:37.500+0100	info	exporterhelper/queued_retry.go:215	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "name": "prometheusremotewrite", "error": "invalid tsMap: cannot be empty map", "interval": "70.789629ms"}
2022-05-31T22:35:37.572+0100	info	exporterhelper/queued_retry.go:215	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "name": "prometheusremotewrite", "error": "invalid tsMap: cannot be empty map", "interval": "66.440053ms"}
2022-05-31T22:35:37.640+0100	info	exporterhelper/queued_retry.go:215	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "name": "prometheusremotewrite", "error": "invalid tsMap: cannot be empty map", "interval": "132.0604ms"}
2022-05-31T22:35:37.773+0100	info	exporterhelper/queued_retry.go:215	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "name": "prometheusremotewrite", "error": "invalid tsMap: cannot be empty map", "interval": "105.28313ms"}
2022-05-31T22:35:37.880+0100	info	exporterhelper/queued_retry.go:215	Exporting failed. Will retry the request after interval.	{"kind": "exporter", "name": "prometheusremotewrite", "error": "invalid tsMap: cannot be empty map", "interval": "149.057923ms"}
... (the same "invalid tsMap: cannot be empty map" error repeats every 50–300 ms as the exporter keeps retrying)

This config uses the dockerstatsreceiver, but I've seen the same behaviour with any receiver I can get to return empty metrics.

@jjcollinge
Author

@dmitryax, if you can confirm this is a bug, I'd be happy to create a PR to "fix" it. Is there a reason the prometheusremotewrite exporter couldn't just log at debug level that the tsMap is empty and then return nil?
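
Something like the following is what I have in mind (a hypothetical sketch for illustration only, not a patch against the actual exporter code; the helper name and wiring are made up):

package prometheusremotewriteexporter

import (
    "github.com/prometheus/prometheus/prompb"
    "go.uber.org/zap"
)

// exportIfNonEmpty is a hypothetical helper: when there is nothing to send,
// log at debug level and return nil instead of a retryable error.
func exportIfNonEmpty(logger *zap.Logger, tsMap map[string]*prompb.TimeSeries,
    export func(map[string]*prompb.TimeSeries) error) error {
    if len(tsMap) == 0 {
        logger.Debug("tsMap is empty, nothing to export; dropping batch")
        return nil
    }
    return export(tsMap)
}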

gouthamve referenced this issue in gouthamve/opentelemetry-collector-contrib Jul 15, 2022
Fixes #10364

Signed-off-by: Goutham Veeramachaneni <gouthamve@gmail.com>
bogdandrutu added a commit that referenced this issue Aug 22, 2022
Fixes #10364

Signed-off-by: Goutham Veeramachaneni <gouthamve@gmail.com>

Signed-off-by: Goutham Veeramachaneni <gouthamve@gmail.com>
Co-authored-by: Bogdan Drutu <bogdandrutu@gmail.com>