500 Server Error when selecting series #5133

Closed · yeya24 opened this Issue Jan 24, 2019 · 8 comments

yeya24 commented Jan 24, 2019

Bug Report

What did you do?

In the web UI, when I query some of Prometheus's own metrics, such as prometheus_build_info, I get a 500 error.
But when I query the metrics stored in InfluxDB (through remote read), I get the correct results.
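
For reference, a minimal sketch of running the same query outside the web UI, against the Prometheus HTTP query API (this assumes Prometheus is listening on localhost:9090, as in the config below; the console uses the same /api/v1/query endpoint):

package main

import (
	"fmt"
	"io/ioutil"
	"net/http"
	"net/url"
)

func main() {
	// The web UI console issues the same instant query through this HTTP API endpoint.
	params := url.Values{}
	params.Set("query", "prometheus_build_info")

	resp, err := http.Get("http://localhost:9090/api/v1/query?" + params.Encode())
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, _ := ioutil.ReadAll(resp.Body)
	// On this setup the response should surface the same
	// "server returned HTTP status 500 Internal Server Error" seen in the logs below.
	fmt.Println(resp.Status)
	fmt.Println(string(body))
}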

What did you expect to see?

I expected to see the correct metrics in the console.

What did you see instead? Under which circumstances?
(screenshot 20190124152030: the web UI console showing the 500 error)

Environment

  • System information:

Linux 3.10.0-862.14.4.el7.x86_64 x86_64

  • Prometheus version:
    I tested Prometheus 2.4.0 and 2.5.0; both hit the same problem.
  • Prometheus configuration file:
# my global config
global:
  scrape_interval:     15s # Set the scrape interval to every 15 seconds. Default is every 1 minute.
  evaluation_interval: 15s # Evaluate rules every 15 seconds. The default is every 1 minute.
  # scrape_timeout is set to the global default (10s).

# Alertmanager configuration
alerting:
  alertmanagers:
  - static_configs:
    - targets:
      # - alertmanager:9093

# Load rules once and periodically evaluate them according to the global 'evaluation_interval'.
rule_files:
  # - "first_rules.yml"
  # - "second_rules.yml"

# A scrape configuration containing exactly one endpoint to scrape:
# Here it's Prometheus itself.
scrape_configs:
  # The job name is added as a label `job=<job_name>` to any timeseries scraped from this config.
  - job_name: 'prometheus'

    # metrics_path defaults to '/metrics'
    # scheme defaults to 'http'.

    static_configs:
    - targets: ['localhost:9090']
remote_read:
  - url: "http://192.168.200.31:8086/api/v1/prom/read?db=stress"
  • Logs:
level=error ts=2019-01-24T06:37:01.016535242Z caller=engine.go:498 component="query engine" msg="error selecting series set" err="server returned HTTP status 500 Internal Server Error"
level=error ts=2019-01-24T06:37:02.308943035Z caller=engine.go:498 component="query engine" msg="error selecting series set" err="server returned HTTP status 500 Internal Server Error"
level=error ts=2019-01-24T06:37:42.932953428Z caller=engine.go:498 component="query engine" msg="error selecting series set" err="server returned HTTP status 500 Internal Server Error"
level=error ts=2019-01-24T06:37:43.200055227Z caller=engine.go:498 component="query engine" msg="error selecting series set" err="server returned HTTP status 500 Internal Server Error"
level=error ts=2019-01-24T06:37:43.436832374Z caller=engine.go:498 component="query engine" msg="error selecting series set" err="server returned HTTP status 500 Internal Server Error"
level=error ts=2019-01-24T06:37:43.67769608Z caller=engine.go:498 component="query engine" msg="error selecting series set" err="server returned HTTP status 500 Internal Server Error"
level=error ts=2019-01-24T06:37:43.908430309Z caller=engine.go:498 component="query engine" msg="error selecting series set" err="server returned HTTP status 500 Internal Server Error"
level=error ts=2019-01-24T06:39:00.73227196Z caller=engine.go:498 component="query engine" msg="error selecting series set" err="server returned HTTP status 500 Internal Server Error"

cstyan (Contributor) commented Jan 24, 2019

@yeya24 That error is from the remote read request from Prometheus to InfluxDB.
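
One way to confirm that is to send a remote read request straight to the InfluxDB endpoint and see whether it fails on its own, bypassing Prometheus. A rough sketch: the endpoint URL is the one from the remote_read block in the config above, while the matcher and time range are arbitrary choices for illustration.

package main

import (
	"bytes"
	"fmt"
	"io/ioutil"
	"net/http"

	"github.com/gogo/protobuf/proto"
	"github.com/golang/snappy"
	"github.com/prometheus/prometheus/prompb"
)

func main() {
	// Build a minimal remote read request, i.e. the same protocol Prometheus
	// speaks to the remote_read endpoint when it selects a series set.
	readReq := &prompb.ReadRequest{
		Queries: []*prompb.Query{{
			StartTimestampMs: 0,
			EndTimestampMs:   1548316800000, // arbitrary end time, for illustration
			Matchers: []*prompb.LabelMatcher{{
				Type:  prompb.LabelMatcher_EQ,
				Name:  "__name__",
				Value: "prometheus_build_info", // arbitrary metric name, for illustration
			}},
		}},
	}

	data, err := proto.Marshal(readReq)
	if err != nil {
		panic(err)
	}
	compressed := snappy.Encode(nil, data)

	// Endpoint taken from the remote_read block in the config above.
	httpReq, err := http.NewRequest("POST",
		"http://192.168.200.31:8086/api/v1/prom/read?db=stress",
		bytes.NewReader(compressed))
	if err != nil {
		panic(err)
	}
	httpReq.Header.Set("Content-Encoding", "snappy")
	httpReq.Header.Set("Content-Type", "application/x-protobuf")
	httpReq.Header.Set("X-Prometheus-Remote-Read-Version", "0.1.0")

	resp, err := http.DefaultClient.Do(httpReq)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	body, _ := ioutil.ReadAll(resp.Body)

	// A 500 here would point at the InfluxDB side rather than at Prometheus.
	fmt.Println(resp.Status)
	fmt.Println(string(body))
}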

aixeshunter (Contributor) commented Jan 25, 2019

Hi, could you check the InfluxDB logs?

yeya24 (Author) commented Jan 25, 2019

It looks like the error comes from InfluxDB, which panics with a nil pointer dereference. Here are the logs:

[httpd] 172.17.0.1 - - [25/Jan/2019:02:12:41 +0000] "POST /api/v1/prom/read?db=stress HTTP/1.1" 200 0 "-" "Go-http-client/1.1" b1e70417-2046-11e9-adda-0242ac110002 210 [panic:runtime error: invalid memory address or nil pointer dereference] goroutine 115603 [running]:
runtime/debug.Stack(0xc007d17940, 0xc000136400, 0xbf0aba0663ddd0b5)
	/usr/local/go/src/runtime/debug/stack.go:24 +0xa7
github.com/influxdata/influxdb/services/httpd.(*Handler).recovery.func1.1(0xc007d17940, 0xc000136400, 0xbf0aba0663ddd0b5, 0x3bc17c2d0e06, 0x1d25320, 0xc000250e00, 0x135fe00, 0xc00013a2a0)
	/go/src/github.com/influxdata/influxdb/services/httpd/handler.go:1690 +0xcd
panic(0x1065c20, 0x1cfcbb0)
	/usr/local/go/src/runtime/panic.go:513 +0x1b9
github.com/influxdata/influxdb/services/httpd.(*Handler).servePromRead(0xc000250e00, 0x7fa68e304898, 0xc007d17980, 0xc000136400, 0x0, 0x0)
	/go/src/github.com/influxdata/influxdb/services/httpd/handler.go:1065 +0x270
github.com/influxdata/influxdb/services/httpd.(*Handler).servePromRead-fm(0x7fa68e304898, 0xc007d17980, 0xc000136400, 0x0, 0x0)
	/go/src/github.com/influxdata/influxdb/services/httpd/handler.go:183 +0x5c
github.com/influxdata/influxdb/services/httpd.authenticate.func1(0x7fa68e304898, 0xc007d17980, 0xc000136400)
	/go/src/github.com/influxdata/influxdb/services/httpd/handler.go:1487 +0x788
net/http.HandlerFunc.ServeHTTP(0xc00001ee20, 0x7fa68e304898, 0xc007d17980, 0xc000136400)
	/usr/local/go/src/net/http/server.go:1964 +0x44
github.com/influxdata/influxdb/services/httpd.(*Handler).responseWriter.func1(0x135bb40, 0xc007d17960, 0xc000136400)
	/go/src/github.com/influxdata/influxdb/services/httpd/handler.go:1667 +0xab
net/http.HandlerFunc.ServeHTTP(0xc00001ee40, 0x135bb40, 0xc007d17960, 0xc000136400)
	/usr/local/go/src/net/http/server.go:1964 +0x44
github.com/influxdata/influxdb/services/httpd.gzipFilter.func1(0x135bb40, 0xc007d17960, 0xc000136400)
	/go/src/github.com/influxdata/influxdb/services/httpd/gzip.go:23 +0x29d
net/http.HandlerFunc.ServeHTTP(0xc00001ee60, 0x135bb40, 0xc007d17960, 0xc000136400)
	/usr/local/go/src/net/http/server.go:1964 +0x44
github.com/influxdata/influxdb/services/httpd.cors.func1(0x135bb40, 0xc007d17960, 0xc000136400)
	/go/src/github.com/influxdata/influxdb/services/httpd/handler.go:1609 +0xf3
net/http.HandlerFunc.ServeHTTP(0xc00001ee80, 0x135bb40, 0xc007d17960, 0xc000136400)
	/usr/local/go/src/net/http/server.go:1964 +0x44
github.com/influxdata/influxdb/services/httpd.requestID.func1(0x135bb40, 0xc007d17960, 0xc000136400)
	/go/src/github.com/influxdata/influxdb/services/httpd/handler.go:1640 +0x179
net/http.HandlerFunc.ServeHTTP(0xc00001eea0, 0x135bb40, 0xc007d17960, 0xc000136400)
	/usr/local/go/src/net/http/server.go:1964 +0x44
github.com/influxdata/influxdb/services/httpd.(*Handler).logging.func1(0x135bb40, 0xc007d17940, 0xc000136400)
	/go/src/github.com/influxdata/influxdb/services/httpd/handler.go:1648 +0xdc
net/http.HandlerFunc.ServeHTTP(0xc00001eec0, 0x135bb40, 0xc007d17940, 0xc000136400)
	/usr/local/go/src/net/http/server.go:1964 +0x44
github.com/influxdata/influxdb/services/httpd.(*Handler).recovery.func1(0x135fe00, 0xc00013a2a0, 0xc000136400)
	/go/src/github.com/influxdata/influxdb/services/httpd/handler.go:1704 +0x147
net/http.HandlerFunc.ServeHTTP(0xc00001eee0, 0x135fe00, 0xc00013a2a0, 0xc000136400)
	/usr/local/go/src/net/http/server.go:1964 +0x44
github.com/influxdata/influxdb/vendor/github.com/bmizerany/pat.(*PatternServeMux).ServeHTTP(0xc00001e8e0, 0x135fe00, 0xc00013a2a0, 0xc000136400)
	/go/src/github.com/influxdata/influxdb/vendor/github.com/bmizerany/pat/mux.go:117 +0x155
github.com/influxdata/influxdb/services/httpd.(*Handler).ServeHTTP(0xc000250e00, 0x135fe00, 0xc00013a2a0, 0xc000136400)
	/go/src/github.com/influxdata/influxdb/services/httpd/handler.go:381 +0x247
net/http.serverHandler.ServeHTTP(0xc00033a000, 0x135fe00, 0xc00013a2a0, 0xc000136400)
	/usr/local/go/src/net/http/server.go:2741 +0xab
net/http.(*conn).serve(0xc000830000, 0x13616c0, 0xc00002cb80)
	/usr/local/go/src/net/http/server.go:1847 +0x646
created by net/http.(*Server).Serve
	/usr/local/go/src/net/http/server.go:2851 +0x2f5

But when I switch to Prometheus 2.6.0, it works well. So was there a bug fix in Prometheus 2.6?

aixeshunter (Contributor) commented Jan 25, 2019

@yeya24 What is the InfluxDB version?

yeya24 (Author) commented Jan 25, 2019

@yeya24 What is the InfluxDB version?

InfluxDB v1.7.2

codesome (Member) commented Jan 25, 2019

@yeya24 I see that your issue is fixed with the latest Prometheus version, so I am closing this issue for now. If you face any more problems, feel free to reopen it.

codesome closed this Jan 25, 2019

yeya24 (Author) commented Jan 25, 2019

@yeya24 I see that your issue is fixed with the latest Prometheus version, so I am closing this issue for now. If you face any more problems, feel free to reopen it.

@codesome Sure, it's OK.

cstyan (Contributor) commented Jan 25, 2019

#4832: this looks to be the only remote write change in 2.6.
