web: remote read: runtime error: invalid memory address or nil pointer dereference #4205

Closed
y2sarakawa opened this Issue May 30, 2018 · 9 comments

y2sarakawa commented May 30, 2018

What happened

Prometheus shows this error message:

caller=stdlib.go:89 component=web caller="http: panic serving b.b.b.b:p" msg="runtime error: invalid memory address or nil pointer dereference"

Error location (storage/remote/codec.go):

StepMs: p.Step,

The parameter p may be passed as nil, in which case dereferencing it panics.

I could not tell whether this is the caller's problem or this function's problem.
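For illustration, a guard along these lines would avoid the dereference. This is only a sketch: the signature, the prompb field names, and the toLabelMatchers helper are assumed from the surrounding codec.go code, and it is not the actual fix.

// Sketch only: types and helper names assumed from storage/remote/codec.go.
package remote

import (
	"github.com/prometheus/prometheus/pkg/labels"
	"github.com/prometheus/prometheus/prompb"
	"github.com/prometheus/prometheus/storage"
)

func ToQuery(from, to int64, matchers []*labels.Matcher, p *storage.SelectParams) (*prompb.Query, error) {
	ms, err := toLabelMatchers(matchers) // existing helper in codec.go
	if err != nil {
		return nil, err
	}
	var hints *prompb.ReadHints
	if p != nil { // p is nil when the caller passes no select params
		hints = &prompb.ReadHints{StepMs: p.Step}
	}
	return &prompb.Query{
		StartTimestampMs: from,
		EndTimestampMs:   to,
		Matchers:         ms,
		Hints:            hints,
	}, nil
}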

Environment

  • Prometheus version
    • Dockerhub prom/prometheus:master
    • version=2.2.1, branch=master, revision=56e6679281ef0492d1a5b1c2bc01cf199180bbb5

tomwilkie commented Jun 4, 2018

@y2sarakawa can you post the full stack trace please? It looks like Select can be called with a nil first param in a few cases; just trying to figure out whether that's valid or not.


y2sarakawa commented Jun 5, 2018

It seems that the panic is recovered in Go's "net/http" package.

To get the stack trace, I modified "stdlib.go" (diff under Environment below).
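(Context for the patch: net/http recovers handler panics itself, so the process keeps serving and the panic only reaches the server's error logger — in Prometheus that logger is go-kit's StdlibAdapter, which apparently drops the stack, hence #4221 later in this thread. A minimal standalone illustration, not Prometheus code:)

// Standalone illustration: net/http recovers a panicking handler,
// logs "http: panic serving ..." via the server's error logger, and
// keeps serving; the process does not crash.
package main

import (
	"log"
	"net/http"
)

func main() {
	http.HandleFunc("/boom", func(w http.ResponseWriter, r *http.Request) {
		var p *struct{ Step int64 }
		_ = p.Step // nil pointer dereference, like codec.go:96
	})
	log.Fatal(http.ListenAndServe("127.0.0.1:8080", nil))
}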

  • Environment
[vagrant@localhost prometheus]$ git branch
* master
[vagrant@localhost prometheus]$ git rev-parse --short HEAD
0e71123
[vagrant@localhost prometheus]$ git diff
diff --git a/vendor/github.com/go-kit/kit/log/stdlib.go b/vendor/github.com/go-kit/kit/log/stdlib.go
index ff96b5d..6150123 100644
--- a/vendor/github.com/go-kit/kit/log/stdlib.go
+++ b/vendor/github.com/go-kit/kit/log/stdlib.go
@@ -4,6 +4,7 @@ import (
        "io"
        "log"
        "regexp"
+       "runtime/debug"
        "strings"
 )
 
@@ -65,6 +66,7 @@ func NewStdlibAdapter(logger Logger, options ...StdlibAdapterOption) io.Writer {
 }
 
 func (a StdlibAdapter) Write(p []byte) (int, error) {
+       defer debug.PrintStack()
        result := subexps(p)
        keyvals := []interface{}{}
        var timestamp string
[vagrant@localhost prometheus]$ promu build prometheus
 >   prometheus
[vagrant@localhost prometheus]$ cat prometheus.yml
scrape_configs:
- job_name: 'prometheus'
  metrics_path: '/metrics'
  static_configs: [ targets: ['127.0.0.1:9090'] ]
remote_read:
  - url: http://10.0.2.2:8086/api/v1/prom/read?db=prometheus
#  - url: file:///dev/null
  • log
[vagrant@localhost prometheus]$ ./prometheus\
>  --config.file=prometheus.yml\
>  --storage.tsdb.path=data/\
>  --web.console.libraries=console_libraries\
>  --web.console.templates=consoles
level=info ts=2018-06-04T16:31:01.241110023Z caller=main.go:222 msg="Starting Prometheus" version="(version=2.2.1, branch=master, revision=0e7112330ef6cb7f89cfa14513fb34cb25dfc728)"
level=info ts=2018-06-04T16:31:01.241169286Z caller=main.go:223 build_context="(go=go1.10.2, user=vagrant@localhost.localdomain, date=20180604-16:29:25)"
level=info ts=2018-06-04T16:31:01.241184508Z caller=main.go:224 host_details="(Linux 3.10.0-693.17.1.el7.x86_64 #1 SMP Thu Jan 25 20:13:58 UTC 2018 x86_64 localhost.localdomain (none))"
level=info ts=2018-06-04T16:31:01.241197696Z caller=main.go:225 fd_limits="(soft=1024, hard=4096)"
level=info ts=2018-06-04T16:31:01.241953914Z caller=main.go:514 msg="Starting TSDB ..."
level=info ts=2018-06-04T16:31:01.242024573Z caller=web.go:426 component=web msg="Start listening for connections" address=0.0.0.0:9090
level=info ts=2018-06-04T16:31:01.250042071Z caller=main.go:524 msg="TSDB started"
level=info ts=2018-06-04T16:31:01.250073589Z caller=main.go:603 msg="Loading configuration file" filename=prometheus.yml
level=info ts=2018-06-04T16:31:01.250606548Z caller=main.go:500 msg="Server is ready to receive web requests."
level=error ts=2018-06-04T16:31:07.166067083Z caller=stdlib.go:91 component=web caller="http: panic serving [::1" msg="]:53166: runtime error: invalid memory address or nil pointer dereference"
goroutine 158 [running]:
runtime/debug.Stack(0xc420876da0, 0x10, 0x8)
	/usr/local/go/src/runtime/debug/stack.go:24 +0xa7
runtime/debug.PrintStack()
	/usr/local/go/src/runtime/debug/stack.go:16 +0x22
github.com/prometheus/prometheus/vendor/github.com/go-kit/kit/log.StdlibAdapter.Write(0x1dd31a0, 0xc4202d1890, 0x1c6d925, 0x2, 0x1c705bb, 0x6, 0x1c6df08, 0x3, 0xc420b0c800, 0x1677, ...)
	/home/vagrant/go/src/github.com/prometheus/prometheus/vendor/github.com/go-kit/kit/log/stdlib.go:94 +0x31e
log.(*Logger).Output(0xc4201e1630, 0x2, 0xc420b0b000, 0x1677, 0x0, 0x0)
	/usr/local/go/src/log/log.go:172 +0x1fd
log.(*Logger).Printf(0xc4201e1630, 0x1c90f2f, 0x1d, 0xc420877098, 0x3, 0x3)
	/usr/local/go/src/log/log.go:179 +0x7e
net/http.(*Server).logf(0xc4204b3d40, 0x1c90f2f, 0x1d, 0xc420877098, 0x3, 0x3)
	/usr/local/go/src/net/http/server.go:2921 +0x6a
net/http.(*conn).serve.func1(0xc4204ca640)
	/usr/local/go/src/net/http/server.go:1727 +0x1d3
panic(0x1997220, 0x2ae6be0)
	/usr/local/go/src/runtime/panic.go:502 +0x229
github.com/prometheus/prometheus/storage/remote.ToQuery(0x163a796c8f8, 0x163cba34cf8, 0xc4201e2440, 0x1, 0x1, 0x0, 0x0, 0x0, 0x0)
	/home/vagrant/go/src/github.com/prometheus/prometheus/storage/remote/codec.go:96 +0x95
github.com/prometheus/prometheus/storage/remote.(*querier).Select(0xc420a9bd70, 0x0, 0xc4201e2440, 0x1, 0x1, 0x1, 0xc4201e2440, 0x1, 0x1)
	/home/vagrant/go/src/github.com/prometheus/prometheus/storage/remote/read.go:47 +0x64
github.com/prometheus/prometheus/storage/remote.externalLabelsQuerier.Select(0x1de2860, 0xc420a9bd70, 0x0, 0x0, 0xc4201e2440, 0x1, 0x1, 0x1dec820, 0xc420350e20, 0x0, ...)
	/home/vagrant/go/src/github.com/prometheus/prometheus/storage/remote/read.go:96 +0xbc
github.com/prometheus/prometheus/storage.(*mergeQuerier).Select(0xc420aa6ca0, 0x0, 0xc4201e2440, 0x1, 0x1, 0x1de2760, 0xc420aa6ca0, 0x0, 0x0)
	/home/vagrant/go/src/github.com/prometheus/prometheus/storage/fanout.go:222 +0x10f
github.com/prometheus/prometheus/web/api/v1.(*API).series(0xc420275180, 0xc420522f00, 0x0, 0x0, 0x0, 0x0)
	/home/vagrant/go/src/github.com/prometheus/prometheus/web/api/v1/api.go:409 +0x7a3
github.com/prometheus/prometheus/web/api/v1.(*API).(github.com/prometheus/prometheus/web/api/v1.series)-fm(0xc420522f00, 0xc420aa6ba0, 0x7faec5788dc8, 0xc4200bab90, 0x1)
	/home/vagrant/go/src/github.com/prometheus/prometheus/web/api/v1/api.go:185 +0x34
github.com/prometheus/prometheus/web/api/v1.(*API).Register.func1.1(0x1de2920, 0xc420aa6ba0, 0xc420522f00)
	/home/vagrant/go/src/github.com/prometheus/prometheus/web/api/v1/api.go:159 +0x55
net/http.HandlerFunc.ServeHTTP(0xc4202f0910, 0x1de2920, 0xc420aa6ba0, 0xc420522f00)
	/usr/local/go/src/net/http/server.go:1947 +0x44
github.com/prometheus/prometheus/util/httputil.CompressionHandler.ServeHTTP(0x1dd82e0, 0xc4202f0910, 0x7faec5788d98, 0xc4200bab90, 0xc420522f00)
	/home/vagrant/go/src/github.com/prometheus/prometheus/util/httputil/compression.go:90 +0x7c
github.com/prometheus/prometheus/util/httputil.(CompressionHandler).ServeHTTP-fm(0x7faec5788d98, 0xc4200bab90, 0xc420522f00)
	/home/vagrant/go/src/github.com/prometheus/prometheus/web/web.go:273 +0x57
github.com/prometheus/prometheus/web.(*Handler).testReady.func1(0x7faec5788d98, 0xc4200bab90, 0xc420522f00)
	/home/vagrant/go/src/github.com/prometheus/prometheus/web/web.go:401 +0x55
net/http.HandlerFunc.ServeHTTP(0xc42000b2e0, 0x7faec5788d98, 0xc4200bab90, 0xc420522f00)
	/usr/local/go/src/net/http/server.go:1947 +0x44
github.com/prometheus/prometheus/vendor/github.com/prometheus/client_golang/prometheus/promhttp.InstrumentHandlerResponseSize.func1(0x1de4b60, 0xc420aa6b80, 0xc420522f00)
	/home/vagrant/go/src/github.com/prometheus/prometheus/vendor/github.com/prometheus/client_golang/prometheus/promhttp/instrument_server.go:196 +0xed
net/http.HandlerFunc.ServeHTTP(0xc4202d0660, 0x1de4b60, 0xc420aa6b80, 0xc420522f00)
	/usr/local/go/src/net/http/server.go:1947 +0x44
github.com/prometheus/prometheus/vendor/github.com/prometheus/client_golang/prometheus/promhttp.InstrumentHandlerDuration.func2(0x1de4b60, 0xc420aa6b80, 0xc420522f00)
	/home/vagrant/go/src/github.com/prometheus/prometheus/vendor/github.com/prometheus/client_golang/prometheus/promhttp/instrument_server.go:76 +0xb5
github.com/prometheus/prometheus/vendor/github.com/prometheus/common/route.(*Router).handle.func1(0x1de4b60, 0xc420aa6b80, 0xc420522d00, 0x0, 0x0, 0x0)
	/home/vagrant/go/src/github.com/prometheus/prometheus/vendor/github.com/prometheus/common/route/route.go:60 +0x222
github.com/prometheus/prometheus/vendor/github.com/julienschmidt/httprouter.(*Router).ServeHTTP(0xc42048ddc0, 0x1de4b60, 0xc420aa6b80, 0xc420522d00)
	/home/vagrant/go/src/github.com/prometheus/prometheus/vendor/github.com/julienschmidt/httprouter/router.go:299 +0x6d1
github.com/prometheus/prometheus/vendor/github.com/prometheus/common/route.(*Router).ServeHTTP(0xc42000a880, 0x1de4b60, 0xc420aa6b80, 0xc420522d00)
	/home/vagrant/go/src/github.com/prometheus/prometheus/vendor/github.com/prometheus/common/route/route.go:98 +0x4c
net/http.StripPrefix.func1(0x1de4b60, 0xc420aa6b80, 0xc420522800)
	/usr/local/go/src/net/http/server.go:1986 +0x19a
net/http.HandlerFunc.ServeHTTP(0xc4202d1830, 0x1de4b60, 0xc420aa6b80, 0xc420522800)
	/usr/local/go/src/net/http/server.go:1947 +0x44
net/http.(*ServeMux).ServeHTTP(0xc420373920, 0x1de4b60, 0xc420aa6b80, 0xc420522800)
	/usr/local/go/src/net/http/server.go:2337 +0x130
github.com/prometheus/prometheus/web.secureHeadersMiddleware.func1(0x1de4b60, 0xc420aa6b80, 0xc420522800)
	/home/vagrant/go/src/github.com/prometheus/prometheus/web/web.go:81 +0x195
net/http.HandlerFunc.ServeHTTP(0xc42052a440, 0x1de4b60, 0xc420aa6b80, 0xc420522800)
	/usr/local/go/src/net/http/server.go:1947 +0x44
github.com/prometheus/prometheus/vendor/github.com/opentracing-contrib/go-stdlib/nethttp.Middleware.func2(0x1dec460, 0xc4204420e0, 0xc420522700)
	/home/vagrant/go/src/github.com/prometheus/prometheus/vendor/github.com/opentracing-contrib/go-stdlib/nethttp/server.go:74 +0x3ab
net/http.HandlerFunc.ServeHTTP(0xc4202d18c0, 0x1dec460, 0xc4204420e0, 0xc420522700)
	/usr/local/go/src/net/http/server.go:1947 +0x44
net/http.serverHandler.ServeHTTP(0xc4204b3d40, 0x1dec460, 0xc4204420e0, 0xc420522700)
	/usr/local/go/src/net/http/server.go:2694 +0xbc
net/http.(*conn).serve(0xc4204ca640, 0x1dee460, 0xc4200d2900)
	/usr/local/go/src/net/http/server.go:1830 +0x651
created by net/http.(*Server).Serve
	/usr/local/go/src/net/http/server.go:2795 +0x27b
^Clevel=warn ts=2018-06-04T16:31:21.655635112Z caller=main.go:377 msg="Received SIGTERM, exiting gracefully..."
level=info ts=2018-06-04T16:31:21.655719665Z caller=main.go:402 msg="Stopping scrape discovery manager..."
level=info ts=2018-06-04T16:31:21.655738737Z caller=main.go:416 msg="Stopping notify discovery manager..."
level=info ts=2018-06-04T16:31:21.655745947Z caller=main.go:438 msg="Stopping scrape manager..."
level=info ts=2018-06-04T16:31:21.655793289Z caller=main.go:412 msg="Notify discovery manager stopped"
level=info ts=2018-06-04T16:31:21.655806744Z caller=main.go:398 msg="Scrape discovery manager stopped"
level=info ts=2018-06-04T16:31:21.655852967Z caller=main.go:432 msg="Scrape manager stopped"
level=info ts=2018-06-04T16:31:21.656120183Z caller=manager.go:464 component="rule manager" msg="Stopping rule manager..."
level=info ts=2018-06-04T16:31:21.656140811Z caller=manager.go:470 component="rule manager" msg="Rule manager stopped"
level=info ts=2018-06-04T16:31:21.656151033Z caller=notifier.go:512 component=notifier msg="Stopping notification manager..."
level=info ts=2018-06-04T16:31:21.656160976Z caller=main.go:588 msg="Notifier manager stopped"
level=info ts=2018-06-04T16:31:21.656230598Z caller=main.go:599 msg="See you next time!"
[vagrant@localhost prometheus]$ 
  • Remote read
[vagrant@localhost prometheus]$ curl -i -g "http://localhost:9090/api/v1/series?match[]=up&start=$(expr $(date +%s) - 604800)&end=$(date +%s)"; echo
curl: (52) Empty reply from server

[vagrant@localhost prometheus]$ 

tomwilkie commented Jun 5, 2018

Thanks @y2sarakawa! Looks like the panic happens while serving /series.


simonpasquier commented Jun 5, 2018

FYI I've submitted #4221 to get the stack traces in the logs.
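A generic way to do that is a recovery middleware along these lines (a sketch with hypothetical names — withStackLogging is mine, and this is not necessarily how #4221 implements it):

// Recovery-middleware sketch (hypothetical; not necessarily how #4221
// does it): recover before net/http does, so the full stack trace is
// emitted through the structured logger instead of being mangled by
// the stdlib log adapter.
package main

import (
	"net/http"
	"os"
	"runtime/debug"

	"github.com/go-kit/kit/log"
	"github.com/go-kit/kit/log/level"
)

func withStackLogging(logger log.Logger, next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		defer func() {
			if rec := recover(); rec != nil {
				level.Error(logger).Log(
					"msg", "panic while serving request",
					"err", rec,
					"stack", string(debug.Stack()),
				)
				http.Error(w, http.StatusText(http.StatusInternalServerError),
					http.StatusInternalServerError)
			}
		}()
		next.ServeHTTP(w, r)
	})
}

func main() {
	logger := log.NewLogfmtLogger(os.Stderr)
	panicky := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		panic("boom")
	})
	if err := http.ListenAndServe("127.0.0.1:8080", withStackLogging(logger, panicky)); err != nil {
		level.Error(logger).Log("err", err)
	}
}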


fabxc commented Jun 12, 2018

Unfortunately this made it into 2.3, see #4240. Is there a solution for this yet? It would be great to add bug labels in the future so issues like this don't slip through as easily.


y2sarakawa commented Jun 14, 2018

Before #4185 (up to and including v2.2.1), Prometheus did not dedupe properly, so Grafana 5 drew intermittent graphs when using irate().
After #4122 (from v2.3.0 on), the /series endpoint does not respond correctly when remote read is configured (Grafana 5 dashboard query variables use it, for example), so dashboards cannot use query variables properly.
In short, Grafana 5 users cannot get a decent graph with remote_read on any recent version.

I hope this problem will be fixed in v2.3.1.


tomwilkie commented Jun 18, 2018

Fixed in #4282

tomwilkie closed this Jun 18, 2018


y2sarakawa commented Jun 19, 2018

This issue has been fixed in v2.3.1. Thanks a lot!

lock bot commented Mar 22, 2019

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

lock bot locked and limited conversation to collaborators Mar 22, 2019
