
runtime error: invalid memory address or nil pointer dereference PromQL API #192

Closed
dtoddonx opened this issue Aug 13, 2020 · 4 comments
Labels
Bug Something isn't working

Comments

@dtoddonx

When using the adapter directly as a data source in Grafana and running a PromQL query such as:

sum (container_memory_working_set_bytes{project="production",region="us-central1",kubernetes_io_hostname=~"(.+)-np1-(.+)"}) / sum (machine_memory_bytes{project="production",region="us-central1",kubernetes_io_hostname=~"(.+)-np1-(.+)"}) * 100

The following error is generated in the adapter:

{"caller":"series_set.go:82","err":"query returned wrong number of labels: 4, 1","level":"error","ts":"2020-08-13T19:45:42.364Z"} {"caller":"series_set.go:82","err":"query returned wrong number of labels: 4, 1","level":"error","ts":"2020-08-13T19:45:42.366Z"} {"caller":"panic.go:967","err":"runtime error: invalid memory address or nil pointer dereference","level":"error","msg":"runtime panic in parser","stacktrace":"goroutine 6572 [running]:\ngithub.com/timescale/timescale-prometheus/pkg/promql.(*evaluator).recover(0xc004346a80, 0xc0544f5060)\n\t/go/timescale-prometheus/pkg/promql/engine.go:860 +0xd8\npanic(0xcf6720, 0x1590900)\n\t/usr/local/go/src/runtime/panic.go:967 +0x166\ngithub.com/timescale/timescale-prometheus/pkg/promql.(*evaluator).eval(0xc004346a80, 0xf53ce0, 0xc005a63c00, 0xc0544f5028, 0x40cf28)\n\t/go/timescale-prometheus/pkg/promql/engine.go:1376 +0x48d\ngithub.com/timescale/timescale-prometheus/pkg/promql.(*evaluator).Eval(0xc004346a80, 0xf53ce0, 0xc005a63c00, 0x0, 0x0, 0x0, 0x0)\n\t/go/timescale-prometheus/pkg/promql/engine.go:871 +0x88\ngithub.com/timescale/timescale-prometheus/pkg/promql.(*Engine).execEvalStmt(0xc00002e420, 0xf53820, 0xc053d52bd0, 0xc01ccdf0e0, 0xc0549a3c70, 0x0, 0x0, 0x0, 0x0, 0x0, ...)\n\t/go/timescale-prometheus/pkg/promql/engine.go:621 +0x1081\ngithub.com/timescale/timescale-prometheus/pkg/promql.(*Engine).exec(0xc00002e420, 0xf53820, 0xc053d52bd0, 0xc01ccdf0e0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)\n\t/go/timescale-prometheus/pkg/promql/engine.go:517 +0x5b7\ngithub.com/timescale/timescale-prometheus/pkg/promql.(*query).Exec(0xc01ccdf0e0, 0xf53760, 0xc0221e6a40, 0xc0540c2c80)\n\t/go/timescale-prometheus/pkg/promql/engine.go:214 +0x94\ngithub.com/timescale/timescale-prometheus/pkg/api.QueryRange.func1(0xf4c4a0, 0xc005a63b90, 0xc023ad3e00)\n\t/go/timescale-prometheus/pkg/api/query_range.go:85 +0xb9e\nnet/http.HandlerFunc.ServeHTTP(0xc000426540, 0xf4c4a0, 0xc005a63b90, 0xc023ad3e00)\n\t/usr/local/go/src/net/http/server.go:2012 +0x44\ngithub.com/NYTimes/gziphandler.GzipHandlerWithOpts.func1.1(0xf4c0a0, 0xc02af71500, 0xc023ad3e00)\n\t/go/pkg/mod/github.com/!n!y!times/gziphandler@v1.1.1/gzip.go:336 +0x211\nnet/http.HandlerFunc.ServeHTTP(0xc0002e2570, 0xf4c0a0, 0xc02af71500, 0xc023ad3e00)\n\t/usr/local/go/src/net/http/server.go:2012 +0x44\nmain.timeHandler.func1(0xf4c0a0, 0xc02af71500, 0xc023ad3e00)\n\t/go/timescale-prometheus/cmd/timescale-prometheus/main.go:386 +0xc5\ngithub.com/prometheus/common/route.(*Router).handle.func1(0xf4c0a0, 0xc02af71500, 0xc023ad3d00, 0x0, 0x0, 0x0)\n\t/go/pkg/mod/github.com/prometheus/common@v0.9.1/route/route.go:83 +0x27f\ngithub.com/julienschmidt/httprouter.(*Router).ServeHTTP(0xc000099aa0, 0xf4c0a0, 0xc02af71500, 0xc023ad3d00)\n\t/go/pkg/mod/github.com/julienschmidt/httprouter@v1.3.0/router.go:387 +0xc37\ngithub.com/prometheus/common/route.(*Router).ServeHTTP(0xc00000ce80, 0xf4c0a0, 0xc02af71500, 0xc023ad3d00)\n\t/go/pkg/mod/github.com/prometheus/common@v0.9.1/route/route.go:121 +0x4c\nnet/http.(*ServeMux).ServeHTTP(0xc000286a00, 0xf4c0a0, 0xc02af71500, 0xc023ad3d00)\n\t/usr/local/go/src/net/http/server.go:2387 +0x1a5\nnet/http.serverHandler.ServeHTTP(0xc0002880e0, 0xf4c0a0, 0xc02af71500, 0xc023ad3d00)\n\t/usr/local/go/src/net/http/server.go:2807 +0xa3\nnet/http.(*conn).serve(0xc0140b5b80, 0xf53760, 0xc0131289c0)\n\t/usr/local/go/src/net/http/server.go:1895 +0x86c\ncreated by net/http.(*Server).Serve\n\t/usr/local/go/src/net/http/server.go:2933 +0x35c\n","ts":"2020-08-13T19:45:42.366Z"} 
{"caller":"query_range.go:87","endpoint":"query_range","level":"error","msg":"unexpected error: runtime error: invalid memory address or nil pointer dereference","ts":"2020-08-13T19:45:42.366Z"}

The 'wrong number of labels' error is persistent: it appears whether queries go through Prometheus via remote_read or against the adapter directly. The runtime error, however, only occurs when Grafana queries the adapter directly.
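
For anyone trying to reproduce this outside Grafana, here is a minimal Go sketch that sends the same PromQL expression to the adapter's query_range HTTP endpoint (the handler that shows up as api.QueryRange in the stack trace above), using the standard Prometheus-style query/start/end/step parameters. The listen address and port (localhost:9201), the one-hour time range, and the 60-second step are illustrative assumptions, not values taken from this report.

package main

import (
	"fmt"
	"io"
	"net/http"
	"net/url"
	"time"
)

func main() {
	// PromQL expression from the report above.
	expr := `sum (container_memory_working_set_bytes{project="production",region="us-central1",kubernetes_io_hostname=~"(.+)-np1-(.+)"}) / sum (machine_memory_bytes{project="production",region="us-central1",kubernetes_io_hostname=~"(.+)-np1-(.+)"}) * 100`

	// Assumed adapter address; adjust to your deployment.
	endpoint := "http://localhost:9201/api/v1/query_range"

	now := time.Now()
	params := url.Values{}
	params.Set("query", expr)
	params.Set("start", fmt.Sprint(now.Add(-1*time.Hour).Unix()))
	params.Set("end", fmt.Sprint(now.Unix()))
	params.Set("step", "60")

	resp, err := http.Get(endpoint + "?" + params.Encode())
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// On affected versions, a request like this made the adapter log the
	// panic shown above and return an error response instead of a result.
	body, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status)
	fmt.Println(string(body))
}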

@JLockerman
Contributor

likely fixed by PR #184

@JLockerman added the Bug (Something isn't working) and priority/sev1 labels on Aug 17, 2020
@cevian
Contributor

cevian commented Aug 19, 2020

@dtoddonx This should be resolved in our new release 0.1.0-beta.2. Can you please try it and let us know if this solves the problem?

@dtoddonx
Author

dtoddonx commented Sep 2, 2020

Sorry for the slow turnaround on this. Yes, I no longer get the crash when running the above query. Thanks!

@Harkishen-Singh
Member

That sounds like a perfect fix. Closing this issue in that case. Feel free to reopen if it occurs again in the future.
