runtime error: index out of range #2249

Closed
sjoerdmulder opened this Issue Dec 5, 2016 · 7 comments

sjoerdmulder commented Dec 5, 2016

What did you do?
Nothing; the service panicked abruptly.
What did you expect to see?
That it would continue working.
What did you see instead? Under which circumstances?
Last night at 22:48 our Prometheus service stopped ingesting data and serving requests.
Environment

  • System information:
    Docker v1.12.1
    Linux 3.13.0-96-generic x86_64
  • Prometheus version:
    v1.3.1
  • Logs:
time="2016-12-04T21:48:01Z" level=warning msg="Storage has entered rushed mode." chunksToPersist=149515 maxChunksToPersist=524288 maxMemoryChunks=500000 memoryChunks=544659 source="storage.go:1607" urgencyScore=0.8931800000000001
time="2016-12-04T21:48:01Z" level=info msg="Storage has left rushed mode." chunksToPersist=149466 maxChunksToPersist=524288 maxMemoryChunks=500000 memoryChunks=469612 source="storage.go:1594" urgencyScore=0.2850837707519531
time="2016-12-04T21:48:14Z" level=warning msg="Storage has entered rushed mode." chunksToPersist=149372 maxChunksToPersist=524288 maxMemoryChunks=500000 memoryChunks=541605 source="storage.go:1607" urgencyScore=0.8321000000000001
time="2016-12-04T21:48:15Z" level=error msg="Storage needs throttling. Scrapes and rule evaluations will be skipped." chunksToPersist=149017 maxChunksToPersist=524288 maxToleratedMemChunks=550000 memoryChunks=584381 source="storage.go:908"
time="2016-12-04T21:48:15Z" level=info msg="Storage has left rushed mode." chunksToPersist=148975 maxChunksToPersist=524288 maxMemoryChunks=500000 memoryChunks=529466 source="storage.go:1594" urgencyScore=0.5893199999999998
time="2016-12-04T21:48:15Z" level=warning msg="Storage has entered rushed mode." chunksToPersist=148957 maxChunksToPersist=524288 maxMemoryChunks=500000 memoryChunks=540245 source="storage.go:1607" urgencyScore=0.8048999999999995
time="2016-12-04T21:48:19Z" level=error msg="
http: panic serving 172.16.5.1:58461: runtime error: index out of range
goroutine 49313231 [running]:
net/http.(*conn).serve.func1(0xc4852efb00)
	/usr/local/go/src/net/http/server.go:1491 +0x12a
panic(0x1785e00, 0xc4200140c0)
	/usr/local/go/src/runtime/panic.go:458 +0x243
github.com/prometheus/prometheus/storage/local.(*memorySeries).preloadChunksForRange(0xc4c84b6af0, 0xe6f63760970a017f, 0x158314b8270, 0x158cbcede50, 0xc420266140, 0x2469c40, 0xc48f9a46c0, 0x0, 0x0)
	/go/src/github.com/prometheus/prometheus/storage/local/series.go:468 +0x73d
github.com/prometheus/prometheus/storage/local.(*MemorySeriesStorage).preloadChunksForRange(0xc420266140, 0xe6f63760970a017f, 0xc4c84b6af0, 0x158314b8270, 0x158cbcede50, 0x0, 0x0)
	/go/src/github.com/prometheus/prometheus/storage/local/storage.go:992 +0xd1
github.com/prometheus/prometheus/storage/local.(*MemorySeriesStorage).QueryRange(0xc420266140, 0x7fc433d230d0, 0xc4903f0c60, 0x158314b8270, 0x158cbcede50, 0xc4a1e71fa0, 0x2, 0x2, 0x6, 0x6, ...)
	/go/src/github.com/prometheus/prometheus/storage/local/storage.go:506 +0x154
github.com/prometheus/prometheus/storage/local.memorySeriesStorageQuerier.QueryRange(0xc420266140, 0x7fc433d230d0, 0xc4903f0c60, 0x158314b8270, 0x158cbcede50, 0xc4a1e71fa0, 0x2, 0x2, 0x50, 0x48, ...)
	<autogenerated>:66 +0xa0
github.com/prometheus/prometheus/promql.(*Engine).populateIterators.func1(0x7fc435d3eb90, 0xc443e31f90, 0x4125ff)
	/go/src/github.com/prometheus/prometheus/promql/engine.go:494 +0x41f
github.com/prometheus/prometheus/promql.inspector.Visit(0xc43e3a46c0, 0x7fc435d3eb90, 0xc443e31f90, 0xeee5679e69, 0x0)
	/go/src/github.com/prometheus/prometheus/promql/ast.go:306 +0x3a
github.com/prometheus/prometheus/promql.Walk(0x245b040, 0xc43e3a46c0, 0x7fc435d3eb90, 0xc443e31f90)
	/go/src/github.com/prometheus/prometheus/promql/ast.go:255 +0x7b
github.com/prometheus/prometheus/promql.Walk(0x245b040, 0xc43e3a46c0, 0x7fc435d3ecf0, 0xc4433bc050)
	/go/src/github.com/prometheus/prometheus/promql/ast.go:278 +0x51d
github.com/prometheus/prometheus/promql.Walk(0x245b040, 0xc43e3a46c0, 0x245afc0, 0xc4a4406140)
	/go/src/github.com/prometheus/prometheus/promql/ast.go:275 +0x234
github.com/prometheus/prometheus/promql.Walk(0x245b040, 0xc43e3a46c0, 0x7fc435d3ec98, 0xc4a4406120)
	/go/src/github.com/prometheus/prometheus/promql/ast.go:285 +0x74a
github.com/prometheus/prometheus/promql.Inspect(0x7fc435d3ec98, 0xc4a4406120, 0xc43e3a46c0)
	/go/src/github.com/prometheus/prometheus/promql/ast.go:316 +0x4b
github.com/prometheus/prometheus/promql.(*Engine).populateIterators(0xc420461a40, 0x7fc433d230d0, 0xc4903f0c60, 0x246e080, 0xc420266140, 0xc490b244e0, 0x0, 0x2)
	/go/src/github.com/prometheus/prometheus/promql/engine.go:511 +0x123
github.com/prometheus/prometheus/promql.(*Engine).execEvalStmt(0xc420461a40, 0x7fc433d230d0, 0xc4903f0c60, 0xc43e3a4240, 0xc490b244e0, 0x0, 0x0, 0x0, 0x0)
	/go/src/github.com/prometheus/prometheus/promql/engine.go:366 +0x1b6
github.com/prometheus/prometheus/promql.(*Engine).exec(0xc420461a40, 0x7fc433d230d0, 0xc4903f0c60, 0xc43e3a4240, 0x0, 0x0, 0x0, 0x0)
	/go/src/github.com/prometheus/prometheus/promql/engine.go:349 +0x3a9
github.com/prometheus/prometheus/promql.(*query).Exec(0xc43e3a4240, 0x7fc435b7cb38, 0xc43e3be740, 0x15831501650)
	/go/src/github.com/prometheus/prometheus/promql/engine.go:196 +0x52
github.com/prometheus/prometheus/web/api/v1.(*API).queryRange(0xc4203f3b30, 0xc4a3f7f680, 0x8000104, 0x0, 0xffffffffffffffff)
	/go/src/github.com/prometheus/prometheus/web/api/v1/api.go:212 +0x601
github.com/prometheus/prometheus/web/api/v1.(*API).(github.com/prometheus/prometheus/web/api/v1.queryRange)-fm(0xc4a3f7f680, 0xc4a44060c0, 0x4, 0x0)
	/go/src/github.com/prometheus/prometheus/web/api/v1/api.go:126 +0x34
github.com/prometheus/prometheus/web/api/v1.(*API).Register.func1.1(0x24620c0, 0xc4a44060c0, 0xc4a3f7f680)
	/go/src/github.com/prometheus/prometheus/web/api/v1/api.go:110 +0x55
net/http.HandlerFunc.ServeHTTP(0xc42033ddd0, 0x24620c0, 0xc4a44060c0, 0xc4a3f7f680)
	/usr/local/go/src/net/http/server.go:1726 +0x44
github.com/prometheus/prometheus/util/httputil.CompressionHandler.ServeHTTP(0x245c440, 0xc42033ddd0, 0x2462480, 0xc4758b3720, 0xc4a3f7f680)
	/go/src/github.com/prometheus/prometheus/util/httputil/compression.go:90 +0x7c
github.com/prometheus/prometheus/util/httputil.(*CompressionHandler).ServeHTTP(0xc42033dde0, 0x2462480, 0xc4758b3720, 0xc4a3f7f680)
	<autogenerated>:5 +0x79
net/http.(Handler).ServeHTTP-fm(0x2462480, 0xc4758b3720, 0xc4a3f7f680)
	/go/src/github.com/prometheus/prometheus/web/web.go:173 +0x4d
github.com/prometheus/prometheus/vendor/github.com/prometheus/client_golang/prometheus.InstrumentHandlerFuncWithOpts.func1(0x2468240, 0xc42160a270, 0xc4a3f7f680)
	/go/src/github.com/prometheus/prometheus/vendor/github.com/prometheus/client_golang/prometheus/http.go:287 +0x2ab
github.com/prometheus/prometheus/vendor/github.com/prometheus/common/route.(*Router).handle.func1(0x2468240, 0xc42160a270, 0xc4a3f7f680, 0x0, 0x0, 0x0)
	/go/src/github.com/prometheus/prometheus/vendor/github.com/prometheus/common/route/route.go:83 +0x2ba
github.com/prometheus/prometheus/vendor/github.com/julienschmidt/httprouter.(*Router).ServeHTTP(0xc420391140, 0x2468240, 0xc42160a270, 0xc4a3f7f680)
	/go/src/github.com/prometheus/prometheus/vendor/github.com/julienschmidt/httprouter/router.go:299 +0x7ec
github.com/prometheus/prometheus/vendor/github.com/prometheus/common/route.(*Router).ServeHTTP(0xc420461b00, 0x2468240, 0xc42160a270, 0xc4a3f7f680)
	/go/src/github.com/prometheus/prometheus/vendor/github.com/prometheus/common/route/route.go:125 +0x4c
net/http.serverHandler.ServeHTTP(0xc462c61700, 0x2468240, 0xc42160a270, 0xc4a3f7f680)
	/usr/local/go/src/net/http/server.go:2202 +0x7d
net/http.(*conn).serve(0xc4852efb00, 0x2469940, 0xc43e699ec0)
	/usr/local/go/src/net/http/server.go:1579 +0x4b7
created by net/http.(*Server).Serve
	/usr/local/go/src/net/http/server.go:2293 +0x44d
" source="<autogenerated>:38"

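As an aside on the rushed-mode log lines above: the logged urgencyScore values are consistent with taking the larger of two ratios, the persistence backlog (chunksToPersist / maxChunksToPersist) and the memory overshoot ((memoryChunks - maxMemoryChunks) / (maxToleratedMemChunks - maxMemoryChunks)). This is inferred from the logged numbers only, not taken from the actual storage implementation; the sketch below just reproduces that arithmetic.

```go
// Sketch only: reproduces the urgencyScore arithmetic implied by the log lines
// above (inferred from the logged values, not the real Prometheus storage code).
package main

import "fmt"

func urgencyScore(chunksToPersist, maxChunksToPersist, memoryChunks, maxMemoryChunks, maxToleratedMemChunks float64) float64 {
	// Persistence backlog contributes proportionally to how full the queue is.
	score := chunksToPersist / maxChunksToPersist
	// Memory pressure only contributes once memoryChunks exceeds the soft limit.
	if memoryChunks > maxMemoryChunks {
		memScore := (memoryChunks - maxMemoryChunks) / (maxToleratedMemChunks - maxMemoryChunks)
		if memScore > score {
			score = memScore
		}
	}
	return score
}

func main() {
	// Values from the first log line; the logged urgencyScore is 0.89318...
	fmt.Println(urgencyScore(149515, 524288, 544659, 500000, 550000))
	// Values from the second log line; the logged urgencyScore is 0.28508...
	fmt.Println(urgencyScore(149466, 524288, 469612, 500000, 550000))
}
```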
sjoerdmulder changed the title from "Http endpoint prometheus died" to "runtime error: index out of range" on Dec 5, 2016

fabxc commented Dec 5, 2016

@beorn7 this seems to be in the storage layer.

beorn7 commented Dec 5, 2016

Yeah, weird; that looks like something that already got a guard in one of the 1.2 releases. I'll have a closer look ASAP.

rtreffer commented Dec 9, 2016

@beorn7 we are now hitting that in production, too.

seiffert commented Dec 11, 2016

Same here:

Version: 1.4.0
OS: CoreOS 1185.3.0

time="2016-12-11T12:37:06Z" level=error msg="http: panic serving 127.0.0.1:36536: runtime error: index out of range
goroutine 49900570 [running]:
net/http.(*conn).serve.func1(0xc55dc8b880)
	/usr/local/go/src/net/http/server.go:1491 +0x12a
panic(0x18939a0, 0xc420010070)
	/usr/local/go/src/runtime/panic.go:458 +0x243
github.com/prometheus/prometheus/storage/local.(*memorySeries).preloadChunksForRange(0xc5dac142a0, 0x1335fbc7873103b9, 0x158e8b7f600, 0x158ede2e5e0, 0xc4200d4000, 0x2664340, 0xc94b970060, 0x0, 0x0)
	/go/src/github.com/prometheus/prometheus/storage/local/series.go:468 +0x73d
github.com/prometheus/prometheus/storage/local.(*MemorySeriesStorage).preloadChunksForRange(0xc4200d4000, 0x1335fbc7873103b9, 0xc5dac142a0, 0x158e8b7f600, 0x158ede2e5e0, 0x0, 0x0)
	/go/src/github.com/prometheus/prometheus/storage/local/storage.go:992 +0xd1
github.com/prometheus/prometheus/storage/local.(*MemorySeriesStorage).QueryRange(0xc4200d4000, 0x7f996b2b4528, 0xc944dc2660, 0x158e8b7f600, 0x158ede2e5e0, 0xc4bac669e0, 0x2, 0x2, 0x0, 0x1ad5bd7, ...)
	/go/src/github.com/prometheus/prometheus/storage/local/storage.go:506 +0x154
github.com/prometheus/prometheus/storage/local.memorySeriesStorageQuerier.QueryRange(0xc4200d4000, 0x7f996b2b4528, 0xc944dc2660, 0x158e8b7f600, 0x158ede2e5e0, 0xc4bac669e0, 0x2, 0x2, 0x0, 0x0, ...)
	<autogenerated>:66 +0xa0
github.com/prometheus/prometheus/promql.(*Engine).populateIterators.func1(0x7f996b3c1190, 0xc52d948e60, 0x412aff)
	/go/src/github.com/prometheus/prometheus/promql/engine.go:506 +0x41f
github.com/prometheus/prometheus/promql.inspector.Visit(0xc4a0c67e40, 0x7f996b3c1190, 0xc52d948e60, 0xeee5679e69, 0x0)
	/go/src/github.com/prometheus/prometheus/promql/ast.go:306 +0x3a
github.com/prometheus/prometheus/promql.Walk(0x26551c0, 0xc4a0c67e40, 0x7f996b3c1190, 0xc52d948e60)
	/go/src/github.com/prometheus/prometheus/promql/ast.go:255 +0x7b
github.com/prometheus/prometheus/promql.Walk(0x26551c0, 0xc4a0c67e40, 0x7f996b3c1210, 0xc4edb29940)
	/go/src/github.com/prometheus/prometheus/promql/ast.go:281 +0xa25
github.com/prometheus/prometheus/promql.Walk(0x26551c0, 0xc4a0c67e40, 0x7f996b3c1138, 0xc5283fa5f0)
	/go/src/github.com/prometheus/prometheus/promql/ast.go:278 +0x51d
github.com/prometheus/prometheus/promql.Inspect(0x7f996b3c1138, 0xc5283fa5f0, 0xc4a0c67e40)
	/go/src/github.com/prometheus/prometheus/promql/ast.go:316 +0x4b
github.com/prometheus/prometheus/promql.(*Engine).populateIterators(0xc4203e75c0, 0x7f996b2b4528, 0xc944dc2660, 0x26687e0, 0xc4200d4000, 0xc46d2698c0, 0x4172b6, 0xc9040033c0)
	/go/src/github.com/prometheus/prometheus/promql/engine.go:523 +0x123
github.com/prometheus/prometheus/promql.(*Engine).execEvalStmt(0xc4203e75c0, 0x7f996b2b4528, 0xc944dc2660, 0xc4edb29980, 0xc46d2698c0, 0x0, 0x0, 0x0, 0x0)
	/go/src/github.com/prometheus/prometheus/promql/engine.go:378 +0x1b6
github.com/prometheus/prometheus/promql.(*Engine).exec(0xc4203e75c0, 0x7f996b2b4528, 0xc944dc2660, 0xc4edb29980, 0x0, 0x0, 0x0, 0x0)
	/go/src/github.com/prometheus/prometheus/promql/engine.go:361 +0x3a9
github.com/prometheus/prometheus/promql.(*query).Exec(0xc4edb29980, 0x7f996b4114c8, 0xc49a25c300, 0x158e8bc89e0)
	/go/src/github.com/prometheus/prometheus/promql/engine.go:208 +0x52
github.com/prometheus/prometheus/web/api/v1.(*API).queryRange(0xc42041f2c0, 0xc5fd211ef0, 0x7000105, 0x0, 0xffffffffffffffff)
	/go/src/github.com/prometheus/prometheus/web/api/v1/api.go:212 +0x601
github.com/prometheus/prometheus/web/api/v1.(*API).(github.com/prometheus/prometheus/web/api/v1.queryRange)-fm(0xc5fd211ef0, 0xc53a7c7be0, 0x4, 0x0)
	/go/src/github.com/prometheus/prometheus/web/api/v1/api.go:126 +0x34
github.com/prometheus/prometheus/web/api/v1.(*API).Register.func1.1(0x265c740, 0xc53a7c7be0, 0xc5fd211ef0)
	/go/src/github.com/prometheus/prometheus/web/api/v1/api.go:110 +0x55
net/http.HandlerFunc.ServeHTTP(0xc4204f1880, 0x265c740, 0xc53a7c7be0, 0xc5fd211ef0)
	/usr/local/go/src/net/http/server.go:1726 +0x44
github.com/prometheus/prometheus/util/httputil.CompressionHandler.ServeHTTP(0x2656900, 0xc4204f1880, 0x265cb80, 0xc5a84f08a0, 0xc5fd211ef0)
	/go/src/github.com/prometheus/prometheus/util/httputil/compression.go:90 +0x7c
github.com/prometheus/prometheus/util/httputil.(*CompressionHandler).ServeHTTP(0xc4204f1890, 0x265cb80, 0xc5a84f08a0, 0xc5fd211ef0)
	<autogenerated>:5 +0x79
net/http.(Handler).ServeHTTP-fm(0x265cb80, 0xc5a84f08a0, 0xc5fd211ef0)
	/go/src/github.com/prometheus/prometheus/web/web.go:177 +0x4d
github.com/prometheus/prometheus/vendor/github.com/prometheus/client_golang/prometheus.InstrumentHandlerFuncWithOpts.func1(0x2662940, 0xc8a5b57c70, 0xc5fd211ef0)
	/go/src/github.com/prometheus/prometheus/vendor/github.com/prometheus/client_golang/prometheus/http.go:287 +0x2ab
github.com/prometheus/prometheus/vendor/github.com/prometheus/common/route.(*Router).handle.func1(0x2662940, 0xc8a5b57c70, 0xc5fd211ef0, 0x0, 0x0, 0x0)
	/go/src/github.com/prometheus/prometheus/vendor/github.com/prometheus/common/route/route.go:83 +0x2ba
github.com/prometheus/prometheus/vendor/github.com/julienschmidt/httprouter.(*Router).ServeHTTP(0xc42053ad00, 0x2662940, 0xc8a5b57c70, 0xc5fd211ef0)
	/go/src/github.com/prometheus/prometheus/vendor/github.com/julienschmidt/httprouter/router.go:299 +0x7ec
github.com/prometheus/prometheus/vendor/github.com/prometheus/common/route.(*Router).ServeHTTP(0xc4203e7680, 0x2662940, 0xc8a5b57c70, 0xc5fd211ef0)
	/go/src/github.com/prometheus/prometheus/vendor/github.com/prometheus/common/route/route.go:125 +0x4c
net/http.serverHandler.ServeHTTP(0xc420452000, 0x2662940, 0xc8a5b57c70, 0xc5fd211ef0)
	/usr/local/go/src/net/http/server.go:2202 +0x7d
net/http.(*conn).serve(0xc55dc8b880, 0x2664040, 0xc4ef8d8bc0)
	/usr/local/go/src/net/http/server.go:1579 +0x4b7
created by net/http.(*Server).Serve
	/usr/local/go/src/net/http/server.go:2293 +0x44d
" source="<autogenerated>:38"

beorn7 added the priority/P0 label on Dec 13, 2016

beorn7 commented Dec 13, 2016

Will look into this ASAP.

beorn7 commented Dec 13, 2016

This can only happen with corrupted series data, most likely if a file has disappeared. Fix on its way.
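For context, here is a minimal sketch of the failure mode described above: if the in-memory chunk metadata no longer matches the data on disk (for example because a chunk file disappeared), a time-range search over a series' chunk descriptors can produce an index past the end of the slice, and indexing it without a bounds check panics with "index out of range". The chunkDesc type and preloadRange function below are hypothetical simplifications for illustration, not the actual Prometheus storage code or the actual fix.

```go
// Minimal sketch of a bounds guard that turns corrupted-series lookups into an
// error instead of a panic. All names here are hypothetical simplifications.
package main

import (
	"fmt"
	"sort"
)

type chunkDesc struct {
	firstTime, lastTime int64 // chunk time bounds in milliseconds
}

// preloadRange returns the indices of chunks overlapping [from, through].
// When metadata and on-disk data disagree, the search can land past the end
// of the slice; the explicit checks below return an error instead of letting
// a later chunks[first] access panic.
func preloadRange(chunks []chunkDesc, from, through int64) (first, last int, err error) {
	// Find the first chunk whose lastTime is not before 'from'.
	first = sort.Search(len(chunks), func(i int) bool {
		return chunks[i].lastTime >= from
	})
	if first >= len(chunks) {
		return 0, 0, fmt.Errorf("no chunk covers range [%d, %d]; series data may be corrupted", from, through)
	}
	if chunks[first].firstTime > through {
		return 0, 0, fmt.Errorf("range [%d, %d] falls between chunks", from, through)
	}
	// Extend to the last chunk that still starts within the range.
	last = first
	for last+1 < len(chunks) && chunks[last+1].firstTime <= through {
		last++
	}
	return first, last, nil
}

func main() {
	chunks := []chunkDesc{{0, 999}, {1000, 1999}}
	// Query a range past the last chunk: handled as an error, not a panic.
	if _, _, err := preloadRange(chunks, 5000, 6000); err != nil {
		fmt.Println("handled:", err)
	}
}
```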

lock bot commented Mar 24, 2019

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

lock bot locked and limited conversation to collaborators on Mar 24, 2019
