
Crash due to nil pointer access CompactionProcessor #368

Closed
juliusv opened this Issue Oct 17, 2013 · 8 comments

juliusv commented Oct 17, 2013

Several Prometheus instances were crashing during compaction with this error:

2013-10-17_07:32:25.32607 I1017 07:32:25.325992 07227 curator.go:344] Curating 00483580969990100027-c-2-s...
2013-10-17_07:32:25.32732 I1017 07:32:25.327235 07227 curator.go:344] Curating 00492574032515006478-c-8-s...
2013-10-17_07:32:25.32931 I1017 07:32:25.329225 07227 curator.go:344] Curating 00507900210406699325-h-6-t...
2013-10-17_07:32:25.33052 panic: runtime error: invalid memory address or nil pointer dereference
2013-10-17_07:32:25.33062 [signal 0xb code=0x1 addr=0x30 pc=0x497677]
2013-10-17_07:32:25.33071
2013-10-17_07:32:25.33072 goroutine 30 [running]:
2013-10-17_07:32:25.33081 github.com/prometheus/prometheus/storage/metric.(*CompactionProcessor).Apply(0xc201f70bc0, 0xc200143780, 0xc202ca13c0, 0xc2023ae0c0, 0xc20013e060, ...)
2013-10-17_07:32:25.33112       /home/julius/gosrc/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/prometheus/prometheus/storage/metric/processor.go:232 +0x1567
2013-10-17_07:32:25.33168 github.com/prometheus/prometheus/storage/metric.(*watermarkScanner).Operate(0xc20279f8c0, 0x82d6e0, 0xc2042e9030, 0x7f0060, 0xc2042e0d80, ...)
2013-10-17_07:32:25.33207       /home/julius/gosrc/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/prometheus/prometheus/storage/metric/curator.go:416 +0x9fe
2013-10-17_07:32:25.33223 github.com/prometheus/prometheus/storage/raw/leveldb.(*LevelDBPersistence).ForEach(0xc20013e780, 0xc2035c3150, 0xc20279f8c0, 0xc2035c3180, 0xc20279f8c0, ...)
2013-10-17_07:32:25.33247       /home/julius/gosrc/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/prometheus/prometheus/storage/raw/leveldb/leveldb.go:490 +0x323
2013-10-17_07:32:25.33270 github.com/prometheus/prometheus/storage/metric.(*LevelDBHighWatermarker).ForEach(0xc200000a80, 0xc2035c3150, 0xc20279f8c0, 0xc2035c3180, 0xc20279f8c0, ...)
2013-10-17_07:32:25.33296       /home/julius/gosrc/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/prometheus/prometheus/storage/metric/watermark.go:106 +0x6f
2013-10-17_07:32:25.33316 github.com/prometheus/prometheus/storage/metric.(*Curator).Run(0xc2026ff780, 0x34630b8a000, 0xec9f18988, 0xc2388b9250, 0xdec340, ...)
2013-10-17_07:32:25.33337       /home/julius/gosrc/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/prometheus/prometheus/storage/metric/curator.go:212 +0x767
2013-10-17_07:32:25.33359 main.(*prometheus).compact(0xc201a22690, 0x34630b8a000, 0x1388, 0x0, 0x0, ...)
2013-10-17_07:32:25.33371       /home/julius/gosrc/src/github.com/prometheus/prometheus/main.go:131 +0x285
2013-10-17_07:32:25.33383 main.func·004()
2013-10-17_07:32:25.33387       /home/julius/gosrc/src/github.com/prometheus/prometheus/main.go:323 +0x151
2013-10-17_07:32:25.33399 created by main.main
2013-10-17_07:32:25.33403       /home/julius/gosrc/src/github.com/prometheus/prometheus/main.go:330 +0xef4
matttproud commented Oct 30, 2013

@juliusv, can you give me an updated version of this trace? I don't think the stack trace matches up with the appropriate line in the source code now.

juliusv commented Oct 30, 2013

@matttproud Yep, if I uncomment and run the problematic test case in the compaction regression tests now, I get the trace below. I once traced it all the way into the objective.go code, which tries to seek to the right compaction starting point for a series: at some point it needed to seek Prev() once and then Next() forward again, and when I added Println() statements to our iterator wrapper, I could see that the key didn't move forward on the Next(); only if I inserted a second Next() would the iterator key advance after the Prev(). Really strange, and I wasn't able to reproduce it in a vanilla LevelDB outside of Prometheus.

panic: runtime error: invalid memory address or nil pointer dereference [recovered]
    panic: runtime error: invalid memory address or nil pointer dereference
[signal 0xb code=0x1 addr=0x30 pc=0x44be39]

goroutine 5 [running]:
testing.func·004()
    /usr/local/go/src/pkg/testing/testing.go:348 +0xcd
_/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.(*CompactionProcessor).Apply(0xc20034c540, 0xc2000f3a80, 0xc20034c580, 0xc2000e62a0, 0xc2003085a0, ...)
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/processor.go:235 +0x1749
_/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.(*watermarkScanner).Operate(0xc2003376e0, 0x76cdc0, 0xc20034fdb0, 0x73e0e0, 0xc200351880, ...)
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/curator.go:416 +0x9fe
github.com/prometheus/prometheus/storage/raw/leveldb.(*LevelDBPersistence).ForEach(0xc2003084e0, 0xc20028a870, 0xc2003376e0, 0xc20028a8a0, 0xc2003376e0, ...)
    /home/julius/gosrc/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/prometheus/prometheus/storage/raw/leveldb/leveldb.go:490 +0x323
_/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.(*LevelDBHighWatermarker).ForEach(0xc2002fa810, 0xc20028a870, 0xc2003376e0, 0xc20028a8a0, 0xc2003376e0, ...)
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/watermark.go:106 +0x6f
_/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.(*Curator).Run(0xc20034ea40, 0xdf8475800, 0xe7c5bfd49, 0x0, 0xc5d080, ...)
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/curator.go:212 +0x767
_/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.compactionTestScenario.test(0x5, 0x3, 0x14, 0xdf8475800, 0x1e, ...)
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/compaction_regression_test.go:147 +0x598
_/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.TestCompaction(0xc2000ea090)
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/compaction_regression_test.go:242 +0x108
testing.tRunner(0xc2000ea090, 0xc60800)
    /usr/local/go/src/pkg/testing/testing.go:353 +0x8a
created by testing.RunTests
    /usr/local/go/src/pkg/testing/testing.go:433 +0x86b

goroutine 1 [chan receive]:
testing.RunTests(0x834460, 0xc60800, 0x2c, 0x2c, 0xe7c5bfd01, ...)
    /usr/local/go/src/pkg/testing/testing.go:434 +0x88e
testing.Main(0x834460, 0xc60800, 0x2c, 0x2c, 0xc60040, ...)
    /usr/local/go/src/pkg/testing/testing.go:365 +0x8a
main.main()
    _/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/_test/_testmain.go:205 +0x9a

goroutine 2 [syscall]:

goroutine 4 [chan receive]:
github.com/golang/glog.(*loggingT).flushDaemon(0xc76660)
    /home/julius/gosrc/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/golang/glog/glog.go:838 +0x4e
created by github.com/golang/glog.init·1
    /home/julius/gosrc/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/golang/glog/glog.go:405 +0x274

goroutine 57 [chan receive]:
_/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.func·038()
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/tiered.go:243 +0x46
created by _/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.(*TieredStorage).Serve
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/tiered.go:246 +0x191

goroutine 15 [chan receive]:
_/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.func·038()
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/tiered.go:243 +0x46
created by _/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.(*TieredStorage).Serve
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/tiered.go:246 +0x191

goroutine 36 [chan receive]:
_/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.func·038()
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/tiered.go:243 +0x46
created by _/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.(*TieredStorage).Serve
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/tiered.go:246 +0x191
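
The Prev()-then-Next() behavior described above can be sketched with a minimal in-memory iterator (stdlib-only; the real code wraps a levigo iterator, so all names here are illustrative, not the actual API). The sketch shows the contract the buggy iterator violated: after Prev() followed by Next(), the cursor should be back on the original key, with no second Next() required.

```go
package main

import "fmt"

// keyIterator is a hypothetical in-memory stand-in for the levigo iterator
// wrapper, used only to illustrate the expected Prev/Next contract.
type keyIterator struct {
	keys []string
	pos  int
}

func (it *keyIterator) Key() string { return it.keys[it.pos] }

func (it *keyIterator) Prev() bool {
	if it.pos == 0 {
		return false
	}
	it.pos--
	return true
}

func (it *keyIterator) Next() bool {
	if it.pos >= len(it.keys)-1 {
		return false
	}
	it.pos++
	return true
}

func main() {
	it := &keyIterator{keys: []string{"a", "b", "c"}, pos: 1}
	fmt.Println("BEFORE PREV", it.Key())
	it.Prev()
	fmt.Println("AFTER PREV ", it.Key())
	it.Next()
	// Correct behavior: back on "b". The buggy LevelDB iterator stayed on
	// the previous key here and needed a second Next() to advance.
	fmt.Println("AFTER NEXT ", it.Key())
}
```
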
matttproud commented Oct 30, 2013

Regarding the staleness of the keys, could you validate, by doing k := new(dto.SampleKey) and i.Get(k) on each i.Next(), whether the value gets refreshed as it should? There may be something fishy going on with the retention of the per-instance key.
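
The suggested check can be sketched under stdlib-only assumptions (sampleKey stands in for dto.SampleKey, and iter for the real iterator wrapper; both names are hypothetical): allocating a fresh key per iteration and decoding into it would distinguish a stale per-instance key from a genuinely stuck iterator.

```go
package main

import "fmt"

// sampleKey is a hypothetical stand-in for the dto.SampleKey protobuf type.
type sampleKey struct{ Fingerprint string }

// iter mimics an iterator that could retain per-instance decoded state —
// the staleness risk suspected above.
type iter struct {
	raw []string
	pos int
}

func (i *iter) Next() bool {
	i.pos++
	return i.pos < len(i.raw)
}

// Get decodes the current raw key into k, so callers can pass a freshly
// allocated key each time instead of trusting any cached one.
func (i *iter) Get(k *sampleKey) { k.Fingerprint = i.raw[i.pos] }

func main() {
	it := &iter{raw: []string{"fp-0", "fp-1", "fp-2"}, pos: -1}
	for it.Next() {
		k := new(sampleKey) // fresh allocation per iteration, as suggested
		it.Get(k)
		fmt.Println(k.Fingerprint)
	}
}
```

If the freshly decoded key still never changes across Next() calls, the staleness lies below the DTO layer, in the underlying iterator itself.
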

juliusv commented Oct 30, 2013

Well, I actually printed the raw Levigo keys as []byte slices back then, and they definitely stayed the same after calling Next(). I had inserted the prints here: https://github.com/prometheus/prometheus/blob/master/storage/raw/leveldb/leveldb.go#L146 and here: https://github.com/prometheus/prometheus/blob/master/storage/raw/leveldb/leveldb.go#L154

So it can't really be related to our DTO types built on top, right?

juliusv commented Oct 30, 2013

So I added debug statements again in this debug commit and uncommented the failing test: 498293f. Looking at that test crash trace now, I'm actually not sure anymore whether it is the same bug as the one in this issue or yet another one: this time the iterator keys don't exhibit said problem (in fact, the iterator doesn't Prev() at all before the crash) and the crash trace looks different. So this could be an unrelated crash bug, or one caused by the same underlying problem that just manifests a bit differently.

=== RUN TestCompaction
BEFORE NEXT [10 28 10 26 48 50 50 54 54 57 52 54 54 56 57 57 53 53 57 49 54 49 52 51 45 110 45 50 45 48 18 8 0 0 0 0 4 202 3 1 25 241 3 202 4 0 0 0 0 37 5 0 0 0]
AFTER NEXT  [10 28 10 26 48 50 50 54 54 57 52 54 54 56 57 57 53 53 57 49 54 49 52 51 45 110 45 50 45 48 18 8 0 0 0 0 4 202 4 45 25 29 5 202 4 0 0 0 0 37 5 0 0 0]
BEFORE NEXT [10 28 10 26 48 50 50 54 54 57 52 54 54 56 57 57 53 53 57 49 54 49 52 51 45 110 45 50 45 48 18 8 0 0 0 0 4 202 4 45 25 29 5 202 4 0 0 0 0 37 5 0 0 0]
AFTER NEXT  [10 28 10 26 48 50 50 54 54 57 52 54 54 56 57 57 53 53 57 49 54 49 52 51 45 110 45 50 45 48 18 8 0 0 0 0 4 202 5 89 25 73 6 202 4 0 0 0 0 37 5 0 0 0]
BEFORE NEXT [10 28 10 26 48 50 50 54 54 57 52 54 54 56 57 57 53 53 57 49 54 49 52 51 45 110 45 50 45 48 18 8 0 0 0 0 4 202 5 89 25 73 6 202 4 0 0 0 0 37 5 0 0 0]
AFTER NEXT  [10 28 10 26 48 55 56 49 52 56 49 56 52 56 48 57 50 50 56 56 48 51 54 55 45 110 45 50 45 49 18 8 0 0 0 0 4 202 3 1 25 241 3 202 4 0 0 0 0 37 5 0 0 0]
BEFORE NEXT [10 28 10 26 48 55 56 49 52 56 49 56 52 56 48 57 50 50 56 56 48 51 54 55 45 110 45 50 45 49 18 8 0 0 0 0 4 202 3 1 25 241 3 202 4 0 0 0 0 37 5 0 0 0]
AFTER NEXT  [10 28 10 26 48 55 56 49 52 56 49 56 52 56 48 57 50 50 56 56 48 51 54 55 45 110 45 50 45 49 18 8 0 0 0 0 4 202 4 45 25 29 5 202 4 0 0 0 0 37 5 0 0 0]
BEFORE NEXT [10 28 10 26 48 55 56 49 52 56 49 56 52 56 48 57 50 50 56 56 48 51 54 55 45 110 45 50 45 49 18 8 0 0 0 0 4 202 4 45 25 29 5 202 4 0 0 0 0 37 5 0 0 0]
AFTER NEXT  [10 28 10 26 48 55 56 49 52 56 49 56 52 56 48 57 50 50 56 56 48 51 54 55 45 110 45 50 45 49 18 8 0 0 0 0 4 202 5 89 25 73 6 202 4 0 0 0 0 37 5 0 0 0]
BEFORE NEXT [10 28 10 26 48 55 56 49 52 56 49 56 52 56 48 57 50 50 56 56 48 51 54 55 45 110 45 50 45 49 18 8 0 0 0 0 4 202 5 89 25 73 6 202 4 0 0 0 0 37 5 0 0 0]
AFTER NEXT  [10 28 10 26 48 57 54 49 55 54 54 53 55 48 54 55 53 52 56 50 56 54 53 53 45 110 45 50 45 50 18 8 0 0 0 0 4 202 3 1 25 241 3 202 4 0 0 0 0 37 5 0 0 0]
BEFORE NEXT [10 28 10 26 48 57 54 49 55 54 54 53 55 48 54 55 53 52 56 50 56 54 53 53 45 110 45 50 45 50 18 8 0 0 0 0 4 202 3 1 25 241 3 202 4 0 0 0 0 37 5 0 0 0]
AFTER NEXT  [10 28 10 26 48 57 54 49 55 54 54 53 55 48 54 55 53 52 56 50 56 54 53 53 45 110 45 50 45 50 18 8 0 0 0 0 4 202 4 45 25 29 5 202 4 0 0 0 0 37 5 0 0 0]
BEFORE NEXT [10 28 10 26 48 57 54 49 55 54 54 53 55 48 54 55 53 52 56 50 56 54 53 53 45 110 45 50 45 50 18 8 0 0 0 0 4 202 4 45 25 29 5 202 4 0 0 0 0 37 5 0 0 0]
AFTER NEXT  [10 28 10 26 48 57 54 49 55 54 54 53 55 48 54 55 53 52 56 50 56 54 53 53 45 110 45 50 45 50 18 8 0 0 0 0 4 202 5 89 25 73 6 202 4 0 0 0 0 37 5 0 0 0]
BEFORE NEXT [10 28 10 26 48 57 54 49 55 54 54 53 55 48 54 55 53 52 56 50 56 54 53 53 45 110 45 50 45 50 18 8 0 0 0 0 4 202 5 89 25 73 6 202 4 0 0 0 0 37 5 0 0 0]
SIGSEGV: segmentation violation
PC=0x2b333f09c0d8
signal arrived during cgo execution

github.com/jmhodges/levigo._Cfunc_leveldb_iter_key(0x2b33600015f0, 0xc20028c408, 0x4038f0)
    github.com/jmhodges/levigo/_obj/_cgo_defun.c:205 +0x2f
github.com/jmhodges/levigo.(*Iterator).Key(0xc200276b08, 0x2, 0x2, 0xa2)
    github.com/jmhodges/levigo/_obj/batch.cgo1.go:519 +0x44
github.com/prometheus/prometheus/storage/raw/leveldb.(*levigoIterator).Next(0xc20027d400, 0x776240)
    /home/julius/gosrc/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/prometheus/prometheus/storage/raw/leveldb/leveldb.go:147 +0x169
github.com/prometheus/prometheus/storage/raw/leveldb.(*LevelDBPersistence).ForEach(0xc2000e6840, 0xc20027e6f0, 0xc7fe68, 0xc20027e720, 0xc7fe68, ...)
    /home/julius/gosrc/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/prometheus/prometheus/storage/raw/leveldb/leveldb.go:474 +0x1be
_/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.checkStorageSaneAndEquivalent(0xc2000ea090, 0x7cdab0, 0x11, 0xc2000ea1b0, 0xc2000f6600, ...)
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/compaction_regression_test.go:93 +0x157
_/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.compactionTestScenario.run(0x5, 0x3, 0xf, 0xdf8475800, 0x1e, ...)
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/compaction_regression_test.go:131 +0x291
_/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.TestCompaction(0xc2000ea090)
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/compaction_regression_test.go:242 +0x108
testing.tRunner(0xc2000ea090, 0xc61800)
    /usr/local/go/src/pkg/testing/testing.go:353 +0x8a
created by testing.RunTests
    /usr/local/go/src/pkg/testing/testing.go:433 +0x86b

goroutine 1 [chan receive]:
testing.RunTests(0x834900, 0xc61800, 0x2c, 0x2c, 0xe7c5bfd01, ...)
    /usr/local/go/src/pkg/testing/testing.go:434 +0x88e
testing.Main(0x834900, 0xc61800, 0x2c, 0x2c, 0xc61040, ...)
    /usr/local/go/src/pkg/testing/testing.go:365 +0x8a
main.main()
    _/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/_test/_testmain.go:205 +0x9a

goroutine 2 [syscall]:

goroutine 4 [chan receive]:
github.com/golang/glog.(*loggingT).flushDaemon(0xc77660)
    /home/julius/gosrc/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/golang/glog/glog.go:838 +0x4e
created by github.com/golang/glog.init·1
    /home/julius/gosrc/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/golang/glog/glog.go:405 +0x274

goroutine 14 [select]:
_/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.(*TieredStorage).Serve(0xc2000ea1b0, 0xc2000e69c0)
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/tiered.go:250 +0x48f
created by _/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.NewTestTieredStorage
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/helpers_test.go:108 +0x266

goroutine 15 [chan receive]:
_/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.func·038()
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/tiered.go:243 +0x46
created by _/home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric.(*TieredStorage).Serve
    /home/julius/gosrc/src/github.com/prometheus/prometheus/storage/metric/tiered.go:246 +0x191
rax     0x0
rbx     0xc20028c408
rcx     0x2b335034d948
rdx     0x0
rdi     0x2b3360003d90
rsi     0xc20028c408
rbp     0xc2001bd000
rsp     0x2b335c1ffd30
r8      0x36
r9      0x3535363832383435
r10     0x0
r11     0x2b333f688d10
r12     0x0
r13     0x0
r14     0x10
r15     0x7cd430
rip     0x2b333f09c0d8
rflags  0x10206
cs      0x33
fs      0x0
gs      0x0
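
For context on the SIGSEGV above: leveldb_iter_key must only be called while the iterator is valid, so a wrapper can guard Key() behind Valid() and turn the crash into a recoverable error. A minimal sketch, assuming a hypothetical rawIter type rather than the real levigo API:

```go
package main

import (
	"errors"
	"fmt"
)

// rawIter mimics the surface involved in the crash: calling Key() once the
// underlying iterator is invalid dereferences a dead pointer. Names and
// behavior are illustrative, not the levigo API.
type rawIter struct {
	keys []string
	pos  int
}

func (r *rawIter) Valid() bool { return r.pos >= 0 && r.pos < len(r.keys) }
func (r *rawIter) Next()       { r.pos++ }

func (r *rawIter) Key() string {
	if !r.Valid() {
		panic("leveldb_iter_key on invalid iterator") // the SIGSEGV analogue
	}
	return r.keys[r.pos]
}

var errExhausted = errors.New("iterator exhausted")

// safeKey checks Valid() before touching Key(), converting the crash into
// an error the caller can handle.
func safeKey(r *rawIter) (string, error) {
	if !r.Valid() {
		return "", errExhausted
	}
	return r.Key(), nil
}

func main() {
	it := &rawIter{keys: []string{"a", "b"}}
	for it.Valid() {
		k, _ := safeKey(it)
		fmt.Println(k)
		it.Next()
	}
	if _, err := safeKey(it); err != nil {
		fmt.Println("stopped cleanly:", err)
	}
}
```

This only converts the symptom into an error, of course; the iterator being invalid at that point at all is the underlying bug.
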
grobie commented Nov 24, 2013

Hm, as it looks similar, I'll paste another stack trace here:

✗ ~/code/go/src/github.com/prometheus/prometheus/.build/package/run_prometheus.sh
prometheus, version 8c08a50 (master)
  build user:       ts@grobox
  build date:       20131108-23:35:46
  go version:       1.1
  leveldb version:  1.12.0
  protobuf version: 2.5.0
  snappy version:   1.1.0
panic: runtime error: invalid memory address or nil pointer dereference
[signal 0xb code=0x1 addr=0x30 pc=0x497fb9]

goroutine 24 [running]:
github.com/prometheus/prometheus/storage/metric.(*CompactionProcessor).Apply(0xc2024e2c40, 0xc2025b4f00, 0xc2024e2cc0, 0xc201c906c0, 0xc2001216c0, ...)
        /home/ts/code/go/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/prometheus/prometheus/storage/metric/processor.go:235 +0x1749
github.com/prometheus/prometheus/storage/metric.(*watermarkScanner).Operate(0xc202014500, 0x81bf80, 0xc20262ca20, 0x7df100, 0xc2029dc300, ...)
        /home/ts/code/go/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/prometheus/prometheus/storage/metric/curator.go:416 +0x9fe
github.com/prometheus/prometheus/storage/raw/leveldb.(*LevelDBPersistence).ForEach(0xc200121840, 0xc2025b3150, 0xc202014500, 0xc2025b3180, 0xc202014500, ...)
        /home/ts/code/go/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/prometheus/prometheus/storage/raw/leveldb/leveldb.go:490 +0x323
github.com/prometheus/prometheus/storage/metric.(*LevelDBHighWatermarker).ForEach(0xc200000910, 0xc2025b3150, 0xc202014500, 0xc2025b3180, 0xc202014500, ...)
        /home/ts/code/go/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/prometheus/prometheus/storage/metric/watermark.go:106 +0x6f
github.com/prometheus/prometheus/storage/metric.(*Curator).Run(0xc202ba0040, 0x45d964b800, 0xeca240efe, 0xc22dd67691, 0xdc8b20, ...)
        /home/ts/code/go/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/prometheus/prometheus/storage/metric/curator.go:212 +0x767
main.(*prometheus).compact(0xc200274b60, 0x45d964b800, 0x1f4, 0x0, 0x0, ...)
        /home/ts/code/go/src/github.com/prometheus/prometheus/main.go:131 +0x285
main.func·003()
        /home/ts/code/go/src/github.com/prometheus/prometheus/main.go:311 +0x150
created by main.main
        /home/ts/code/go/src/github.com/prometheus/prometheus/main.go:318 +0xea4

goroutine 1 [chan receive]:
main.main()
        /home/ts/code/go/src/github.com/prometheus/prometheus/main.go:364 +0x1010

goroutine 2 [syscall]:

goroutine 4 [chan receive]:
github.com/golang/glog.(*loggingT).flushDaemon(0xdc8dc0)
        /home/ts/code/go/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/golang/glog/glog.go:838 +0x4e
created by github.com/golang/glog.init·1
        /home/ts/code/go/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/golang/glog/glog.go:405 +0x274

goroutine 22 [chan receive]:
github.com/prometheus/prometheus/storage/metric.func·038()
        /home/ts/code/go/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/prometheus/prometheus/storage/metric/tiered.go:246 +0x46
created by github.com/prometheus/prometheus/storage/metric.(*TieredStorage).Serve
        /home/ts/code/go/src/github.com/prometheus/prometheus/.build/root/gopath/src/github.com/prometheus/prometheus/storage/metric/tiered.go:249 +0x191

goroutine 7 [syscall]:
os/signal.loop()
        /usr/local/go/src/pkg/os/signal/signal_unix.go:21 +0x1c
created by os/signal.init·1
        /usr/local/go/src/pkg/os/signal/signal_unix.go:27 +0x2f

...
juliusv commented Dec 2, 2013

I was able to reproduce the Next()/Prev() bug and traced it to this upstream LevelDB bug: https://code.google.com/p/leveldb/issues/detail?id=200

This is fixed in the latest LevelDB release (1.14.0). I'm working on switching to that.

juliusv added a commit that referenced this issue Dec 2, 2013

Add compaction regression tests.
This adds regression tests that catch the two error cases reported in

  #367

It also adds a commented-out test case for the crash in

  #368

but there's no fix for the latter crash yet.

Change-Id: Idffefea4ed7cc281caae660bcad2e3c13ec3bd17

juliusv added a commit that referenced this issue Dec 2, 2013

Upgrade to LevelDB 1.14.0 to fix LevelDB bugs.
This tentatively fixes #368 due
to an upstream bugfix in snapshotted LevelDB iterator handling, which got fixed
in LevelDB 1.14.0:

https://code.google.com/p/leveldb/issues/detail?id=200

Change-Id: Ib0cc67b7d3dc33913a1c16736eff32ef702c63bf

juliusv closed this in 6b7de31 on Dec 3, 2013

simonpasquier pushed a commit to simonpasquier/prometheus that referenced this issue Oct 12, 2017

lock bot commented Mar 25, 2019

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

lock bot locked and limited conversation to collaborators on Mar 25, 2019
