
panic: runtime error: index out of range with kubernetes_sd_config #1435

Closed
pdbogen opened this Issue Mar 1, 2016 · 3 comments


pdbogen commented Mar 1, 2016

The config I'm using is very basic:

scrape_configs:
- job_name: 'statsd-exporter'
  kubernetes_sd_configs:
  - api_servers:
    - 'https://kubernetes.default'
    in_cluster: true

It results in a crash:

/ # prometheus -config.file=test.yml
prometheus, version 0.16.2 (branch: stable, revision: 287d9b2)
  build user:       @ee15eaddc546
  build date:       20160226-07:13:15
  go version:       1.5.3
INFO[0000] Loading configuration file test.yml           source=main.go:196
INFO[0000] Loading series map and head chunks...         source=storage.go:268
INFO[0000] 0 series loaded.                              source=storage.go:273
INFO[0000] Starting target manager...                    source=targetmanager.go:114
INFO[0000] Listening on :9090                            source=web.go:220
panic: runtime error: index out of range

goroutine 147 [running]:
github.com/prometheus/prometheus/retrieval/discovery/kubernetes.(*Discovery).updateNodesTargetGroup(0xc82038a000, 0x0)
    /gopath/src/github.com/prometheus/prometheus/retrieval/discovery/kubernetes/discovery.go:323 +0x807
github.com/prometheus/prometheus/retrieval/discovery/kubernetes.(*Discovery).Run(0xc82038a000, 0xc821614000, 0xc8204c8480)
    /gopath/src/github.com/prometheus/prometheus/retrieval/discovery/kubernetes/discovery.go:185 +0x14b
created by github.com/prometheus/prometheus/retrieval.(*prefixedTargetProvider).Run
    /gopath/src/github.com/prometheus/prometheus/retrieval/targetmanager.go:383 +0xcc

goroutine 1 [select]:
main.Main(0x0)
    /gopath/src/github.com/prometheus/prometheus/cmd/prometheus/main.go:176 +0x1ae6
main.main()
    /gopath/src/github.com/prometheus/prometheus/cmd/prometheus/main.go:46 +0x18

goroutine 17 [syscall, locked to thread]:
runtime.goexit()
    /usr/local/go/src/runtime/asm_amd64.s:1721 +0x1

goroutine 109 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).mpoolDrain(0xc820540c60)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_state.go:82 +0x14b
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:140 +0x7a1

goroutine 19 [syscall]:
os/signal.loop()
    /usr/local/go/src/os/signal/signal_unix.go:22 +0x18
created by os/signal.init.1
    /usr/local/go/src/os/signal/signal_unix.go:28 +0x37

goroutine 108 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).compactionError(0xc820540c60)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_compaction.go:69 +0x54a
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:139 +0x77f

goroutine 83 [select, locked to thread]:
runtime.gopark(0xfb1e40, 0xc820022f28, 0xde69c8, 0x6, 0x435018, 0x2)
    /usr/local/go/src/runtime/proc.go:185 +0x163
runtime.selectgoImpl(0xc820022f28, 0x0, 0x18)
    /usr/local/go/src/runtime/select.go:392 +0xa64
runtime.selectgo(0xc820022f28)
    /usr/local/go/src/runtime/select.go:212 +0x12
runtime.ensureSigM.func1()
    /usr/local/go/src/runtime/signal1_unix.go:227 +0x353
runtime.goexit()
    /usr/local/go/src/runtime/asm_amd64.s:1721 +0x1

goroutine 84 [select]:
main.Main.func2(0xc8204c8ae0, 0xc8204c8a80, 0xc82008ec00, 0xc8204c2c30, 0x5, 0x5)
    /gopath/src/github.com/prometheus/prometheus/cmd/prometheus/main.go:127 +0x134
created by main.Main
    /gopath/src/github.com/prometheus/prometheus/cmd/prometheus/main.go:133 +0xebd

goroutine 85 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/util.(*BufferPool).drain(0xc8200c4620)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/util/buffer_pool.go:206 +0x29d
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/util.NewBufferPool
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/util/buffer_pool.go:237 +0x26b

goroutine 86 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).compactionError(0xc820540580)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_compaction.go:69 +0x54a
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:139 +0x77f

goroutine 87 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).mpoolDrain(0xc820540580)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_state.go:82 +0x14b
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:140 +0x7a1

goroutine 88 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).tCompaction(0xc820540580)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_compaction.go:768 +0x7c8
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:146 +0x9a5

goroutine 89 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).mCompaction(0xc820540580)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_compaction.go:715 +0x253
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:147 +0x9c7

goroutine 90 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).jWriter(0xc820540580)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_write.go:37 +0x1a2
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:148 +0x9e9

goroutine 91 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/util.(*BufferPool).drain(0xc82040e0e0)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/util/buffer_pool.go:206 +0x29d
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/util.NewBufferPool
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/util/buffer_pool.go:237 +0x26b

goroutine 92 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).compactionError(0xc8205406e0)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_compaction.go:69 +0x54a
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:139 +0x77f

goroutine 93 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).mpoolDrain(0xc8205406e0)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_state.go:82 +0x14b
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:140 +0x7a1

goroutine 94 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).tCompaction(0xc8205406e0)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_compaction.go:768 +0x7c8
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:146 +0x9a5

goroutine 95 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).mCompaction(0xc8205406e0)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_compaction.go:715 +0x253
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:147 +0x9c7

goroutine 96 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).jWriter(0xc8205406e0)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_write.go:37 +0x1a2
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:148 +0x9e9

goroutine 97 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/util.(*BufferPool).drain(0xc82040e460)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/util/buffer_pool.go:206 +0x29d
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/util.NewBufferPool
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/util/buffer_pool.go:237 +0x26b

goroutine 114 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).compactionError(0xc820540840)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_compaction.go:69 +0x54a
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:139 +0x77f

goroutine 115 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).mpoolDrain(0xc820540840)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_state.go:82 +0x14b
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:140 +0x7a1

goroutine 116 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).tCompaction(0xc820540840)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_compaction.go:768 +0x7c8
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:146 +0x9a5

goroutine 117 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).mCompaction(0xc820540840)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_compaction.go:715 +0x253
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:147 +0x9c7

goroutine 118 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).jWriter(0xc820540840)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_write.go:37 +0x1a2
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:148 +0x9e9

goroutine 119 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/util.(*BufferPool).drain(0xc82040e620)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/util/buffer_pool.go:206 +0x29d
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/util.NewBufferPool
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/util/buffer_pool.go:237 +0x26b

goroutine 110 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).tCompaction(0xc820540c60)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_compaction.go:768 +0x7c8
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:146 +0x9a5

goroutine 111 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).mCompaction(0xc820540c60)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_compaction.go:715 +0x253
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:147 +0x9c7

goroutine 112 [select]:
github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.(*DB).jWriter(0xc820540c60)
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db_write.go:37 +0x1a2
created by github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb.openDB
    /gopath/src/github.com/prometheus/prometheus/vendor/github.com/syndtr/goleveldb/leveldb/db.go:148 +0x9e9

goroutine 113 [select]:
github.com/prometheus/prometheus/storage/local.(*persistence).processIndexingQueue(0xc82025b600)
    /gopath/src/github.com/prometheus/prometheus/storage/local/persistence.go:1364 +0xcaf
github.com/prometheus/prometheus/storage/local.(*persistence).run(0xc82025b600)
    /gopath/src/github.com/prometheus/prometheus/storage/local/persistence.go:287 +0x21
created by github.com/prometheus/prometheus/storage/local.(*memorySeriesStorage).Start
    /gopath/src/github.com/prometheus/prometheus/storage/local/storage.go:258 +0x146

goroutine 130 [select]:
github.com/prometheus/prometheus/storage/local.(*memorySeriesStorage).handleEvictList(0xc8200d43c0)
    /gopath/src/github.com/prometheus/prometheus/storage/local/storage.go:665 +0x52d
created by github.com/prometheus/prometheus/storage/local.(*memorySeriesStorage).Start
    /gopath/src/github.com/prometheus/prometheus/storage/local/storage.go:281 +0x4ae

goroutine 131 [select]:
github.com/prometheus/prometheus/storage/local.(*memorySeriesStorage).loop(0xc8200d43c0)
    /gopath/src/github.com/prometheus/prometheus/storage/local/storage.go:882 +0x59b
created by github.com/prometheus/prometheus/storage/local.(*memorySeriesStorage).Start
    /gopath/src/github.com/prometheus/prometheus/storage/local/storage.go:282 +0x4d3

goroutine 136 [select]:
github.com/prometheus/prometheus/rules.(*Manager).Run(0xc820017680)
    /gopath/src/github.com/prometheus/prometheus/rules/manager.go:155 +0x3d6
created by main.Main
    /gopath/src/github.com/prometheus/prometheus/cmd/prometheus/main.go:158 +0x129b

goroutine 120 [select]:
github.com/prometheus/prometheus/storage/local.(*memorySeriesStorage).waitForNextFP(0xc8200d43c0, 0x0, 0x3ff0000000000000, 0xc8204c8300)
    /gopath/src/github.com/prometheus/prometheus/storage/local/storage.go:769 +0x19a
github.com/prometheus/prometheus/storage/local.(*memorySeriesStorage).cycleThroughMemoryFingerprints.func1(0xc8204c8300, 0xc8200d43c0)
    /gopath/src/github.com/prometheus/prometheus/storage/local/storage.go:796 +0xc3
created by github.com/prometheus/prometheus/storage/local.(*memorySeriesStorage).cycleThroughMemoryFingerprints
    /gopath/src/github.com/prometheus/prometheus/storage/local/storage.go:819 +0x5d

goroutine 121 [select]:
github.com/prometheus/prometheus/storage/local.(*memorySeriesStorage).waitForNextFP(0xc8200d43c0, 0x0, 0x3ff0000000000000, 0x0)
    /gopath/src/github.com/prometheus/prometheus/storage/local/storage.go:769 +0x19a
github.com/prometheus/prometheus/storage/local.(*memorySeriesStorage).cycleThroughArchivedFingerprints.func1(0xc8204c8360, 0xc8200d43c0)
    /gopath/src/github.com/prometheus/prometheus/storage/local/storage.go:842 +0x2bc
created by github.com/prometheus/prometheus/storage/local.(*memorySeriesStorage).cycleThroughArchivedFingerprints
    /gopath/src/github.com/prometheus/prometheus/storage/local/storage.go:862 +0x5d

goroutine 137 [chan receive]:
github.com/prometheus/prometheus/notification.(*NotificationHandler).Run(0xc8200efd60)
    /gopath/src/github.com/prometheus/prometheus/notification/notification.go:208 +0xa3
created by main.Main
    /gopath/src/github.com/prometheus/prometheus/cmd/prometheus/main.go:161 +0x12eb

goroutine 139 [IO wait]:
net.runtime_pollWait(0x7f34c8cc77c0, 0x72, 0xc820010150)
    /usr/local/go/src/runtime/netpoll.go:157 +0x60
net.(*pollDesc).Wait(0xc820196ae0, 0x72, 0x0, 0x0)
    /usr/local/go/src/net/fd_poll_runtime.go:73 +0x3a
net.(*pollDesc).WaitRead(0xc820196ae0, 0x0, 0x0)
    /usr/local/go/src/net/fd_poll_runtime.go:78 +0x36
net.(*netFD).accept(0xc820196a80, 0x0, 0x7f34c8cc78b8, 0xc8204c65a0)
    /usr/local/go/src/net/fd_unix.go:408 +0x27c
net.(*TCPListener).AcceptTCP(0xc820024030, 0xc8215c1d68, 0x0, 0x0)
    /usr/local/go/src/net/tcpsock_posix.go:254 +0x4d
net/http.tcpKeepAliveListener.Accept(0xc820024030, 0x0, 0x0, 0x0, 0x0)
    /usr/local/go/src/net/http/server.go:2135 +0x41
net/http.(*Server).Serve(0xc8204009c0, 0x7f34c8cc7880, 0xc820024030, 0x0, 0x0)
    /usr/local/go/src/net/http/server.go:1887 +0xb3
net/http.(*Server).ListenAndServe(0xc8204009c0, 0x0, 0x0)
    /usr/local/go/src/net/http/server.go:1877 +0x136
net/http.ListenAndServe(0xdddbc8, 0x5, 0x7f34c8cc67e8, 0xc8201980e0, 0x0, 0x0)
    /usr/local/go/src/net/http/server.go:1967 +0x8f
github.com/prometheus/prometheus/web.(*Handler).Run(0xc82008ec00)
    /gopath/src/github.com/prometheus/prometheus/web/web.go:221 +0x144
created by main.Main
    /gopath/src/github.com/prometheus/prometheus/cmd/prometheus/main.go:169 +0x13b6

goroutine 142 [select]:
github.com/prometheus/prometheus/retrieval.(*TargetManager).Run.func2(0xc820401f20, 0xc820401ec0, 0xc8204be000, 0xc8204c8480)
    /gopath/src/github.com/prometheus/prometheus/retrieval/targetmanager.go:141 +0x210
created by github.com/prometheus/prometheus/retrieval.(*TargetManager).Run
    /gopath/src/github.com/prometheus/prometheus/retrieval/targetmanager.go:148 +0x4f2

goroutine 140 [IO wait]:
net.runtime_pollWait(0x7f34c8cc7700, 0x72, 0xc820010150)
    /usr/local/go/src/runtime/netpoll.go:157 +0x60
net.(*pollDesc).Wait(0xc820100840, 0x72, 0x0, 0x0)
    /usr/local/go/src/net/fd_poll_runtime.go:73 +0x3a
net.(*pollDesc).WaitRead(0xc820100840, 0x0, 0x0)
    /usr/local/go/src/net/fd_poll_runtime.go:78 +0x36
net.(*netFD).Read(0xc8201007e0, 0xc8215de000, 0x2000, 0x2000, 0x0, 0x7f34cb2ea050, 0xc820010150)
    /usr/local/go/src/net/fd_unix.go:232 +0x23a
net.(*conn).Read(0xc8204d20c0, 0xc8215de000, 0x2000, 0x2000, 0x0, 0x0, 0x0)
    /usr/local/go/src/net/net.go:172 +0xe4
crypto/tls.(*block).readFromUntil(0xc820367e30, 0x7f34cb2f6218, 0xc8204d20c0, 0x5, 0x0, 0x0)
    /usr/local/go/src/crypto/tls/conn.go:455 +0xcc
crypto/tls.(*Conn).readRecord(0xc82007e000, 0xfb1f17, 0x0, 0x0)
    /usr/local/go/src/crypto/tls/conn.go:540 +0x2d1
crypto/tls.(*Conn).Read(0xc82007e000, 0xc8200e0000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
    /usr/local/go/src/crypto/tls/conn.go:901 +0x167
net/http.noteEOFReader.Read(0x7f34c8cc7d70, 0xc82007e000, 0xc82009ad68, 0xc8200e0000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
    /usr/local/go/src/net/http/transport.go:1370 +0x67
net/http.(*noteEOFReader).Read(0xc8201982c0, 0xc8200e0000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
    <autogenerated>:126 +0xd0
bufio.(*Reader).fill(0xc820401080)
    /usr/local/go/src/bufio/bufio.go:97 +0x1e9
bufio.(*Reader).Peek(0xc820401080, 0x1, 0x0, 0x0, 0x0, 0x0, 0x0)
    /usr/local/go/src/bufio/bufio.go:132 +0xcc
net/http.(*persistConn).readLoop(0xc82009ad10)
    /usr/local/go/src/net/http/transport.go:876 +0xf7
created by net/http.(*Transport).dialConn
    /usr/local/go/src/net/http/transport.go:685 +0xc78

goroutine 143 [chan receive]:
github.com/prometheus/prometheus/retrieval.merge.func1(0xc820401f20)
    /gopath/src/github.com/prometheus/prometheus/retrieval/targetmanager.go:83 +0xc6
created by github.com/prometheus/prometheus/retrieval.merge
    /gopath/src/github.com/prometheus/prometheus/retrieval/targetmanager.go:94 +0x151

goroutine 141 [select]:
net/http.(*persistConn).writeLoop(0xc82009ad10)
    /usr/local/go/src/net/http/transport.go:1009 +0x40c
created by net/http.(*Transport).dialConn
    /usr/local/go/src/net/http/transport.go:686 +0xc9d

goroutine 144 [semacquire]:
sync.runtime_Semacquire(0xc82160f1dc)
    /usr/local/go/src/runtime/sema.go:43 +0x26
sync.(*WaitGroup).Wait(0xc82160f1d0)
    /usr/local/go/src/sync/waitgroup.go:126 +0xb4
github.com/prometheus/prometheus/retrieval.merge.func2(0xc82160f1d0, 0xc820401f80)
    /gopath/src/github.com/prometheus/prometheus/retrieval/targetmanager.go:99 +0x21
created by github.com/prometheus/prometheus/retrieval.merge
    /gopath/src/github.com/prometheus/prometheus/retrieval/targetmanager.go:101 +0x198

goroutine 145 [select]:
github.com/prometheus/prometheus/retrieval.(*TargetManager).handleUpdates(0xc820013b30, 0xc820401f80, 0xc8204c8480)
    /gopath/src/github.com/prometheus/prometheus/retrieval/targetmanager.go:174 +0x36f
created by github.com/prometheus/prometheus/retrieval.(*TargetManager).Run
    /gopath/src/github.com/prometheus/prometheus/retrieval/targetmanager.go:154 +0x5ac

goroutine 146 [runnable]:
github.com/prometheus/prometheus/retrieval.(*prefixedTargetProvider).Run(0xc8200ffc00, 0xc820401ec0, 0xc8204c8480)
    /gopath/src/github.com/prometheus/prometheus/retrieval/targetmanager.go:386 +0x2b8
created by github.com/prometheus/prometheus/retrieval.(*TargetManager).Run.func1
    /gopath/src/github.com/prometheus/prometheus/retrieval/targetmanager.go:132 +0x51

brian-brazil added the bug label Mar 1, 2016


jimmidyson commented Mar 1, 2016

Looking at the code, this indicates that your nodes don't have any address values. Can you share the output of kubectl get nodes -oyaml?

We should also check this & log it rather than panic (obviously).
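The fix being suggested could look something like the following minimal sketch. This is not the actual Prometheus discovery code; the `NodeAddress` type and `firstAddress` helper are hypothetical stand-ins illustrating the guard that `updateNodesTargetGroup` appears to be missing when a node's `status.addresses` list is empty:

```go
package main

import (
	"errors"
	"fmt"
)

// NodeAddress is a hypothetical stand-in for the Kubernetes API's
// node address entries (type + address pairs under status.addresses).
type NodeAddress struct {
	Type    string
	Address string
}

// firstAddress checks for an empty address list before indexing,
// rather than reading addresses[0] unconditionally, which is the
// kind of access that panics with "index out of range" when a node
// (here, the unschedulable master) reports no addresses.
func firstAddress(addresses []NodeAddress) (string, error) {
	if len(addresses) == 0 {
		return "", errors.New("node has no addresses")
	}
	return addresses[0].Address, nil
}

func main() {
	// A worker node with addresses resolves normally.
	addr, err := firstAddress([]NodeAddress{{Type: "InternalIP", Address: "10.0.3.112"}})
	fmt.Println(addr, err)

	// A node with an empty list is logged and skipped instead of panicking.
	if _, err := firstAddress(nil); err != nil {
		fmt.Println("skipping node:", err)
	}
}
```

With a check like this, discovery would skip (and log) the address-less master node and keep running against the remaining nodes.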


pdbogen commented Mar 1, 2016

Here's the (redacted out of an extreme abundance of caution) output of kubectl get nodes -oyaml. All of my nodes except the master node do have an "address" as part of the status:

apiVersion: v1
items:
- apiVersion: v1
  kind: Node
  metadata:
    creationTimestamp: 2016-01-29T20:30:35Z
    labels:
      kubernetes.io/hostname: k8s-worker-5
    name: ip-10-0-3-112.ec2.internal
    resourceVersion: "8511990"
    selfLink: /api/v1/nodes/ip-10-0-3-112.ec2.internal
    uid: REDACTED
  spec:
    externalID: REDACTED
    providerID: REDACTED
  status:
    addresses:
    - address: 10.0.3.112
      type: InternalIP
    - address: 10.0.3.112
      type: LegacyHostIP
    capacity:
      cpu: "2"
      memory: 15669404Ki
      pods: "40"
    conditions:
    - lastHeartbeatTime: 2016-03-01T18:51:05Z
      lastTransitionTime: 2016-02-16T20:23:49Z
      message: kubelet has sufficient disk space available
      reason: KubeletHasSufficientDisk
      status: "False"
      type: OutOfDisk
    - lastHeartbeatTime: 2016-03-01T18:51:05Z
      lastTransitionTime: 2016-03-01T00:00:09Z
      message: kubelet is posting ready status
      reason: KubeletReady
      status: "True"
      type: Ready
    daemonEndpoints:
      kubeletEndpoint:
        Port: 10250
    nodeInfo:
      bootID: REDACTED
      containerRuntimeVersion: docker://1.9.1
      kernelVersion: 4.3.3-coreos-r1
      kubeProxyVersion: v1.1.8
      kubeletVersion: v1.1.8
      machineID: REDACTED
      osImage: CoreOS 921.0.0
      systemUUID: REDACTED
- apiVersion: v1
  kind: Node
  metadata:
    creationTimestamp: 2016-01-16T06:07:58Z
    labels:
      kubernetes.io/hostname: k8s-worker-3
    name: ip-10-0-3-125.ec2.internal
    resourceVersion: "8511987"
    selfLink: /api/v1/nodes/ip-10-0-3-125.ec2.internal
    uid: REDACTED
  spec:
    externalID: REDACTED
    providerID: REDACTED
  status:
    addresses:
    - address: 10.0.3.125
      type: InternalIP
    - address: 10.0.3.125
      type: LegacyHostIP
    capacity:
      cpu: "2"
      memory: 8178876Ki
      pods: "40"
    conditions:
    - lastHeartbeatTime: 2016-03-01T18:51:03Z
      lastTransitionTime: 2016-02-16T19:47:01Z
      message: kubelet has sufficient disk space available
      reason: KubeletHasSufficientDisk
      status: "False"
      type: OutOfDisk
    - lastHeartbeatTime: 2016-03-01T18:51:03Z
      lastTransitionTime: 2016-02-16T19:42:34Z
      message: kubelet is posting ready status
      reason: KubeletReady
      status: "True"
      type: Ready
    daemonEndpoints:
      kubeletEndpoint:
        Port: 10250
    nodeInfo:
      bootID: REDACTED
      containerRuntimeVersion: docker://1.9.1
      kernelVersion: 4.3.3-coreos-r1
      kubeProxyVersion: v1.1.8
      kubeletVersion: v1.1.8
      machineID: REDACTED
      osImage: CoreOS 921.0.0
      systemUUID: REDACTED
- apiVersion: v1
  kind: Node
  metadata:
    creationTimestamp: 2016-01-20T01:35:32Z
    labels:
      kubernetes.io/hostname: k8s-worker-4
    name: ip-10-0-3-130.ec2.internal
    resourceVersion: "8511991"
    selfLink: /api/v1/nodes/ip-10-0-3-130.ec2.internal
    uid: REDACTED
  spec:
    externalID: REDACTED
    providerID: REDACTED
  status:
    addresses:
    - address: 10.0.3.130
      type: InternalIP
    - address: 10.0.3.130
      type: LegacyHostIP
    capacity:
      cpu: "2"
      memory: 7662784Ki
      pods: "40"
    conditions:
    - lastHeartbeatTime: 2016-03-01T18:51:05Z
      lastTransitionTime: 2016-01-28T01:47:18Z
      message: kubelet has sufficient disk space available
      reason: KubeletHasSufficientDisk
      status: "False"
      type: OutOfDisk
    - lastHeartbeatTime: 2016-03-01T18:51:05Z
      lastTransitionTime: 2016-02-16T19:42:41Z
      message: kubelet is posting ready status
      reason: KubeletReady
      status: "True"
      type: Ready
    daemonEndpoints:
      kubeletEndpoint:
        Port: 10250
    nodeInfo:
      bootID: REDACTED
      containerRuntimeVersion: docker://1.9.1
      kernelVersion: 4.3.3-coreos-r1
      kubeProxyVersion: v1.1.8
      kubeletVersion: v1.1.8
      machineID: REDACTED
      osImage: CoreOS 921.0.0
      systemUUID: REDACTED
- apiVersion: v1
  kind: Node
  metadata:
    creationTimestamp: 2016-01-16T00:13:16Z
    labels:
      kubernetes.io/hostname: k8s-worker-2
    name: ip-10-0-3-183.ec2.internal
    resourceVersion: "8511983"
    selfLink: /api/v1/nodes/ip-10-0-3-183.ec2.internal
    uid: REDACTED
  spec:
    externalID: REDACTED
    providerID: REDACTED
  status:
    addresses:
    - address: 10.0.3.183
      type: InternalIP
    - address: 10.0.3.183
      type: LegacyHostIP
    capacity:
      cpu: "1"
      memory: 3858100Ki
      pods: "40"
    conditions:
    - lastHeartbeatTime: 2016-03-01T18:50:58Z
      lastTransitionTime: 2016-02-16T19:46:52Z
      message: kubelet has sufficient disk space available
      reason: KubeletHasSufficientDisk
      status: "False"
      type: OutOfDisk
    - lastHeartbeatTime: 2016-03-01T18:50:58Z
      lastTransitionTime: 2016-02-27T00:13:56Z
      message: kubelet is posting ready status
      reason: KubeletReady
      status: "True"
      type: Ready
    daemonEndpoints:
      kubeletEndpoint:
        Port: 10250
    nodeInfo:
      bootID: REDACTED
      containerRuntimeVersion: docker://1.9.1
      kernelVersion: 4.3.3-coreos-r1
      kubeProxyVersion: v1.1.8
      kubeletVersion: v1.1.8
      machineID: REDACTED
      osImage: CoreOS 921.0.0
      systemUUID: REDACTED
- apiVersion: v1
  kind: Node
  metadata:
    creationTimestamp: 2016-01-15T05:37:06Z
    labels:
      kubernetes.io/hostname: k8s-worker-1
    name: ip-10-0-3-241.ec2.internal
    resourceVersion: "8511984"
    selfLink: /api/v1/nodes/ip-10-0-3-241.ec2.internal
    uid: REDACTED
  spec:
    externalID: REDACTED
    providerID: REDACTED
  status:
    addresses:
    - address: 10.0.3.241
      type: InternalIP
    - address: 10.0.3.241
      type: LegacyHostIP
    capacity:
      cpu: "2"
      memory: 8178876Ki
      pods: "40"
    conditions:
    - lastHeartbeatTime: 2016-03-01T18:50:59Z
      lastTransitionTime: 2016-01-27T22:57:40Z
      message: kubelet has sufficient disk space available
      reason: KubeletHasSufficientDisk
      status: "False"
      type: OutOfDisk
    - lastHeartbeatTime: 2016-03-01T18:50:59Z
      lastTransitionTime: 2016-02-18T17:21:11Z
      message: kubelet is posting ready status
      reason: KubeletReady
      status: "True"
      type: Ready
    daemonEndpoints:
      kubeletEndpoint:
        Port: 10250
    nodeInfo:
      bootID: REDACTED
      containerRuntimeVersion: docker://1.9.1
      kernelVersion: 4.3.3-coreos-r1
      kubeProxyVersion: v1.1.8
      kubeletVersion: v1.1.8
      machineID: REDACTED
      osImage: CoreOS 921.0.0
      systemUUID: REDACTED
- apiVersion: v1
  kind: Node
  metadata:
    creationTimestamp: 2016-02-16T19:37:36Z
    labels:
      kubernetes.io/controller: "true"
      kubernetes.io/hostname: 10.0.3.6
    name: ip-10-0-3-6.ec2.internal
    resourceVersion: "8511988"
    selfLink: /api/v1/nodes/ip-10-0-3-6.ec2.internal
    uid: REDACTED
  spec:
    externalID: REDACTED
    unschedulable: true
  status:
    capacity:
      cpu: "2"
      memory: 7662684Ki
      pods: "40"
    conditions:
    - lastHeartbeatTime: 2016-03-01T18:51:03Z
      lastTransitionTime: 2016-02-16T19:37:44Z
      message: kubelet has sufficient disk space available
      reason: KubeletHasSufficientDisk
      status: "False"
      type: OutOfDisk
    - lastHeartbeatTime: 2016-03-01T18:51:03Z
      lastTransitionTime: 2016-02-16T19:37:44Z
      message: kubelet is posting ready status
      reason: KubeletReady
      status: "True"
      type: Ready
    daemonEndpoints:
      kubeletEndpoint:
        Port: 10250
    nodeInfo:
      bootID: REDACTED
      containerRuntimeVersion: docker://1.10.0
      kernelVersion: 4.4.1-coreos
      kubeProxyVersion: v1.1.8
      kubeletVersion: v1.1.8
      machineID: REDACTED
      osImage: CoreOS 955.0.0
      systemUUID: REDACTED
kind: List
metadata: {}

lock bot commented Mar 24, 2019

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

lock bot locked and limited conversation to collaborators Mar 24, 2019
