
More issues in tests #1351

Closed
RichiH opened this Issue Jan 26, 2016 · 6 comments

RichiH (Member) commented Jan 26, 2016

Opening a new issue, as it's yet another problem with the tests.

I built @fabxc's branch a few times and just got:

>> formatting code
>> building binaries
 >   prometheus
 >   promtool
>> running tests
?       github.com/prometheus/prometheus/cmd/prometheus [no test files]
?       github.com/prometheus/prometheus/cmd/promtool   [no test files]
ok      github.com/prometheus/prometheus/config 0.068s
ok      github.com/prometheus/prometheus/notification   0.229s
ok      github.com/prometheus/prometheus/promql 3.331s
ok      github.com/prometheus/prometheus/retrieval  1.646s

SIGQUIT: quit
PC=0x45fd41 m=0

goroutine 0 [idle]:
runtime.futex(0xe7c1f0, 0x0, 0x0, 0x0, 0x0, 0xe7b800, 0x0, 0x0, 0x40e1d4, 0xe7c1f0, ...)
    /usr/lib/go/src/runtime/sys_linux_amd64.s:288 +0x21
runtime.futexsleep(0xe7c1f0, 0x0, 0xffffffffffffffff)
    /usr/lib/go/src/runtime/os1_linux.go:39 +0x53
runtime.notesleep(0xe7c1f0)
    /usr/lib/go/src/runtime/lock_futex.go:142 +0xa4
runtime.stopm()
    /usr/lib/go/src/runtime/proc1.go:1136 +0x112
runtime.findrunnable(0xc82001a000, 0x0)
    /usr/lib/go/src/runtime/proc1.go:1538 +0x69e
runtime.schedule()
    /usr/lib/go/src/runtime/proc1.go:1647 +0x267
runtime.goexit0(0xc8201d0300)
    /usr/lib/go/src/runtime/proc1.go:1773 +0x1a2
runtime.mcall(0x7ffe214019d0)
    /usr/lib/go/src/runtime/asm_amd64.s:204 +0x5b

goroutine 1 [chan receive]:
testing.RunTests(0xc1c558, 0xe77000, 0x7, 0x7, 0xc820017801)
    /usr/lib/go/src/testing/testing.go:562 +0x8ad
testing.(*M).Run(0xc8200b9ee8, 0xc1ea60)
    /usr/lib/go/src/testing/testing.go:494 +0x70
main.main()
    github.com/prometheus/prometheus/retrieval/discovery/_test/_testmain.go:66 +0x116

goroutine 17 [syscall, locked to thread]:
runtime.goexit()
    /usr/lib/go/src/runtime/asm_amd64.s:1721 +0x1

goroutine 39 [chan send]:
github.com/prometheus/prometheus/retrieval/discovery.(*MarathonDiscovery).updateServices(0xc820167f18, 0xc82017c420, 0x0, 0x0)
    /home/richih/work/go/src/github.com/prometheus/prometheus/retrieval/discovery/marathon.go:80 +0x118
github.com/prometheus/prometheus/retrieval/discovery.TestMarathonSDSendGroup(0xc820198360)
    /home/richih/work/go/src/github.com/prometheus/prometheus/retrieval/discovery/marathon_test.go:114 +0x234
testing.tRunner(0xc820198360, 0xe77048)
    /usr/lib/go/src/testing/testing.go:456 +0x98
created by testing.RunTests
    /usr/lib/go/src/testing/testing.go:561 +0x86d

rax    0xca
rbx    0x0
rcx    0x45fd43
rdx    0x0
rdi    0xe7c1f0
rsi    0x0
rbp    0x1
rsp    0x7ffe21401828
r8     0x0
r9     0x0
r10    0x0
r11    0x286
r12    0x3
r13    0xc18c8c
r14    0x55555555555555
r15    0x38
rip    0x45fd41
rflags 0x286
cs     0x33
fs     0x0
gs     0x0
*** Test killed with quit: ran too long (10m0s).
FAIL    github.com/prometheus/prometheus/retrieval/discovery    600.013s
?       github.com/prometheus/prometheus/retrieval/discovery/kubernetes [no test files]
?       github.com/prometheus/prometheus/retrieval/discovery/marathon   [no test files]
ok      github.com/prometheus/prometheus/rules  0.185s
?       github.com/prometheus/prometheus/storage    [no test files]
ok      github.com/prometheus/prometheus/storage/local  27.385s
ok      github.com/prometheus/prometheus/storage/local/codable  0.012s
?       github.com/prometheus/prometheus/storage/local/index    [no test files]
ok      github.com/prometheus/prometheus/storage/metric 0.011s
ok      github.com/prometheus/prometheus/storage/remote 0.059s
ok      github.com/prometheus/prometheus/storage/remote/graphite    0.015s
ok      github.com/prometheus/prometheus/storage/remote/influxdb    0.024s
ok      github.com/prometheus/prometheus/storage/remote/opentsdb    0.054s
ok      github.com/prometheus/prometheus/template   0.119s
?       github.com/prometheus/prometheus/util/cli   [no test files]
ok      github.com/prometheus/prometheus/util/flock 0.056s
?       github.com/prometheus/prometheus/util/httputil  [no test files]
?       github.com/prometheus/prometheus/util/stats [no test files]
ok      github.com/prometheus/prometheus/util/strutil   0.011s
?       github.com/prometheus/prometheus/util/testutil  [no test files]
?       github.com/prometheus/prometheus/util/treecache [no test files]
?       github.com/prometheus/prometheus/version    [no test files]
ok      github.com/prometheus/prometheus/web    0.034s
ok      github.com/prometheus/prometheus/web/api/legacy 0.116s
ok      github.com/prometheus/prometheus/web/api/v1 0.093s
?       github.com/prometheus/prometheus/web/ui [no test files]
Makefile:29: recipe for target 'test' failed
make: *** [test] Error 1

beorn7 added the bug label Feb 2, 2016

beorn7 (Member) commented Mar 10, 2016

The code that fails here has been touched quite a lot recently.

@RichiH Is this still happening?

RichiH (Member, Author) commented Mar 11, 2016

I am having issues with my local hardware; I hope to get those fixed over the weekend...

fabxc (Member) commented Apr 6, 2016

This deadlock seems to be a consequence of an issue several SD implementations have been suffering from for a while now: they send on channels whose consumer may disappear at any time. They must detect that by simultaneously listening on <-ctx.Done() (from the context package; formerly simply <-done) everywhere they send.

In a regularly running Prometheus instance this "just" leaks a very few goroutines, and it happens rarely, so the impact has been limited so far.

fabxc added kind/bug and removed bug labels Apr 28, 2016

brian-brazil (Member) commented Jul 8, 2016

@fabxc I believe you've resolved these?

fabxc (Member) commented Jul 8, 2016

Yup

fabxc closed this Jul 8, 2016

lock bot commented Mar 24, 2019

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.

lock bot locked and limited conversation to collaborators Mar 24, 2019
