x/tools: freebsd-amd64-race builder is failing consistently #36444

Open
dmitshur opened this issue Jan 7, 2020 · 4 comments

dmitshur (Member) commented Jan 7, 2020

The linux-amd64-race builder is currently configured with 15 GB of RAM (via the n1-standard-4 machine type). CL 213557 recently increased the RAM for the windows-amd64-race builder to 14.4 GB, to work around 7.2 GB being insufficient, as described at #35186 (comment).

The freebsd-amd64-race builder is currently configured to use the n1-highcpu-4 machine type, which has 3.6 GB RAM. The freebsd-amd64-race builder is failing quite often, which may be connected to insufficient memory:


signal: killed
FAIL	golang.org/x/tools/go/packages	217.792s
*** Test killed with quit: ran too long (10m0s).
FAIL	golang.org/x/tools/internal/imports	600.067s
panic: test timed out after 10m0s
[...]
FAIL	golang.org/x/tools/go/packages	611.592s
[...]
	package = golang.org/x/tools/internal/lsp/protocol
2020/01/07 18:44:29 Error:2020/01/07 18:44:29 no dep handle: no metadata for golang.org/x/xerrors
	package = golang.org/x/xerrors
2020/01/07 18:44:35 no dep handle: no metadata for golang.org/x/tools/internal/lsp/protocol
	package = golang.org/x/tools/internal/lsp/protocol
2020/01/07 18:44:35 Error:2020/01/07 18:44:35 no dep handle: no metadata for golang.org/x/tools/internal/lsp/protocol
	package = golang.org/x/tools/internal/lsp/protocol
2020/01/07 18:44:36 no dep handle: no metadata for golang.org/x/xerrors
	package = golang.org/x/xerrors
2020/01/07 18:44:36 Error:2020/01/07 18:44:36 no dep handle: no metadata for golang.org/x/xerrors
	package = golang.org/x/xerrors
2020/01/07 18:44:42 no dep handle: no metadata for golang.org/x/xerrors
	package = golang.org/x/xerrors
2020/01/07 18:44:43 Error:2020/01/07 18:44:42 no dep handle: no metadata for golang.org/x/xerrors
	package = golang.org/x/xerrors
--- FAIL: TestCommandLine (240.28s)
    --- FAIL: TestCommandLine/Modules (194.21s)
        --- FAIL: TestCommandLine/Modules/Diagnostics (159.73s)
            --- FAIL: TestCommandLine/Modules/Diagnostics/testy_test (151.73s)
                check.go:62: missing diagnostic "testy/testy_test.go:6:6: x declared but not used", map[]
FAIL
FAIL	golang.org/x/tools/internal/lsp/cmd	240.922s
2020/01/07 20:54:34 no dep handle: no metadata for nosuchpkg
	package = nosuchpkg
2020/01/07 20:54:51 no dep handle: no metadata for golang.org/x/tools/internal/lsp/protocol
	package = golang.org/x/tools/internal/lsp/protocol
2020/01/07 20:54:51 no dep handle: no metadata for golang.org/x/xerrors
	package = golang.org/x/xerrors
FAIL	golang.org/x/tools/internal/lsp/source	246.068s

If there aren't objections, I plan to bump its RAM up to a more similar size (without exceeding the current 15 GB used by both the Linux and Windows -race builders) to see if that helps with the test failures and issues like #34621.

/cc @bradfitz @bcmills @matloob @toothrot

gopherbot commented Jan 13, 2020

Change https://golang.org/cl/214433 mentions this issue: dashboard: upsize freebsd-amd64-race builder to 7.2 GB RAM

dmitshur self-assigned this Jan 13, 2020
gopherbot pushed a commit to golang/build that referenced this issue Jan 14, 2020
Start using n1-highcpu-8 machine type instead of n1-highcpu-4
for the freebsd-amd64-race builder.

The freebsd-amd64-race builder has produced good test results
for the x/tools repo for a long time, but by now it has started
to consistently fail for reasons that seem connected to it having
only 3.6 GB memory. The Windows race builders needed to be bumped
from 7.2 GB to 14.4 GB to run successfully, so this change makes
a small incremental step to bring freebsd-amd64-race closer in
line with other builders. If memory-related problems continue to
occur with 7.2 GB, the next step will be to go up to 14.4 GB.

The freebsd-amd64-race builder is using an older version of FreeBSD.
We may want to start using a newer one for running tests with -race,
but that should be a separate change so we can see the results of
this change without another confounding variable.

Also update all FreeBSD builders to use https in buildletURLTmpl,
because it's expected to work fine and will be more consistent.

Updates golang/go#36444
Updates golang/go#34621
Updates golang/go#29252
Updates golang/go#33986

Change-Id: Idfcefd1c91bddc9f70ab23e02fcdca54fda9d1ac
Reviewed-on: https://go-review.googlesource.com/c/build/+/214433
Run-TryBot: Carlos Amedee <carlos@golang.org>
TryBot-Result: Gobot Gobot <gobot@golang.org>
Reviewed-by: Carlos Amedee <carlos@golang.org>
dmitshur (Member, Author) commented Jan 14, 2020

CL 214433 was deployed this morning, and the freebsd-amd64-race builder has been passing on x/tools since then.

I'll give it more time to gather more data, and close this issue if the reliable build results continue.

dmitshur (Member, Author) commented Jan 16, 2020

The freebsd-amd64-race builder is no longer failing consistently on x/tools. However, there are still occasional failures. We need to see whether they are connected to limited memory, and if so, consider going up to 15 GB on the FreeBSD race builder to match the memory of the Linux and Windows race builders.

Edit: Many of the failures are due to data races being detected. Filed issue #36605.

bcmills (Member) commented Feb 20, 2020

The failures in https://build.golang.org/log/74eea9d200fd346319a2595c2f2b2ab3f25c2a07 appear to be due to timeouts rather than data races, although it is possible that a data race failure is being masked by failure to propagate logs somewhere higher up the stack.
