x/tools/go/ssa: requires tons of memory for testing #14113

Open
mikioh opened this Issue Jan 27, 2016 · 7 comments

mikioh commented Jan 27, 2016

For example, on the linux-amd64-nocgo builder (http://build.golang.org/log/00d8b3a0c6186837e4e3d83cc06a69797734f202):

ok      golang.org/x/tools/go/pointer   57.885s
fatal error: runtime: out of memory

runtime stack:
runtime.throw(0x85dd00, 0x16)
    /tmp/workdir/go/src/runtime/panic.go:527 +0x90
runtime.sysMap(0xc8628c0000, 0x100000, 0xbd100, 0xa14418)
    /tmp/workdir/go/src/runtime/mem_linux.go:203 +0x9b
runtime.mHeap_SysAlloc(0x9f4940, 0x100000, 0x0)
    /tmp/workdir/go/src/runtime/malloc.go:426 +0x160
runtime.mHeap_Grow(0x9f4940, 0x8, 0x0)
    /tmp/workdir/go/src/runtime/mheap.go:628 +0x63
runtime.mHeap_AllocSpanLocked(0x9f4940, 0x1, 0x7f0aeb59d390)
    /tmp/workdir/go/src/runtime/mheap.go:532 +0x5f1
runtime.mHeap_Alloc_m(0x9f4940, 0x1, 0x9, 0x7f0aeb59d390)
    /tmp/workdir/go/src/runtime/mheap.go:425 +0x1ac
runtime.mHeap_Alloc.func1()
    /tmp/workdir/go/src/runtime/mheap.go:484 +0x41
runtime.systemstack(0xc820167e50)
    /tmp/workdir/go/src/runtime/asm_amd64.s:278 +0xab
runtime.mHeap_Alloc(0x9f4940, 0x1, 0x10000000009, 0x40db44)
    /tmp/workdir/go/src/runtime/mheap.go:485 +0x63
runtime.mCentral_Grow(0x9fc778, 0x0)
    /tmp/workdir/go/src/runtime/mcentral.go:190 +0x93
runtime.mCentral_CacheSpan(0x9fc778, 0x7f0aeb59d390)
    /tmp/workdir/go/src/runtime/mcentral.go:86 +0x4d4
runtime.mCache_Refill(0x7f0aec5a3000, 0x9, 0x7f0aeb59d390)
    /tmp/workdir/go/src/runtime/mcache.go:118 +0xcf
runtime.mallocgc.func2()
    /tmp/workdir/go/src/runtime/malloc.go:614 +0x2b
runtime.systemstack(0xc820016000)
    /tmp/workdir/go/src/runtime/asm_amd64.s:262 +0x79
runtime.mstart()
    /tmp/workdir/go/src/runtime/proc1.go:668

(snip)

created by golang.org/x/tools/go/ssa.(*Program).Build
    /tmp/workdir/gopath/src/golang.org/x/tools/go/ssa/builder.go:2248 +0x12a
FAIL    golang.org/x/tools/go/ssa   28.128s

mikioh added this to the Unreleased milestone Jan 27, 2016

bradfitz commented Jan 27, 2016

/cc @alandonovan @griesemer

As I told Robert, these machines have 1.8GB of memory to do nothing but run all.bash, essentially.

griesemer commented Jan 27, 2016

The question, though, is: do these machines have less memory than the others? These tests run fine on the other platforms.

  • gri

bradfitz commented Jan 27, 2016

Nope. Same. https://github.com/golang/build/blob/master/dashboard/builders.go

The race builders are bigger, but the rest are the same.

bradfitz commented Jan 27, 2016

I can reproduce this on Linux on my development machine (not a builder), with no special setup.

Our tests should not require multiple gigabytes of memory.

I'm going to disable it.

@alandonovan, @griesemer, if these tests are important, put them on a diet. I don't know what they're doing, but it's too much.

bradfitz commented Jan 27, 2016

I saw it running two tests at once, one using 1.2GB and one using 700+MB. That's the limit of the machine. If the Linux builders have no swap, that explains their death. It's possible other machines are just swapping a bunch.

cespare commented Jan 28, 2016

The only heavyweight test is TestStdlib, which "runs the SSA builder in sanity-checking mode on all packages beneath $GOROOT". That test uses 1.3 GB when run on my Linux machine.

Perhaps that test should use a small subset of the stdlib packages by default.

Alternatively, the test could collect its statistics package by package rather than loading all packages at once, but it performs one global function-uniqueness check that would be incompatible with this approach.
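
A rough sketch of the package-at-a-time idea, using the x/tools go/loader and ssautil APIs (hypothetical, not the actual test code; the package list is only illustrative, and the global function-uniqueness check would indeed be lost):

    package main

    import (
        "fmt"
        "log"

        "golang.org/x/tools/go/loader"
        "golang.org/x/tools/go/ssa"
        "golang.org/x/tools/go/ssa/ssautil"
    )

    // buildOne loads and builds SSA for a single package (plus its
    // dependencies), so the result can be garbage-collected before the
    // next package is processed.
    func buildOne(path string) error {
        var conf loader.Config
        conf.Import(path)
        lprog, err := conf.Load()
        if err != nil {
            return err
        }
        // Sanity-checking mode, as TestStdlib uses.
        prog := ssautil.CreateProgram(lprog, ssa.SanityCheckFunctions)
        prog.Build()
        // Per-package statistics would be collected here, before the
        // whole SSA program goes out of scope.
        return nil
    }

    func main() {
        for _, path := range []string{"fmt", "net/http"} { // illustrative subset
            if err := buildOne(path); err != nil {
                log.Fatal(err)
            }
            fmt.Println("built", path)
        }
    }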

griesemer commented Jan 28, 2016

We should just have a -short version: in that case, the test would run over just, say, $GOROOT/src/go.

  • gri
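
A minimal sketch of such a -short guard (assuming the builders invoke go test with -short, as all.bash does for the standard library; restricting the walk to a subtree such as $GOROOT/src/go would be a variant of the same idea):

    func TestStdlib(t *testing.T) {
        if testing.Short() {
            // Building SSA for every package beneath $GOROOT takes over a
            // gigabyte (1.3 GB observed above); skip the heavy path here.
            t.Skip("skipping in short mode")
        }
        // ... full sanity-checked build over all packages beneath $GOROOT ...
    }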

gopherbot pushed a commit to golang/tools that referenced this issue Jan 28, 2016

all: skip slow tests in short mode
Updates golang/go#14113
Updates golang/go#11811

Change-Id: I61851de12ff474d3b738fc88f402742677973cae
Reviewed-on: https://go-review.googlesource.com/18992
Reviewed-by: Robert Griesemer <gri@golang.org>

alandonovan self-assigned this Nov 28, 2017
