
x/tools/gopls: memory usage problems have returned in gopls 0.5 #41890

Open · trapgate opened this issue Oct 9, 2020 · 4 comments

@trapgate commented Oct 9, 2020

What version of Go are you using (go version)?

$ go version
go version go1.15.2 linux/amd64

Does this issue reproduce with the latest release?

Yes

What operating system and processor architecture are you using (go env)?

go env Output
$ go env
GO111MODULE="off"
GOARCH="amd64"
GOBIN=""
GOCACHE="/home/geoff/.cache/go-build"
GOENV="/home/geoff/.config/go/env"
GOEXE=""
GOFLAGS="-mod=vendor"
GOHOSTARCH="amd64"
GOHOSTOS="linux"
GOINSECURE=""
GOMODCACHE="/home/geoff/do/cthulhu/docode/pkg/mod"
GONOPROXY="*.internal.digitalocean.com,github.com/digitalocean"
GONOSUMDB="*.internal.digitalocean.com,github.com/digitalocean"
GOOS="linux"
GOPATH="/home/geoff/do/cthulhu/docode"
GOPRIVATE="*.internal.digitalocean.com,github.com/digitalocean"
GOPROXY="https://proxy.golang.org,direct"
GOROOT="/home/geoff/go"
GOSUMDB="sum.golang.org"
GOTMPDIR=""
GOTOOLDIR="/home/geoff/go/pkg/tool/linux_amd64"
GCCGO="gccgo"
AR="ar"
CC="gcc"
CXX="g++"
CGO_ENABLED="1"
GOMOD=""
CGO_CFLAGS="-g -O2"
CGO_CPPFLAGS=""
CGO_CXXFLAGS="-g -O2"
CGO_FFLAGS="-g -O2"
CGO_LDFLAGS="-g -O2"
PKG_CONFIG="pkg-config"
GOGCCFLAGS="-fPIC -m64 -pthread -fmessage-length=0 -fdebug-prefix-map=/tmp/go-build528678853=/tmp/go-build -gno-record-gcc-switches"

What did you do?

Started vscode.

What did you expect to see?

With gopls 0.4.4, memory usage against our monorepo was much lower; I expected that to continue. With version 0.5, gopls is back to using 10GB of memory. This was previously reported in this issue.

What did you see instead?

Instead of using 800MB - 1GB of memory when running against our monorepo, gopls is back to using about 10GB.

gopls.197817-9GiB-nonames.zip

@gopherbot gopherbot added the gopls label Oct 9, 2020
@gopherbot gopherbot added this to the Unreleased milestone Oct 9, 2020
@stamblerre stamblerre changed the title x/Tools/gopls: Memory usage problems have returned in gopls 0.5 x/tools/gopls: memory usage problems have returned in gopls 0.5 Oct 9, 2020
@gopherbot gopherbot added the Tools label Oct 9, 2020
@stamblerre (Contributor) commented Oct 9, 2020

Is this with gopls/v0.5.1? Also, can you please share your gopls settings?

/cc @heschik

@stamblerre stamblerre modified the milestones: Unreleased, gopls/v0.5.2 Oct 9, 2020
@heschik (Contributor) commented Oct 9, 2020

This is troubling. It certainly looks like the return of the memory leak issues, but the mechanism must be different, since the cache is now structured completely differently. I'll try to find some time to re-run the memory tests and maybe add some diagnostic pages that will be more helpful.

In the meantime, it might be interesting to see a series of zip files, rather than just the biggest one. In particular I'm interested in whether the list of loaded packages is changing/growing over time.

@trapgate (Author) commented Oct 9, 2020

I first noticed it after upgrading to gopls 0.5.0, but I'm running 0.5.1 now and it's still happening. The increase isn't gradual, either - I just reloaded gopls, and within about 30 seconds the new process reached 7.1GB. Usage creeps up slowly after that, but it isn't unbounded - it was at 10GB after running for a couple of weeks.

For a gopls launched with -mode=stdio serve -rpc.trace --debug=localhost:6060, here's a set of zip files:
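(For anyone reproducing this: a minimal sketch of how those flags can be passed to gopls from VS Code, assuming the Go extension's "go.languageServerFlags" setting; the debug address matches the invocation above. VS Code's settings.json accepts comments.)

```json
// settings.json (VS Code) — hypothetical sketch; the extension adds
// "serve" itself, so only the extra flags are listed here.
{
    "go.languageServerFlags": [
        "-rpc.trace",
        "--debug=localhost:6060"
    ]
}
```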

gopls.241070-1GiB-nonames.zip
gopls.241070-2GiB-nonames.zip
gopls.241070-3GiB-nonames.zip
gopls.241070-4GiB-nonames.zip
gopls.241070-5GiB-nonames.zip
gopls.241070-6GiB-nonames.zip

@heschik (Contributor) commented Oct 13, 2020

I was able to spend a little time today running through the tests I used to guide the optimizations that landed in 0.4.4. They all look good, which is nice but also completely unhelpful. I'm not aware of any intended changes that would have increased memory usage.

The behavior you're describing sounds like increased memory usage during the initial workspace load (IWL), where we discover, type-check, and diagnose all the packages in the workspace. The zips you attached above are consistent with that. Given that my tests look the same, the most obvious explanation is that the IWL is finding and loading more packages than it used to for some reason. Comparing the contents of http://localhost:6060/cache/1 for 0.4.4 vs. 0.5.1 might show interesting results. Unfortunately, it's very hard to figure out what's going on from the redacted logs. Can you compare them yourself, or would you be willing to send some to me directly?
