
Memory not being reclaimed by OS #41444

Open
Pungyeon opened this issue Sep 17, 2020 · 5 comments

Comments

Pungyeon commented Sep 17, 2020

What version of Go are you using (go version)?

We have reproduced this behaviour with Go 1.13, 1.14, and 1.15.

$ go version
go version go1.15.2 linux/amd64

Does this issue reproduce with the latest release?

Yes.

What operating system and processor architecture are you using (go env)?

We have tried several Linux distributions, mainly Ubuntu and Alpine; they all behave alike.

go env Output
$ go env
GO111MODULE=""
GOARCH="amd64"
GOBIN=""
GOCACHE="/root/.cache/go-build"
GOENV="/root/.config/go/env"
GOEXE=""
GOFLAGS=""
GOHOSTARCH="amd64"
GOHOSTOS="linux"
GOINSECURE=""
GOMODCACHE="/go/pkg/mod"
GONOPROXY=""
GONOSUMDB=""
GOOS="linux"
GOPATH="/go"
GOPRIVATE=""
GOPROXY="https://proxy.golang.org,direct"
GOROOT="/usr/local/go"
GOSUMDB="sum.golang.org"
GOTMPDIR=""
GOTOOLDIR="/usr/local/go/pkg/tool/linux_amd64"
GCCGO="gccgo"
AR="ar"
CC="gcc"
CXX="g++"
CGO_ENABLED="1"
GOMOD=""
CGO_CFLAGS="-g -O2"
CGO_CPPFLAGS=""
CGO_CXXFLAGS="-g -O2"
CGO_FFLAGS="-g -O2"
CGO_LDFLAGS="-g -O2"
PKG_CONFIG="pkg-config"
GOGCCFLAGS="-fPIC -m64 -pthread -fmessage-length=0 -fdebug-prefix-map=/tmp/go-build014469751=/tmp/go-build -gno-record-gcc-switches"

What did you do?

We have written an application for transforming VoIP traffic from the network, using the gopacket library. The application parses VoIP streams and decodes the associated RTP traffic. All data associated with ongoing calls is stored in memory, and all completed calls are stored on disk. However, when sending traffic through our application, we noticed that memory is never reclaimed by the OS. We initially thought this was a memory leak, but after extensive investigation we concluded that it is not.

We stumbled upon the following article, which described similar behaviour: https://blog.detectify.com/2019/09/05/how-we-tracked-down-a-memory-leak-in-one-of-our-go-microservices/

We tried implementing a similar fix, and it actually seemed to work. Using the environment variable GODEBUG=madvdontneed=1 together with calling debug.FreeOSMemory enabled the OS to reclaim the memory, avoiding an eventual OOM error.
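For readers who want to try the same workaround, a minimal sketch of the pattern (the buffer size and function name are illustrative, not the reporter's actual code): drop the references to a large transient working set, then call debug.FreeOSMemory, having started the process with GODEBUG=madvdontneed=1 so freed pages are returned with MADV_DONTNEED and the drop is visible in RSS immediately.

```go
package main

import (
	"fmt"
	"runtime"
	"runtime/debug"
)

// releaseAfterBurst allocates a large transient buffer (standing in for a
// completed call session), drops it, and returns the number of bytes the
// runtime has handed back to the OS afterwards.
func releaseAfterBurst() uint64 {
	buf := make([]byte, 256<<20) // ~256 MB transient working set
	for i := 0; i < len(buf); i += 4096 {
		buf[i] = 1 // touch every page so it is actually resident
	}
	buf = nil // drop the reference so the GC can reclaim it

	// Force a collection and return freed spans to the OS. Combined with
	// GODEBUG=madvdontneed=1 on Linux, the release shows up in RSS at once
	// rather than lazily (as with the default MADV_FREE on newer kernels).
	debug.FreeOSMemory()

	var ms runtime.MemStats
	runtime.ReadMemStats(&ms)
	return ms.HeapReleased
}

func main() {
	fmt.Printf("HeapReleased after burst: %d MB\n", releaseAfterBurst()>>20)
}
```

Note that debug.FreeOSMemory is a blunt instrument (it forces a full GC); the madvdontneed setting alone only changes how, not when, the runtime returns memory.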

Unfortunately, I cannot share the code at this time, since it is still proprietary software 😢 But I may be able to share a video demonstration of the exact issue, if that would help in any way.

What did you expect to see?

Once a call session is over, all data is cleared from memory. We expected to see the OS reclaiming this memory.

What did you see instead?

The OS never reclaims this memory and we see a steady increase in memory usage of our application (without the fix described above).

davecheney (Contributor) commented Sep 17, 2020

The OS never reclaims this memory and we see a steady increase in memory usage of our application (without the fix described above).

Can you run your application with GODEBUG=gctrace=1 up to the point where it is killed by the operating system, and paste the complete output in this issue? Thank you.

Pungyeon (Author) commented Sep 17, 2020

Thank you so much @davecheney for the extremely quick response. I will return with this information as soon as possible 🙏

Pungyeon (Author) commented Sep 18, 2020

Here is a link with two different traces, I hope the names are sufficiently descriptive :) https://lmjcorti-my.sharepoint.com/:f:/g/personal/infrastructure_corti_ai/EtXO0HhmtIdBhrusGcjdkhQB4INvplcJyR50eR9NGP9iyg?e=KNPC68

I ran this in the official golang:1.15 Docker container, with 2 GB of memory allocated. Since the leak is rather minor, I didn't wait for an OOM, but I hope the logs are still useful. If not, let me know and I will rerun the tests and produce new output.

Thank you again.

davecheney (Contributor) commented Sep 18, 2020

Thank you for this data. Can I ask you to use go build rather than go run? The trace produced under go run is conflated with the trace for your application.

davecheney (Contributor) commented Sep 18, 2020

Having said that, it looks like your application's heap is around 25 MB, based on this line:

gc 11 @600.379s 0%: 0.062+13+0.025 ms clock, 0.12+0/6.4/14+0.051 ms cpu, 13->13->12 MB, 25 MB goal, 2

Does this match with your measurements?
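As context for comparing a gctrace line with OS-level measurements, a small helper (the function name is hypothetical) that prints the runtime's own heap figures: in the trace above, "13->13->12 MB" is the heap size at GC start, at GC end, and the live heap, while "25 MB goal" corresponds to NextGC in runtime.MemStats.

```go
package main

import (
	"fmt"
	"runtime"
)

// printHeapFigures reports the runtime's view of the heap so it can be
// compared against the gctrace output and against RSS as seen by the OS.
func printHeapFigures() runtime.MemStats {
	var ms runtime.MemStats
	runtime.ReadMemStats(&ms)
	fmt.Printf("HeapAlloc:    %d MB (live and not-yet-swept objects)\n", ms.HeapAlloc>>20)
	fmt.Printf("NextGC:       %d MB (the 'goal' in gctrace)\n", ms.NextGC>>20)
	fmt.Printf("HeapSys:      %d MB (heap memory obtained from the OS)\n", ms.HeapSys>>20)
	fmt.Printf("HeapReleased: %d MB (of which returned to the OS)\n", ms.HeapReleased>>20)
	return ms
}

func main() {
	printHeapFigures()
}
```

HeapSys minus HeapReleased is roughly what the process still holds from the OS, which is why RSS can sit well above the live heap a gctrace line reports.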
