
net/http: http.Transport leaks context from the request #50798

Open
juliens opened this issue Jan 25, 2022 · 7 comments · May be fixed by #50799
Labels
NeedsInvestigation Someone must examine and confirm this is a valid issue and not a duplicate of an existing one.
Milestone

Comments

@juliens
Contributor

juliens commented Jan 25, 2022

What version of Go are you using (go version)?

$ go version
go version go1.17 linux/amd64

Does this issue reproduce with the latest release?

Yes

What operating system and processor architecture are you using (go env)?

go env Output
$ go env
GO111MODULE="on"
GOARCH="amd64"
GOBIN=""
GOCACHE="/home/juliens/.cache/go-build"
GOENV="/home/juliens/.config/go/env"
GOEXE=""
GOEXPERIMENT=""
GOFLAGS=""
GOHOSTARCH="amd64"
GOHOSTOS="linux"
GOINSECURE=""
GOMODCACHE="/home/juliens/dev/go/pkg/mod"
GONOPROXY=""
GONOSUMDB=""
GOOS="linux"
GOPATH="/home/juliens/dev/go"
GOPRIVATE=""
GOPROXY="https://proxy.golang.org,direct"
GOROOT="/home/juliens/.gvm/gos/go1.17"
GOSUMDB="sum.golang.org"
GOTMPDIR=""
GOTOOLDIR="/home/juliens/.gvm/gos/go1.17/pkg/tool/linux_amd64"
GOVCS=""
GOVERSION="go1.17"
GCCGO="gccgo"
AR="ar"
CC="gcc"
CXX="g++"
CGO_ENABLED="1"
GOMOD="/dev/null"
CGO_CFLAGS="-g -O2"
CGO_CPPFLAGS=""
CGO_CXXFLAGS="-g -O2"
CGO_FFLAGS="-g -O2"
CGO_LDFLAGS="-g -O2"
PKG_CONFIG="pkg-config"
GOGCCFLAGS="-fPIC -m64 -pthread -fmessage-length=0 -fdebug-prefix-map=/tmp/go-build3799944390=/tmp/go-build -gno-record-gcc-switches"

What did you do?

I use http.Transport to make an HTTP request, and I store a non-empty struct in the request context.

You can reproduce it with a small program that puts a large value in the request context, performs a request, and then checks whether that value is released after the request ends.

What did you expect to see?

After the request finishes, the struct and the context are collected by the GC.

What did you see instead?

After the request finishes, the struct and the context are not collected by the GC: the transport keeps them reachable.

Maybe this is the same issue as #43966

@juliens juliens linked a pull request Jan 25, 2022 that will close this issue
@gopherbot

Change https://golang.org/cl/380674 mentions this issue: net/http: fix memory leak in http.Transport

@mknyszek
Contributor

CC @neild

@mknyszek mknyszek added the NeedsInvestigation Someone must examine and confirm this is a valid issue and not a duplicate of an existing one. label Jan 25, 2022
@mknyszek mknyszek added this to the Backlog milestone Jan 25, 2022
@netrebel

We seem to be having the same problem. In long-running instances, memory utilization climbs every day, reaching 80-90% after roughly 12 days. We started seeing this when we upgraded from Go 1.11.3 (no modules) to Go 1.17 (with Go modules); before the upgrade, memory stayed under 30%. The servers have 4 GB of RAM.

When I run pprof, I see something similar to the comment in the referenced ticket:

      flat  flat%   sum%        cum   cum%
    1.50MB  5.07%  5.07%    12.04MB 40.70%  net/http.(*Transport).dialConn
         0     0%  5.07%    12.04MB 40.70%  net/http.(*Transport).dialConnFor
    8.03MB 27.14% 32.21%     8.03MB 27.14%  bufio.NewWriterSize (inline)
    6.02MB 20.36% 52.57%     6.02MB 20.36%  bufio.NewReaderSize (inline)

@luca147

luca147 commented Dec 16, 2022

Same issue here. Is there any news on this?

@netrebel

In case this helps someone, this other ticket was closed with a comment explaining how to avoid this issue: newrelic/go-agent#447 (comment)

@luca147

luca147 commented Dec 17, 2022

Well, I found the solution for my case. I have a global-scope context that is cancelled when a termination signal arrives, and all of my workers stop when it is cancelled; jobs were using that long-lived context directly. Now, every time a job starts, I create a new context with cancel and defer the cancel. With this simple change, the leak through bufio.NewWriterSize/bufio.NewReaderSize inside the transport disappeared.

@urenner-n2i

Any update on this issue?
When connecting to new hosts very frequently, memory consumption increases rapidly and never decreases.
