
Memory leak in v2.6.2 #5158

Closed · ChrisLahaye opened this issue Oct 19, 2022 · 5 comments · Fixed by #5159
Labels: bug 🐞 Something isn't working
Milestone: v2.6.3

Comments

@ChrisLahaye (Contributor)

Hello,

We are observing a memory leak after upgrading from v2.6.1 to v2.6.2.

[attached screenshot "leak": memory usage graph over time]

The moment the memory issue starts occurring coincides with the release of v2.6.2. The drops in memory are caused by the container getting terminated after reaching the maximum allowed memory.

Dockerfile

FROM caddy:builder AS builder

RUN xcaddy build v2.6.2 \
    --with github.com/greenpau/caddy-security@v1.1.15

FROM caddy:latest

COPY --from=builder /usr/bin/caddy /usr/bin/caddy

Patch

@@ -1,6 +1,6 @@
 FROM caddy:builder AS builder

-RUN xcaddy build v2.6.1 \
+RUN xcaddy build v2.6.2 \
     --with github.com/greenpau/caddy-security@v1.1.15

 FROM caddy:latest

Initial config

{
  "admin": {
    "config": {
      "load_delay": "1s",
      "load": {
        "module": "http",
        "url": "http://caddy-config.caddy:4000/config"
      }
    }
  }
}

External config

{
  "admin": {
    "config": {
      "load_delay": "60s",
      "load": {
        "module": "http",
        "url": "http://caddy-config.caddy:4000/config"
      }
    }
  },
  "apps": {
    ...
  }
}

Logs

Logs don't show anything weird. There are only info logs about the configuration not changing (correct behavior).

v2.6.1: log-before (attached)

v2.6.2: log-after (attached)

Please let me know if you need any additional information,

Thank you,

Chris

@mholt (Member) commented Oct 19, 2022

Thanks, that's very interesting.

Can you please provide the full configs (unmodified, except that you may redact private keys or passwords) as well as a CPU and memory profile:

Go to localhost:2019/debug/pprof on the server and start by collecting a goroutine dump. Then collect a CPU profile and a memory/heap profile; they can take about 30 seconds to capture, and a file should download after that delay. Then please upload them here along with the goroutine dump. Try to do this near the peak of memory use (or at least not right after a restart).
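
For reference, a minimal Go sketch (not an official tool) of grabbing those profiles from the admin endpoint and saving them for upload; it assumes the default admin address localhost:2019 and the standard pprof paths, and the output filenames are arbitrary:

// Minimal sketch, assuming the default admin address localhost:2019:
// fetch the goroutine dump, a ~30s CPU profile, and a heap profile and
// save each one to a file for uploading to this issue.
package main

import (
	"io"
	"log"
	"net/http"
	"os"
)

func save(url, file string) {
	resp, err := http.Get(url)
	if err != nil {
		log.Fatalf("fetching %s: %v", url, err)
	}
	defer resp.Body.Close()

	out, err := os.Create(file)
	if err != nil {
		log.Fatalf("creating %s: %v", file, err)
	}
	defer out.Close()

	if _, err := io.Copy(out, resp.Body); err != nil {
		log.Fatalf("writing %s: %v", file, err)
	}
}

func main() {
	base := "http://localhost:2019/debug/pprof/"
	save(base+"goroutine?debug=1", "goroutine.txt") // goroutine dump
	save(base+"profile?seconds=30", "cpu.pprof")    // CPU profile (takes ~30s)
	save(base+"heap", "heap.pprof")                 // memory/heap profile
}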

Could you also enable access/request and debug level logs if you haven't already?

And if you have time, could you do a git bisect to determine which commit causes the bug? That would be extremely helpful and would certainly speed up a patch.

@francislavoie (Member)

My hunch is that it has to do with #5077, but I didn't dig in.

@mholt (Member) commented Oct 19, 2022

@francislavoie Possibly, although now that I look at it, I think it would only leak if the connection failed or the response status was an error status:

https://github.com/caddyserver/caddy/pull/5077/files#diff-30c30eb9f50b002df80f2b88d4ecf25bf2f2b80721f414fc940bea0d644c0216R123-R128

It seems unlikely for that to be occurring all the time. I'd think the logs would mention that. (But in any case, we should probably close the response body in those error returns.)
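
A minimal sketch of the pattern being described here, with illustrative names rather than the actual Caddy loader code: defer the body close so it runs on every return path, including the error-status return.

// Illustrative sketch only (not Caddy's actual code): the deferred Close
// runs on every return path, so the error-status return below no longer
// leaves the response body, and its connection, open.
package main

import (
	"fmt"
	"io"
	"net/http"
)

func fetchConfig(url string) ([]byte, error) {
	resp, err := http.Get(url)
	if err != nil {
		return nil, err // no response to close in this case
	}
	defer resp.Body.Close() // also runs when we return the status error below

	if resp.StatusCode >= 400 {
		return nil, fmt.Errorf("config endpoint returned status %d", resp.StatusCode)
	}
	return io.ReadAll(resp.Body)
}

func main() {
	if _, err := fetchConfig("http://caddy-config.caddy:4000/config"); err != nil {
		fmt.Println("load failed:", err)
	}
}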

@ChrisLahaye (Contributor, Author)

The retry loop did not stop iterating on success; instead, it kept creating requests regardless of whether there was an error. I believe the leak is caused by the response bodies of these requests (potentially excluding the last) never being read or closed.

I have opened a PR (#5159) that fixes this behavior as well as the memory leak.
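
As a rough illustration of the fixed behavior (illustrative code, not the PR itself): the loop returns as soon as an attempt succeeds, and the body of every attempt is read and closed so earlier responses are not left open.

// Rough illustration of the fixed retry behavior, not the actual PR: return
// on the first successful attempt, and read/close the response body of every
// attempt so no earlier response is left open.
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"time"
)

func loadConfigWithRetries(url string, attempts int) ([]byte, error) {
	var lastErr error
	for i := 0; i < attempts; i++ {
		resp, err := http.Get(url)
		if err != nil {
			lastErr = err
			time.Sleep(time.Second)
			continue
		}

		body, readErr := io.ReadAll(resp.Body)
		resp.Body.Close() // close on every attempt, not just the last one

		if readErr != nil {
			lastErr = readErr
		} else if resp.StatusCode >= 400 {
			lastErr = fmt.Errorf("server returned status %d", resp.StatusCode)
		} else {
			return body, nil // success: stop iterating
		}
		time.Sleep(time.Second)
	}
	return nil, lastErr
}

func main() {
	cfg, err := loadConfigWithRetries("http://caddy-config.caddy:4000/config", 3)
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("loaded %d bytes of config", len(cfg))
}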

mholt pushed a commit that referenced this issue on Oct 24, 2022:

fixes #5158

Signed-off-by: Chris Lahaye <mail@chrislahaye.com>
@mholt (Member) commented Oct 24, 2022

Thank you for the contribution 😀

mholt added a commit that referenced this issue on Oct 24, 2022.
@mholt added the bug 🐞 Something isn't working label on Oct 24, 2022.
@mholt added this to the v2.6.3 milestone on Oct 24, 2022.