
Caddy with cache-handler crashes frequently on virtuozzo/jelastic #62

Closed
genox opened this issue Sep 6, 2023 · 9 comments · Fixed by darkweak/souin#366

Comments


genox commented Sep 6, 2023

Hi,

I have an issue with Caddy terminating after a while when running on Virtuozzo/Jelastic. It is a bit difficult to debug since logs are not easy to come by, but maybe there are known issues?

Basically, the behaviour is like this: Caddy runs just fine without the cache activated in the Caddyfile. Once the cache is activated, I can hammer it with requests and it stays stable; however, after a period of about 24 hours Caddy terminates. Since Virtuozzo doesn't offer a restart policy for its implementation of running custom Docker containers, it has to be restarted manually. Unfortunately I am bound to Jelastic for this setup.

The Caddyfile is quite simple:

# Global directives
{
	order cache before rewrite
	cache
}

# Domain directives
domain.xx {
	cache
	reverse_proxy <IP>:3000
}

The instance has 2 GB of memory assigned and doesn't report any OOM issues. Load is minimal, even when running load tests. Caddy is caching requests to a Next.js instance that serves certain responses with s-maxage/stale-while-revalidate headers, and this works fine, too.
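For context, the upstream responses look roughly like what this hypothetical Go handler produces (a stand-in only; the real app is Next.js and the directive values here are made up):

package main

import "net/http"

func page(w http.ResponseWriter, r *http.Request) {
	// s-maxage keeps the response fresh in a shared cache for 60s;
	// stale-while-revalidate allows serving it stale while refetching.
	w.Header().Set("Cache-Control", "public, s-maxage=60, stale-while-revalidate=300")
	w.Write([]byte("<!doctype html><title>cached page</title>"))
}

func main() {
	http.HandleFunc("/", page)
	http.ListenAndServe(":3000", nil) // same port the reverse_proxy above points at
}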

So my first question is: have there been reports of crashes for this environment or Next.js in particular?

I will try to get my hands on a log file to post here.


genox commented Sep 8, 2023

Here is a log entry of Caddy crashing with cache-handler enabled. The instance was hovering at about 50% memory usage at the time of the crash and CPU was below 5%.

panic: Header called after Handler finished

goroutine 2925405 [running]:
golang.org/x/net/http2.(*responseWriter).Header(0x7c3a3c?)
	golang.org/x/net@v0.14.0/http2/server.go:2816 +0x6d
github.com/darkweak/souin/pkg/middleware.(*CustomWriter).Header(0x24eee00?)
	github.com/darkweak/souin@v1.6.40/pkg/middleware/writer.go:50 +0xb8
github.com/darkweak/souin/pkg/middleware.(*SouinBaseHandler).Upstream(0xc000036140, 0xc0003bf440, 0xc0114bd400, 0xc00038f000, 0x44551c?, {0xc00f279cc0, 0x36})
	github.com/darkweak/souin@v1.6.40/pkg/middleware/middleware.go:281 +0x252
github.com/darkweak/souin/pkg/middleware.(*SouinBaseHandler).ServeHTTP.func3()
	github.com/darkweak/souin@v1.6.40/pkg/middleware/middleware.go:513 +0x38
created by github.com/darkweak/souin/pkg/middleware.(*SouinBaseHandler).ServeHTTP in goroutine 2925165
	github.com/darkweak/souin@v1.6.40/pkg/middleware/middleware.go:512 +0x1b52

@adammakowskidev

Hi
I have the same problem: very high RAM consumption.
#63

genox changed the title from "Caddy with cache-handler on virtuozzo/jelastic" to "Caddy with cache-handler crashes frequently on virtuozzo/jelastic" on Sep 8, 2023

genox commented Sep 11, 2023

After monitoring the instance for a couple of days and allocating 4x the previous memory to have more headroom, I can report that memory usage grows continuously without ever dropping. I would expect memory consumption to settle at a stable level once most of the cacheable items are in cache, not to keep increasing.

I assume the crash is OOM-related even though Jelastic doesn't show any evidence; its stats and charts are notoriously unhelpful. With more resources assigned, the trend is that all of them eventually get used, which leads me to believe there is a memory leak.
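A heap profile would probably show where the growth sits. A minimal sketch for grabbing one from Caddy's admin pprof endpoint (assuming the admin API is left at the default localhost:2019 and is reachable from where this runs):

package main

import (
	"io"
	"net/http"
	"os"
)

func main() {
	// Fetch a heap snapshot from Caddy's built-in pprof handler and save it;
	// two snapshots taken hours apart can be compared with go tool pprof.
	resp, err := http.Get("http://localhost:2019/debug/pprof/heap")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	out, err := os.Create("heap.pprof")
	if err != nil {
		panic(err)
	}
	defer out.Close()

	if _, err := io.Copy(out, resp.Body); err != nil {
		panic(err)
	}
}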

@darkweak (Collaborator)

Hey @genox, I'm currently debugging that.


genox commented Sep 25, 2023

@darkweak Thank you, happy to test a new release and provide further feedback.

@darkweak (Collaborator)

@genox can you retry with xcaddy build --with github.com/darkweak/souin/plugins/caddy@541dee72c726a85a5dba3b5a6a02127c4351beff --with github.com/darkweak/souin@541dee72c726a85a5dba3b5a6a02127c4351beff please?


genox commented Oct 16, 2023

@darkweak Thanks - I built and deployed this version; I will keep an eye on memory consumption for the next few days.


genox commented Oct 18, 2023

@darkweak Caddy crashed after about 2 days of uptime:

panic: Header called after Handler finished

runtime/debug.Stack()
	runtime/debug/stack.go:24 +0x5e
golang.org/x/sync/singleflight.newPanicError({0x1cae3a0?, 0x25e2fb0})
	golang.org/x/sync@v0.4.0/singleflight/singleflight.go:44 +0x26
golang.org/x/sync/singleflight.(*Group).doCall.func2.1()
	golang.org/x/sync@v0.4.0/singleflight/singleflight.go:193 +0x34
panic({0x1cae3a0?, 0x25e2fb0?})
	runtime/panic.go:914 +0x21f
golang.org/x/net/http2.(*responseWriter).Header(0xc01711a000?)
	golang.org/x/net@v0.17.0/http2/server.go:2870 +0x6d
github.com/darkweak/souin/pkg/middleware.(*CustomWriter).Header(0x25f9f20?)
	github.com/darkweak/souin@v1.6.43-0.20231012024144-541dee72c726/pkg/middleware/writer.go:50 +0xb8
github.com/darkweak/souin/pkg/middleware.(*SouinBaseHandler).Revalidate.func1()
	github.com/darkweak/souin@v1.6.43-0.20231012024144-541dee72c726/pkg/middleware/middleware.go:356 +0xa5
golang.org/x/sync/singleflight.(*Group).doCall.func2(0xc01627bc76, 0xc015898d20, 0x7f48e62fca68?)
	golang.org/x/sync@v0.4.0/singleflight/singleflight.go:198 +0x64
golang.org/x/sync/singleflight.(*Group).doCall(0x1d712a0?, 0xc0000bd1a0?, {0xc0004a04b0?, 0x27?}, 0x40e65a?)
	golang.org/x/sync@v0.4.0/singleflight/singleflight.go:200 +0x96
golang.org/x/sync/singleflight.(*Group).Do(0xc0002e2138, {0xc0004a04b0, 0x27}, 0x22?)
	golang.org/x/sync@v0.4.0/singleflight/singleflight.go:113 +0x15a
github.com/darkweak/souin/pkg/middleware.(*SouinBaseHandler).Revalidate(0xc0002e2000, 0xc018a72a20, 0xc000330620, 0xc0002d8b40, 0xc0166dca00, 0xc00021b9b0, {0xc0004a04b0, 0x27})
	github.com/darkweak/souin@v1.6.43-0.20231012024144-541dee72c726/pkg/middleware/middleware.go:353 +0x257
github.com/darkweak/souin/pkg/middleware.(*SouinBaseHandler).ServeHTTP.func2(0xc0175f7fd0?, 0x4455bc?, 0xc0175f7fd0?, 0x16946b3?, 0xc018d07a20?, {0xc0004a04b0?, 0xc018d079a0?})
	github.com/darkweak/souin@v1.6.43-0.20231012024144-541dee72c726/pkg/middleware/middleware.go:550 +0x34
created by github.com/darkweak/souin/pkg/middleware.(*SouinBaseHandler).ServeHTTP in goroutine 1516552
	github.com/darkweak/souin@v1.6.43-0.20231012024144-541dee72c726/pkg/middleware/middleware.go:549 +0x2090


goroutine 1515936 [running]:
golang.org/x/sync/singleflight.(*Group).doCall.func1()
	golang.org/x/sync@v0.4.0/singleflight/singleflight.go:170 +0x2b3
golang.org/x/sync/singleflight.(*Group).doCall(0x1d712a0?, 0xc0000bd1a0?, {0xc0004a04b0?, 0x27?}, 0x40e65a?)
	golang.org/x/sync@v0.4.0/singleflight/singleflight.go:205 +0xb1
golang.org/x/sync/singleflight.(*Group).Do(0xc0002e2138, {0xc0004a04b0, 0x27}, 0x22?)
	golang.org/x/sync@v0.4.0/singleflight/singleflight.go:113 +0x15a
github.com/darkweak/souin/pkg/middleware.(*SouinBaseHandler).Revalidate(0xc0002e2000, 0xc018a72a20, 0xc000330620, 0xc0002d8b40, 0xc0166dca00, 0xc00021b9b0, {0xc0004a04b0, 0x27})
	github.com/darkweak/souin@v1.6.43-0.20231012024144-541dee72c726/pkg/middleware/middleware.go:353 +0x257
github.com/darkweak/souin/pkg/middleware.(*SouinBaseHandler).ServeHTTP.func2(0xc0175f7fd0?, 0x4455bc?, 0xc0175f7fd0?, 0x16946b3?, 0xc018d07a20?, {0xc0004a04b0?, 0xc018d079a0?})
	github.com/darkweak/souin@v1.6.43-0.20231012024144-541dee72c726/pkg/middleware/middleware.go:550 +0x34
created by github.com/darkweak/souin/pkg/middleware.(*SouinBaseHandler).ServeHTTP in goroutine 1516552
	github.com/darkweak/souin@v1.6.43-0.20231012024144-541dee72c726/pkg/middleware/middleware.go:549 +0x2090

Memory had not started to bloat yet, though.

@darkweak (Collaborator)

It tries to update headers after the request is done; I'm working on a fix for that.
Good to see the memory wasn't growing too much.
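Roughly the shape of the bug, as a sketch only (not souin's actual code): a background revalidation goroutine keeps a reference to the live ResponseWriter and touches its headers after ServeHTTP has returned, which the HTTP/2 server rejects with exactly the panic shown above. Copying what the goroutine needs before handing off avoids it:

package main

import "net/http"

// broken: the goroutine may run after ServeHTTP has returned; under HTTP/2
// the Header() call then panics with "Header called after Handler finished".
func broken(w http.ResponseWriter, r *http.Request) {
	go func() {
		w.Header().Set("X-Revalidated", "1") // illustrative header, touched too late
	}()
	w.WriteHeader(http.StatusOK)
}

// safer: snapshot the headers before spawning the goroutine, so the live
// writer is never used once the handler has finished.
func safer(w http.ResponseWriter, r *http.Request) {
	headers := w.Header().Clone() // copy owned by the goroutine
	go func() {
		_ = headers // update the cache entry from the copy, not from w
	}()
	w.WriteHeader(http.StatusOK)
}

func main() {
	http.HandleFunc("/broken", broken)
	http.HandleFunc("/safer", safer)
	http.ListenAndServe(":8080", nil)
}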
