
Serve stale content from cache if origin responds with 5xx #2

Open
rrva opened this issue Dec 28, 2019 · 23 comments
Labels
enhancement New feature or request

Comments

@rrva

rrva commented Dec 28, 2019

As requested in this forum thread, I want to be able to use Caddy as a caching reverse proxy that makes a service more resilient to outages of the origin server, so that if the origin goes down or responds with errors, Caddy serves old content.

Specifically:

  1. I only want HTTP 200 responses to be cached.
  2. If the origin server responds with HTTP 5xx, I want stale content served from the cache.

Also attaching a corresponding nginx.conf which achieves what I want for reference.

nginx.conf.gz
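For comparison, a rough Caddyfile sketch of the same idea using the cache-handler (Souin) module might look like the following. This is only an illustration: the `ttl` and `stale` directives and their values are assumptions about the module's configuration, not taken from the attached nginx.conf.

```caddyfile
route {
  # Keep entries fresh for 10s, but retain them (as stale)
  # for up to 1h so they can be served if the origin errors.
  cache {
    ttl 10s
    stale 1h
  }
  reverse_proxy 127.0.0.1:81
}
```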

@rrva
Author

rrva commented Dec 28, 2019

I define stale content as content whose Cache-Control max-age has already expired, but which is kept in the cache anyway and served as a fallback if the origin is unresponsive or responds with errors.
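As an illustration of that definition (using the stale-if-error extension from RFC 5861, which the comment above does not mention explicitly), a response with this header would be fresh for 60 seconds and could then be served stale for up to 10 more minutes while the origin is returning errors:

```
Cache-Control: max-age=60, stale-if-error=600
```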

@francislavoie
Member

@mholt I think this issue should be moved to https://github.com/caddyserver/cache-handler

@mholt mholt transferred this issue from caddyserver/caddy Apr 24, 2020
@mholt
Member

mholt commented Apr 24, 2020

@francislavoie Agreed -- and this feature should be pretty easy to implement. Caddy has a ResponseRecorder which can help the cache decide whether to respond or not.

@mholt mholt added the enhancement New feature or request label Apr 24, 2020
@gc-ss

gc-ss commented Mar 28, 2022

@rrva are you still interested in this and/or willing to test an implementation?

@rrva
Author

rrva commented Mar 29, 2022

@gc-ss Sorry, no. Interest has faded for the moment (not using Caddy in this capacity right now).

@frederichoule
Contributor

Maybe we can take into account the stale-if-error directive from Cache-Control - https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cache-Control

@frederichoule
Contributor

Should this issue be moved to https://github.com/darkweak/souin ?

@darkweak
Collaborator

darkweak commented Apr 1, 2023

In the Souin codebase we support the stale-if-error directive

@fliespl

fliespl commented Jan 13, 2024

@darkweak do you think it might be possible to make it configurable rather than dependent on the upstream response's Cache-Control?

I would like to enable a fallback to serving the STALE page in case my upstream goes down.

@darkweak
Collaborator

@fliespl you may use the mode bypass_response to bypass the response RFC checks. Or you could use the header_down reverse_proxy subdirective in the Caddyfile to remove the Cache-Control response header.

@fliespl

fliespl commented Jan 16, 2024

@darkweak thanks - do you have any example of that? I have tried a few things without success...

@darkweak
Collaborator

I'm not sure but it could be something like this (cc @francislavoie about header_down reverse_proxy):

route {
  cache
  reverse_proxy 127.0.0.1:81 {
    header_down -Cache-Control
  }
}

Or using the cache mode directive:

route {
  cache {
    mode bypass_response
  }
  reverse_proxy 127.0.0.1:81
}

@fliespl

fliespl commented Jan 21, 2024

@darkweak actually, I was not able to use this. Steps tried:

A simple test.php file that sends:

Cache-Control: max-age=10, stale-if-error=120

It gets cached on the first request; then I disable the PHP upstream. The cache works for 10 seconds, and after that I get a 502 Bad Gateway error instead of stale content:

Souin; fwd=uri-miss; key=GET-https-example.com-/test.php; detail=SERVE-HTTP-ERROR

The STALE_GET-https-example.com-/test.php entry still has a TTL of 3471 seconds.

Am I missing something?

@darkweak
Collaborator

@fliespl what's your request? (curl)

@fliespl

fliespl commented Jan 21, 2024

curl https://domain
or
curl -I https://domain

No matter if it's GET or HEAD, the output is the same.

@darkweak
Collaborator

If your request doesn't send a stale-if-error directive, the HTTP cache won't serve the stale response, because the client doesn't allow it, as explained in RFC 7234. But reading the newer RFC 9111, they seem to consider serving stale to be allowed by default. Maybe I misunderstood something in these RFCs.

@fliespl

fliespl commented Jan 21, 2024

@darkweak isn't stale-if-error only a response header? I haven't seen any browser use it in a request.

I would assume it's the origin's response telling cache servers that, in case the origin fails, the cache server is allowed to serve this content to the client.

@darkweak
Collaborator

Sorry, I meant the max-stale request directive.
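For example, a client can opt in to receiving stale responses by sending the max-stale request directive, something like this (the domain is the same placeholder used in the curl commands above):

```
curl -H "Cache-Control: max-stale=600" https://domain/test.php
```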

@fliespl

fliespl commented Jan 21, 2024

@darkweak okay, so my initial request still stands: allow serving stale content no matter what the request sends (since max-stale is not used). Maybe it should be allowed when the mode is bypass_request, the same way it is for no-cache?
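If implemented, the configuration might look something like this. This is a sketch under the assumption that the module accepts bypass_request as a mode value, by analogy with the bypass_response example earlier in the thread:

```caddyfile
route {
  # Skip the request-side RFC checks, so stale content can be
  # served even when the client sends no max-stale directive.
  cache {
    mode bypass_request
  }
  reverse_proxy 127.0.0.1:81
}
```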

@darkweak
Collaborator

Yes, I will create a PR for that! 👍

@fliespl

fliespl commented Jan 21, 2024

@darkweak also found one other issue, which is probably connected.

With the default setup (without Souin), if the reverse proxy (PHP FastCGI) returns status 500 with an HTML body, the body gets displayed (i.e. the WordPress error page: https://ss.codeone.pl/ss-2024-01-21-22-20-19-1705872019-F83puPwH.png)

With Souin enabled I get an empty response and the default Chrome error page: https://ss.codeone.pl/ss-2024-01-21-22-21-00-1705872060-HGzbuNTQ.png It feels like a bug somewhere in Souin altering the proxy response/headers.

@darkweak
Collaborator

@fliespl there is already a PR open for that: darkweak/souin#441 (thanks to @vejipe) 😄

@fliespl

fliespl commented Jan 21, 2024

@darkweak @vejipe Didn't realize it's connected - thanks so much! :) Great job, by the way!
