Have the default fetcher implementation respect caching headers #35
That is certainly a valid scenario for me, and I'd be very grateful if you opened a PR for it :) I just wanted to have a more in-depth description/discussion first. Which response headers would you consider here? I think this topic is, unfortunately, "interpreted" very differently by different servers. There are (according to MDN) a lot of possible response headers, and it seems we don't have a standard cache middleware for Tesla, meaning we would assume ownership of implementing HTTP caching here. Maybe a better way would be to implement a caching Tesla middleware and then just use it here. It could be its own library, and that way we wouldn't couple its development to Joken JWKS. Wdyt?
I don't see a Tesla middleware as being particularly useful here, because what I'm proposing isn't a browser-style cache, but rather pre-emptive re-requesting. The only way a Tesla middleware would help is if joken_jwks were modified to hit the JWKS URL (synchronously) on every request, parse out the JWKS, and use it to validate the token. The reason is that a Tesla caching middleware would know that a resource is stale, but it (presumably) wouldn't be written to pre-emptively fetch an expiring resource and somehow feed it back to joken_jwks. As for which headers to examine, we could start with what I think is the most common one --
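The comment above is cut off before naming the header, but assuming it refers to `Cache-Control` with a `max-age` directive (the most common freshness control), a minimal sketch of deriving a refresh interval from response headers could look like this. Python is used purely for illustration (the project itself is Elixir), and the `fallback` default is invented, not an existing joken_jwks setting:

```python
import re

def refresh_interval(headers, fallback=300):
    """Derive a pre-emptive refresh interval (seconds) from response headers.

    Returns `fallback` when no usable caching directive is present.
    `fallback` is a made-up default, not a joken_jwks option.
    """
    cache_control = headers.get("cache-control", "")
    # no-store / no-cache mean we cannot schedule refreshes from this header
    if "no-store" in cache_control or "no-cache" in cache_control:
        return fallback
    match = re.search(r"max-age=(\d+)", cache_control)
    if match:
        return int(match.group(1))
    return fallback

print(refresh_interval({"cache-control": "public, max-age=3600"}))  # 3600
print(refresh_interval({"cache-control": "no-store"}))              # 300
```

A real implementation would also need to consider `s-maxage`, `Expires`, and `Age`, which is exactly the "muddy waters" concern raised below.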
I thought about a middleware with callbacks for when a cache entry expires. That would make it easy for us to re-trigger fetching the JWKS, making it more "generic". We can try to interpret these directives, but I think there will be plenty of scenarios of "interpreting" them differently for caching. I am not against the idea, just saying it is likely muddy waters for Joken itself. Either way, I would not reject the functionality! I think that behind a configuration option (and not as the default) it would be cool to have. If someone implements a cache middleware for Tesla with a callback, we could then use it here and keep the same configuration options. If you want to take a shot at it, I'd be really glad to review. Documentation will be key here too!
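The middleware-with-callback idea above could be sketched roughly as follows. This is not a real Tesla middleware, and all names here are invented for illustration: the point is only that a cache entry notifies its owner (e.g. a JWKS fetcher) once when it goes stale, so the owner can re-request the resource:

```python
import time

class ExpiringEntry:
    """Sketch of a cache entry that fires a callback on expiry.

    When the entry is read after its max-age has elapsed, `on_expired`
    fires exactly once so the owner can trigger a refresh.
    """

    def __init__(self, value, max_age, on_expired, now=time.monotonic):
        self.now = now
        self.value = value
        self.expires_at = self.now() + max_age
        self.on_expired = on_expired
        self.notified = False

    def get(self):
        if self.now() >= self.expires_at and not self.notified:
            self.notified = True
            self.on_expired(self)   # let the owner re-request the resource
        return self.value
```

Keeping the callback in the cache layer is what would make this reusable beyond joken_jwks, which is the "generic" property argued for above.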
As an example of the burdens of HTTP caching, here is a library that is not even complete but already mentions a couple of issues: https://github.com/tanguilp/http_cache. I really think this is up to the HTTP layer and not joken_jwks, for now. This may change, but I will close this for now.
Having the only trigger for a refresh of the JWKS be the presentation of a token with an unknown kid is a neat approach, but it carries the risk that valid tokens will be rejected by the resource server for some period of time.
Instead, it would be handy if the HTTP response headers that control caching behaviour were examined and used to determine a schedule for periodic pre-emptive refreshes of the JWKS. This would allow IdPs to rotate keys periodically without the risk that a JokenJwks-using resource server rejects valid tokens, as long as they introduce the new key into the JWKS at least one expiry period before first using it.
Of course, the existing "refresh on unknown kid" behaviour could stay in place as a fallback in case an IdP makes a mistake with its caching/rotation scheduling, and the existing "only retry every N milliseconds" guard prevents an absurdly short expiry period from causing a DoS.
If this is a feature you'd be willing to accept in the default fetcher, I'd be happy to work up a PR. I'm pretty confident I can make the whole thing backwards-compatible, and could even make the cache-respecting behaviour opt-in via an option if that would make the feature more palatable.