
Feature: optional refresh strategy (stale-while-revalidate) #533

Closed
jonathanarezki opened this issue Jul 12, 2023 · 11 comments

Comments

@jonathanarezki
Contributor

Before version 5 there was the option to set a "refreshThreshold" for any store.
The ttl of the cached key was checked against the refreshThreshold.
If the remaining ttl was less than the refreshThreshold, the value was returned immediately, but the provided function was still triggered to refresh the cache.
The benefit would be a more conservative caching strategy.
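The described behaviour can be sketched roughly like this (a hypothetical helper class for illustration, not the actual cache-manager API; all names here are made up):

```typescript
// Minimal stale-while-revalidate sketch: each entry stores an expiry
// timestamp, and a read whose remaining ttl falls below refreshThreshold
// returns the stale value immediately while the loader runs in the background.
type Entry<T> = { value: T; expiresAt: number };

class SwrCache<T> {
  private store = new Map<string, Entry<T>>();
  private refreshing = new Set<string>();

  constructor(private ttlMs: number, private refreshThresholdMs: number) {}

  set(key: string, value: T): void {
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }

  async wrap(key: string, loader: () => Promise<T>): Promise<T> {
    const entry = this.store.get(key);
    const now = Date.now();
    if (entry && entry.expiresAt > now) {
      const remaining = entry.expiresAt - now;
      if (remaining < this.refreshThresholdMs && !this.refreshing.has(key)) {
        // Stale but usable: refresh in the background, return immediately.
        this.refreshing.add(key);
        loader()
          .then((v) => this.set(key, v))
          .finally(() => this.refreshing.delete(key));
      }
      return entry.value;
    }
    // Miss or fully expired: the caller has to wait for the loader.
    const value = await loader();
    this.set(key, value);
    return value;
  }
}
```

The key point is that only a cache miss blocks the caller; a stale-but-present entry never does.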

@jonathanarezki jonathanarezki changed the title Feature: optional refresh strategy (stale while revalidate) Feature: optional refresh strategy (stale-while-revalidate) Jul 12, 2023
@zzau13
Contributor

zzau13 commented Jul 14, 2023

I think ttl1 === ttl0 - refreshThreshold.

Or maybe I'm wrong, but the refresh threshold looks like just another time to live. Every time you use the set method or its variants you can pass the ttl as an argument.

Can you explain to me the difference between using refreshThreshold and using a smaller ttl?

@jonathanarezki
Contributor Author

The refresh threshold allows for new functionality.
Let's assume you define the ttl as 24h and the refresh threshold as 23h.
Any value read from that store is then checked: if the remaining ttl is less than the refresh threshold, we still return the value directly, but we run the function asynchronously and set the cache again.
That way it's possible to refresh a cache entry before it completely runs out.

Essentially you can reach 100% cache coverage without complicated cron jobs.
The functionality got removed with version 5.0.0
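With the numbers from that example, the arithmetic works out as follows (a trivial sketch; the variable names are made up for illustration):

```typescript
// ttl = 24 h, refreshThreshold = 23 h: a background refresh kicks in once
// the remaining ttl drops below the threshold, i.e. once the entry is
// ttl - refreshThreshold = 1 h old.
const ttlMs = 24 * 60 * 60 * 1000;
const refreshThresholdMs = 23 * 60 * 60 * 1000;
const refreshAfterMs = ttlMs - refreshThresholdMs;
console.log(refreshAfterMs / (60 * 60 * 1000)); // 1 hour of age triggers the refresh
```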

@zzau13
Contributor

zzau13 commented Jul 14, 2023

With this we only add unnecessary calls to our database, since we don't know whether the value is going to be used or invalidated, but we have to make the call at that moment no matter what.
The functionality of a cache is to reduce the number of calls, not to increase it.

Can you describe a situation where this is used? Because it requires more logic in the set method.

If I did another method with this functionality, would it work for you?

@jonathanarezki
Contributor Author

I wouldn't put the logic into the set function.
It's better suited for the wrap function.

That logic is useful if the data isn't critical but is requested really often.
With this, all calls return immediately, even if the cache has gone stale, and only then is the database called for a refresh.

Here is the PR of the old logic, that got removed with 5.0.0: #138

@zzau13
Contributor

zzau13 commented Jul 17, 2023

I wouldn't put the logic into the set function. It's better suited for the wrap function.

wrap uses set

Can you give one use case?

@jonathanarezki
Contributor Author

It's still not in the set function.
The wrap function is already a special case.
The refresh logic fits perfectly into what the wrap function tries to solve.

A use case: we cache the result of a function with wrap that takes upwards of 10 seconds, but the wrap function gets called a hundred times per second.
If the cache runs out, we would slow down a lot.
But if we had a stale cache, we could still return the cached value immediately and let the refresh be handled by a service or cloud function.

I'm not talking about a must-have feature, but a pretty useful option.
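That use case can be sketched as follows (hypothetical names and a single-key in-memory cache for brevity; this is not the cache-manager implementation). The point is that concurrent callers share one in-flight loader call and are never blocked by a stale entry:

```typescript
// Wrap a slow loader with stale-while-revalidate semantics: a stale entry
// is returned immediately while exactly one background refresh runs, and a
// cold cache lets all concurrent callers share one in-flight loader call.
type Loader<T> = () => Promise<T>;

function makeSwrWrap<T>(ttlMs: number, refreshThresholdMs: number) {
  let cached: { value: T; expiresAt: number } | undefined;
  let inFlight: Promise<T> | undefined;

  const load = (loader: Loader<T>): Promise<T> => {
    const p = loader().then((v) => {
      cached = { value: v, expiresAt: Date.now() + ttlMs };
      inFlight = undefined;
      return v;
    });
    inFlight = p;
    return p;
  };

  return async function wrap(loader: Loader<T>): Promise<T> {
    const now = Date.now();
    if (cached && cached.expiresAt > now) {
      if (cached.expiresAt - now < refreshThresholdMs && !inFlight) {
        // Stale but usable: kick off one background refresh, return now.
        load(loader);
      }
      return cached.value;
    }
    // Cold or fully expired: everyone waits on the same loader call.
    return inFlight ?? load(loader);
  };
}
```

Even if the loader takes 10 seconds, only the very first cold call pays that cost; once an entry exists, every subsequent caller gets an immediate answer.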

@zzau13
Contributor

zzau13 commented Jul 19, 2023

Do the PR.

But if a function takes 10 seconds, it shouldn't sit in a REST API; it should sit behind messaging in another microservice.

@jonathanarezki
Contributor Author

Who says the request is not handled by a message queue in another microservice?
It's still possible that someone requests that data; that's why we use caches.

I will look into the code after my vacation and try to create a PR.

@zzau13
Contributor

zzau13 commented Jul 20, 2023

If the function takes 10 seconds: if it's a query, it should be reconsidered; if it's a computing function or something like that, you have to put a messaging system behind queues, e.g. Redis, RabbitMQ, or your own broker built with ntex and MQTT. On the front end, SSE should be enough to receive the answer. Or I'd work with MQTT over WebSocket if the calls are constant.

I have a lot of experience with low-level realtime and high scalability. My main tools are Postgres and Redis, Node for rapid prototyping of microservices, Python for AI, and Rust for whatever they let me, because nobody knows Rust anywhere; it's a trauma.
I'm always looking for work; my English is a bit ugly, but I don't block connections.

Happy holidays! If you can, don't come to Barcelona, because there are already a lot of people and we don't fit. Thank you!

@zzau13
Contributor

zzau13 commented Jul 20, 2023

I think this has to do with the "responsive" trait: https://www.reactivemanifesto.org/

jonathanarezki pushed a commit to jonathanarezki/node-cache-manager that referenced this issue Sep 5, 2023
@jonathanarezki
Contributor Author

Here is the pull request: #586

jonathanarezki pushed a commit to jonathanarezki/node-cache-manager that referenced this issue Sep 8, 2023
update: tests to be less time sensitive in slow environments
jaredwray pushed a commit that referenced this issue Sep 8, 2023
* feat: optional refresh strategy (stale-while-revalidate) #533

* feat: optional refresh strategy (stale-while-revalidate) #533
test update

* feat: optional refresh strategy (stale-while-revalidate) #533
update: README.md

* feat: optional refresh strategy (stale-while-revalidate) #533
update: README.md

* feat: optional refresh strategy (stale-while-revalidate) #533
update: tests to be less time sensitive in slow environments