update cache on background thread when race_condition_ttl option set #13193

Closed

Conversation


ismaelga commented Dec 5, 2013

Currently the Cache#fetch method has a race_condition_ttl option that fakes a hit by serving stale data while the cache is updated by another request. But there is still one poor request that has to wait for the cache to be updated while all the others simply read the stale data. So I think we could also give that request the stale data and update the cache in the background instead.
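For reference, a minimal sketch of how race_condition_ttl is used today (the key name and ExpensiveReport computation are made up for illustration):

```ruby
value = Rails.cache.fetch("expensive/report",
                          expires_in: 5.minutes,
                          race_condition_ttl: 10.seconds) do
  # This block runs in the foreground of the first request that hits the
  # expired entry; concurrent requests keep reading the stale value until it
  # finishes. The proposal is to serve that first request stale data too and
  # run this block on a background thread.
  ExpensiveReport.generate # hypothetical expensive computation
end
```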

I think it makes sense to do this, so I just submitted this PR :)
It's probably not the best solution, but it's a WIP that someone else could pick up and improve.

btw First PR to Rails 🎉

Member

schneems commented Dec 9, 2013

I'm 👍 on this idea, but not sure of the implementation. Maybe we could put this behavior behind a key that makes more sense, like background: true, which would require you to also set race_condition_ttl. People who are already using the existing option key are expecting the current behavior.
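Purely as a hedged sketch of that suggestion (the background: option does not exist in Rails; its name and semantics here are assumptions):

```ruby
Rails.cache.fetch("expensive/report",
                  expires_in: 5.minutes,
                  race_condition_ttl: 10.seconds,
                  background: true) do # opt in explicitly; existing callers keep today's behavior
  ExpensiveReport.generate
end
```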

Right now we don't have many (or any?) internal structures or practices for doing threaded work inside of a request/response cycle. For example, what if this gets kicked off in a Unicorn worker and runs for 10 seconds? The worker returns a response in a few milliseconds, and then, while the cache is still being updated, the entire worker gets killed by something like unicorn-worker-killer. These scenarios are non-trivial to work around and cannot be treated as edge cases that can be ignored.

If this existed in Rails today I would use it all the freaking time: set a super low TTL and stop worrying that anyone has to eat the slowdown of populating the cache. In the past, for really expensive per-user cached items, I've done things like checking on every page request whether the cache is fresh and, if not, re-populating it in a background worker (like Resque). If this feature existed I could have used it instead.
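A rough sketch of that pattern, assuming Resque (class, key, and the Dashboard build are hypothetical): serve whatever is cached and, when the entry is missing, enqueue a job to rebuild it rather than blocking the request.

```ruby
class RefreshUserDashboard
  @queue = :cache

  def self.perform(user_id)
    data = Dashboard.build_for(User.find(user_id)) # hypothetical expensive per-user work
    Rails.cache.write("dashboard/#{user_id}", data, expires_in: 10.minutes)
  end
end

def cached_dashboard(user)
  key = "dashboard/#{user.id}"
  cached = Rails.cache.read(key)
  # Rebuild in the background on a miss instead of making this request wait.
  Resque.enqueue(RefreshUserDashboard, user.id) if cached.nil?
  cached
end
```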

I'm going to go try to get some others to look at this PR.

Owner

rafaelfranca commented Dec 9, 2013

I like the idea too, but I think we should implement this using the same internal structure we are planning to use for sending emails in the background. To make that possible we will have to wait until we have something in that area.

cc @jeremy
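The shared background infrastructure referred to here is what eventually shipped as Active Job. Purely as a hedged illustration, a cache refresh expressed on top of it might look like this (the job class and key are hypothetical):

```ruby
class RefreshReportCacheJob < ActiveJob::Base
  queue_as :default

  def perform(key)
    Rails.cache.write(key, ExpensiveReport.generate, expires_in: 5.minutes)
  end
end

# Instead of recomputing inline, fetch could hand the work off:
# RefreshReportCacheJob.perform_later("expensive/report")
```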

Member

schneems commented Dec 9, 2013

Is someone actively working on a multiple queue backend, assigned to @jeremy? I've not formally released this guy, but the implementation currently works: https://github.com/schneems/q

ismaelga commented

Hi! Are there any updates on the multiple queue backend, so I can move forward with my PR?

ismaelga closed this Sep 7, 2015
