Add ActiveSupport::Cache::CascadeStore #5263

Closed
wants to merge 1 commit

4 participants

@jch
jch commented Mar 4, 2012

A thread-safe cache store implementation that cascades operations to a
list of other cache stores. It is used to provide fallback cache
stores when primary stores become unavailable. For example, to
initialize a CascadeStore that cascades through MemCacheStore,
MemoryStore, and FileStore:

ActiveSupport::Cache.lookup_store(:cascade_store,
  :stores => [
    :mem_cache_store,
    :memory_store,
    :file_store
  ]
)

Cache operation behavior (see the sketch after this list):

Read: returns first cache hit from :stores, nil if none found

Write/Delete: write/delete through to each cache store in :stores

Increment/Decrement: increment/decrement each store, returning the new number if any store was successfully incremented/decremented, nil otherwise

In my app, I'm using CascadeStore with MemCacheStore as the primary store and MemoryStore as a backup in case memcache is unavailable. What do you guys think?

@josevalim
Ruby on Rails member

/cc @jeremy

@drogus
Ruby on Rails member

Frankly, I'm not sure this wouldn't be better off as a gem; it doesn't seem like a common setup. On the other hand, it's quite small and simple, so it won't add too much code.

@jeremy
Ruby on Rails member

@jch memcache falling back to memory seems backward. For a write-through cache you'd want to read from your lowest-latency cache first. We'd need careful bounds on first-tier cache size, too, or else you're replicating memcache in every app instance.

Cool idea and nice patch, but I think it needs further development as a plugin too 👍

@jch
jch commented Mar 4, 2012

@jeremy Thanks! I agree with your point about using the lowest-latency cache first, but the reason for putting memcache before memory in our application is that we wanted the app instances to share the cache as much as possible. The memory store serves as a fallback in case memcache is temporarily unavailable and buys us time to bring memcache back online.

Options for the underlying cache stores can be specified, so the alternate setup for lower latency access with a bounded first-tier cache could look like:

ActiveSupport::Cache.lookup_store(:cascade_store,
  :stores => [
    [:memory_store, :size => 5.megabytes, :expires_in => 15.minutes],
    [:mem_cache_store, 'localhost:11211']
  ]
)

While this feature allows developers to chain arbitrary cache stores together, there is an extra cost to writing through to all caches, so it only makes sense to have a chain 2 or 3 deep. Initially, I thought of allowing users to customize when and how deep they want to write through to the stores, but decided to keep it simple instead.

@jch
jch commented Mar 4, 2012

@drogus the main reason I wanted this in activesupport rather than in its own gem is that the main documentation for ActiveSupport::Cache::Store references the available cache stores under the cache/ directory. I wanted CascadeStore to live under that directory so that it's easy to discover when developers are researching available cache stores. Since this store doesn't have a backing datastore of its own, it'd be harder for developers to find even if it fits their problem.

That said, I see how it can really go either way. For example, the redis-store gem provides a cache store.

@drogus
Ruby on Rails member

@jch sure, it would be cool to have something like that easily reachable, but as @jeremy said, please prepare a plugin so people can test it and fix any rough edges. I'm closing this; if you have any luck with creating a gem, please post it here and I can try it on one of my apps.

@drogus closed this Mar 24, 2012
@jch

@drogus thanks for the update. I prepared the gem a while back, but forgot to post it here. I've updated it with an example app and some configuration instructions. It's available at jch/activesupport-cascadestore or via rubygems. cc: @jeremy
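Setup in a Rails app looks roughly like this. This is a sketch: the gem name below is assumed to match the repository name, and the options mirror the lookup_store examples earlier in this thread, so check the README for the exact instructions:

# Gemfile -- gem name assumed to match the repository
gem 'activesupport-cascadestore'

# config/environments/production.rb -- memcache first, with an in-process
# memory store as the fallback, as described earlier in this thread
config.cache_store = :cascade_store, {
  :stores => [
    [:mem_cache_store, 'localhost:11211'],
    [:memory_store, :expires_in => 15.minutes]
  ]
}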

@garethrees referenced this pull request in jch/activesupport-cascadestore Apr 23, 2013
Open

Project Status? #1
