Is it possible to do a fragment caching in a rabl view? #143

NielsKSchjoedt opened this Issue Dec 5, 2011 · 22 comments



I'm trying to do something like this:

cache "api/v1/cars_index/#{I18n.locale}/car/#{car.cache_key}" do
  attributes :id, :brand, :model_name, :fuel, :km, :year, :price
  node(:color) { |car| car.color.present? ? car.color : '' }
end


JackCA commented Feb 9, 2012

I'm curious how to achieve this as well



nesquena commented Feb 10, 2012

This is a great idea: view fragment caching in RABL. I would love to figure out a clean way for caching to work in RABL and then document it in the README. I have tried a few approaches, but nothing has felt right (e.g. a custom helper that renders the JSON myself only if no key exists for that response).

The RABL template could support a custom version of the ERB cache helper, implemented with the read_fragment and write_fragment methods. However, since most scenarios cache the entire output generated by RABL under one key, an additional method could be created that is called once at the start of the RABL file:

cache @key
collection @users
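The helper idea above ("render the JSON only if no key exists for that response") can be sketched in plain Ruby. A Hash stands in for the read_fragment/write_fragment store; the class and method names here are illustrative, not RABL's actual API.

```ruby
# Minimal sketch: render the fragment for a key only when no cached
# entry exists. @store is a plain Hash standing in for the real cache.
class FragmentCache
  def initialize
    @store = {}
  end

  # Return the cached fragment for +key+, or run the block, store its
  # result under +key+, and return it.
  def fetch(key)
    @store.fetch(key) { @store[key] = yield }
  end
end

cache  = FragmentCache.new
first  = cache.fetch("users/index") { %({"users":[]}) }
second = cache.fetch("users/index") { raise "should not re-render" }
# second comes straight from the cache; the block is never evaluated
```

The second call proves the point: the expensive render block is skipped entirely on a hit.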


nesquena commented Feb 11, 2012

Interesting, I think that could work really well. We are also planning a lot of performance improvements on more complex rabl cases. Hopefully in a few versions RABL can be much faster and support dead simple caching as well.

cmer commented Feb 19, 2012


cicloid commented Feb 22, 2012

Is anyone working on this?

I'm thinking of checking out the code later today to help with this.


Would love this feature, especially if it worked with ActiveRecord's cache_key method. I had a quick look at how I could do this, but wasn't sure how to make it work outside of Rails too.


nesquena commented Mar 7, 2012

Yeah, part of what has made this harder to achieve is that I primarily use RABL in Sinatra and Padrino apps, so I would want either a solution that works across multiple frameworks, or one that works differently between the two in a non-confusing way.

cicloid commented Mar 8, 2012

There could be some kind of around filter for the nodes. There you could plug in whatever you want for caching (calling the data store directly, or using some wrapper like Rails.cache), or just have a pluggable wrapper for whatever cache is available.

To be honest, I haven't looked at the issue in detail, so my ideas may be off track.

Some sort of around filter could be a generic first-step approach; then we could pass in our own logic, such as:

Rails.cache.fetch "key" do
  # render the fragment here
end

I would love to see any kind of caching, even starting simple and supporting just caching of entire pages. Performance has become a big issue using RABL.
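The around-filter idea above can be sketched as a renderer that wraps node evaluation in a pluggable cache adapter, so the engine doesn't care whether the backend is Rails.cache, memcached, or a plain Hash. Every name below is hypothetical.

```ruby
# Hash-backed adapter exposing the same contract as Rails.cache.fetch:
# return the stored value for +key+, or compute, store, and return it.
class HashCache
  def initialize
    @data = {}
  end

  def fetch(key)
    @data.fetch(key) { @data[key] = yield }
  end
end

class NodeRenderer
  def initialize(cache)
    @cache = cache
  end

  # The "around" part: the cache check wraps the node's render block.
  def render_node(key, &block)
    @cache.fetch(key, &block)
  end
end

renderer = NodeRenderer.new(HashCache.new)
color  = renderer.render_node("car/1/color") { "red" }
cached = renderer.render_node("car/1/color") { "never evaluated" }
```

Swapping HashCache for any object that responds to `fetch(key) { ... }` is all it would take to change backends.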


databyte commented Mar 13, 2012

I've started working on this and I believe the best way is to cache the results for each "object" such that if you simply had object @user, the key would naturally be user.cache_key and a collection would have multiple cache entries - one for each object within the collection.

I was thinking of simply doing:

Rabl.configure do |config|
  config.cache_all_output = true
  # not to be confused with cache_sources; or maybe .perform_caching = true (?)
end

Or, per template:

cache true
collection @users

@sidbatra - your idea of cache @key looks interesting, since it lets you specify a different key, but unfortunately it means you would store the entire result of @users under a single key. That's not something I personally want to deal with, but would any of you use that method? You would also have to worry about @users being set up in different ways at the controller level.

A cache helper using a cache block similar to ERB's won't work, due to the way RABL handles results. ERB uses an output_buffer and can handle streams; I don't see a similar construct in RABL. RABL just loops over the objects, builds up the JSON, and does one big .to_s at the end.
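The contrast between the two rendering models can be shown with toy stand-ins (these are simplifications, not the real ERB or RABL code): ERB appends to a buffer as it renders, so a cache helper can capture exactly the slice its block wrote, while RABL accumulates a structure and serializes once at the end.

```ruby
require "json"

# ERB-style: capture the fragment the block appended to the buffer.
def cache_block(buffer, store, key)
  if store.key?(key)
    buffer << store[key]          # hit: splice the cached fragment in
  else
    start = buffer.length
    yield                         # the block writes into the buffer
    store[key] = buffer[start..-1] # capture only what was just written
  end
end

buffer = String.new
store  = {}
cache_block(buffer, store, "frag") { buffer << %(<li>car</li>) }

# RABL-style: build the whole hash, then one big serialization at the end.
result = { id: 1, brand: "Saab" }
json   = result.to_json
```

Without a buffer to slice, there is no natural point where a block-scoped cache helper could capture "just this fragment" of RABL's output.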


nesquena commented Mar 13, 2012

I like the idea of it being configurable per template or globally, with the default cache key based on the object or collection passed in, and perhaps the ability to customize the key. Very cool; I think having this will be a big win for RABL. Thanks for taking the time to think about it.

@databyte excellent analysis. I use the Rails ERB Cache helper in conjunction with named_scopes.

By using the lazy loading enabled by named_scopes and a @key passed in from the controller, I save both the rendering time and the query time. Using @users.cache_key would cause the query to fire every time.
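The pattern described here can be sketched without Rails: the controller hands the view a cheap, precomputed key, so a cache hit skips both rendering and the query. A lambda simulates a lazily evaluated named_scope; all names are hypothetical.

```ruby
require "json"

query_fired = 0
# Stand-in for a lazy relation: nothing runs until .call is invoked.
users_scope = -> { query_fired += 1; [{ id: 1 }, { id: 2 }] }

cache = {}
key   = "users/index/v7"   # e.g. bumped by the controller on writes

render = -> { cache.fetch(key) { cache[key] = users_scope.call.to_json } }

first  = render.call   # miss: the "query" fires once, result is cached
second = render.call   # hit: the scope is never evaluated again
```

Contrast this with `@users.cache_key`, which must load or count the records just to compute the key, firing the query on every request.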

Your analysis is correct that cache invalidation becomes slightly more complicated.

It's ultimately your call based on what's most useful for most users of rabl. Thanks for taking my suggestion into account.

cmer commented Mar 14, 2012

I think the way Rails does partial caching is great. I always construct a key from an array of multiple values. It seems like the most flexible way to do caching, and I would love to see RABL adopt the same model.
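The array-of-values approach can be sketched in the spirit of Rails' ActiveSupport::Cache.expand_cache_key: build the key from an array, asking each part for its cache_key when it has one. This is a standalone stand-in, not the Rails implementation.

```ruby
# Join key parts, delegating to #cache_key where available.
def expand_key(parts)
  Array(parts)
    .map { |p| p.respond_to?(:cache_key) ? p.cache_key : p.to_s }
    .join("/")
end

# Illustrative model with an ActiveRecord-style cache_key.
Car = Struct.new(:id, :updated_at) do
  def cache_key
    "cars/#{id}-#{updated_at}"
  end
end

key = expand_key(["api", "v1", :en, Car.new(7, "20120314")])
# => "api/v1/en/cars/7-20120314"
```

Because the record's timestamp is embedded in the key, updating the record naturally invalidates the old entry.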


databyte commented Mar 15, 2012

I have the first pass of it working on https://github.com/databyte/rabl

Just implemented it on my own project and it's working nicely at the collection level or a single object. I noted the limitations so far in the README. I'll now try to get each object within a collection to cache itself such that:

collection @users  # where @users = [User.find(1), User.find(2)]
cache @users

The above will cache all the users together, assuming the key is always the same. What I want is:

collection @users   # where @users = [User.find(1), User.find(42)]
cache @users

Such that even though the cache of the entire template is now invalid, the fragment for User 1 pulls from cache and only User 42 is generated. Then if you do this:

collection @users   # where @users = [User.find(1), User.find(2), User.find(42)]
cache @users

It'll be 3 quick cache hits for each user's template and then a cache write for the collection.
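The per-object scheme described above can be sketched as follows: each member is cached under its own key, so adding User 42 to an already cached [1, 2] collection re-renders only the new member. A plain Hash stands in for the cache backend, and the key formats are illustrative.

```ruby
CACHE = {}

def fragment(key)
  CACHE.fetch(key) { CACHE[key] = yield }
end

# Render each member from its own cache slot, then cache the assembled
# collection under a key derived from the member ids.
def render_collection(users)
  members = users.map { |u| fragment("user/#{u[:id]}") { %({"id":#{u[:id]}}) } }
  fragment("users/#{users.map { |u| u[:id] }.join(',')}") { "[#{members.join(',')}]" }
end

render_collection([{ id: 1 }, { id: 2 }])
# New collection key, but user/1 and user/2 are cache hits; only user/42
# and the new collection entry are written:
out = render_collection([{ id: 1 }, { id: 2 }, { id: 42 }])
```

After both calls the store holds five entries: three member fragments and two collection assemblies.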

In my case, a collection of 10 items locally gives me this the first time:

Rendered recipes/preload.json.rabl (370.1ms)

And this with cached results:

Rendered recipes/preload.json.rabl (45.7ms)

I think making it optional would be best, too, since some templates are so simple that it doesn't make sense to hit the cache 10 times to rebuild the collection's output. In my case, it's more of an API and each item within the collection is a bit expensive.

rubiii commented Mar 15, 2012

@databyte just tested your implementation and it works great. For our use case, though, we would need the base template to render cached partials instead of ignoring their cached versions.

We have quite a lot of single objects and a show view, which we would like to cache for every object. Our index view renders different show views depending on a set of filters.

So the index action is highly dynamic. What we would benefit from is caching the template instead. Do you think that's possible?

rubiii commented Mar 15, 2012

Forget my last comment ;) I think this already works when you only specify a cache key for the partial.


databyte commented Mar 15, 2012

ok, made a pull request: #190

It includes the ability to cache per template, or everything for the entire project. My results in a project are:

Ran the first time (no caching):

Rendered recipes/preload.json.rabl (2194.9ms)

Ran hitting the full template cache (params matched, object the same):

Rendered recipes/preload.json.rabl (4.0ms)

Ran hitting partial caches (params rearranged):

Rendered recipes/preload.json.rabl (85.6ms)

Then hitting refresh again for full page cache:

Rendered recipes/preload.json.rabl (3.7ms)

Obviously this is a really heavy "API call" with lots of detail being dumped out. Consider it a worst-case example.


nesquena commented Mar 21, 2012

Thanks to all of you for testing this and especially to @databyte for putting this together. 0.6.2 has the caching support baked in. Please try it and confirm that everything is working as expected!

@nesquena nesquena closed this Mar 21, 2012
