# Perforated

Intelligent JSON collection caching. **Note:** this library has been deprecated.



The most expensive part of serving a JSON request is converting the serialized records into JSON. Perforated handles the messy task of storing and retrieving all of the JSON for a particular set of models as efficiently as possible. It achieves this in two ways:

  1. Storing the final JSON output, not marshalled objects
  2. Retrieving the JSON for as many objects at once as possible, then filling in the remaining JSON as though it was always there. This is where the term 'perforated' comes from.


## Configuration

Perforated is mildly configurable, mostly to allow for easy testing. If you are so inclined, however, you can customize the backing cache. The default cache store is `ActiveSupport::Cache::MemoryStore`, which is fast but has no persistence.

Within a Rails project the simplest option is to use the Rails cache:

```ruby
Perforated.configure do |config|
  config.cache = Rails.cache
end
```

Outside of a Rails project you may wish to use something like Dalli by itself:

```ruby
require 'dalli'

Perforated.configure do |config|
  config.cache ='localhost')
end
```

The standard library's JSON parser is great for most uses, and thus it is the default. Sometimes, however, you may want a bit more control or performance:

```ruby
require 'oj'

Perforated.configure do |config|
  config.json = Oj
end
```

## Usage

Wrap any collection that you want to serialize in a cache instance and then call `to_json` on it. Not much to it!

```ruby
perforated = # any enumerable collection of records
perforated.to_json
```

Any objects that have been cached will be retrieved unaltered. Any missing objects (cache misses) will be serialized, inserted back into the collection, and written into the cache.

Perforated supports reconstructing rooted objects, the likes of which can be output by ActiveModelSerializers. Serialized object collections may also have associations serialized within the same cache key. After the cached objects are fetched they will be merged together into flattened namespaces. For example, given a serialized representation like this:

```json
{ "posts":   { "id": 1, "author_id": 1, "title": "Greatness" },
  "authors": [{ "id": 1, "name": "Myself" }] }
```

```json
{ "posts":   { "id": 2, "author_id": 2, "title": "Failure" },
  "authors": [{ "id": 2, "name": "Somebody" }] }
```

The reconstructed and flattened representation can be retrieved:

```ruby
perforated.to_json(rooted: true) #=> {
#   "posts": [
#     { "id": 1, "author_id": 1, "title": "Greatness" },
#     { "id": 2, "author_id": 2, "title": "Failure" }
#   ],
#   "authors": [
#     { "id": 1, "name": "Myself" },
#     { "id": 2, "name": "Somebody" }
#   ]
# }
```

## Custom Key Strategy

The default cache key strategy is to delegate back to each object to construct its own cache key. This is useful for an object like a serializer that can implement its own `cache_key` method.

```ruby
class MySerializer
  attr_reader :object

  def initialize(object)
    @object = object
  end

  def cache_key
    [object, scope]
  end
end
```

However, if you are just serializing models or objects that don't have a custom `cache_key` method, you can provide a custom key strategy.

```ruby
module CustomStrategy
  def self.expand_cache_key(object)
    [, object.updated_at].join('/')
  end
end

perforated =, CustomStrategy)
```


## Installation

Add this line to your application's Gemfile:

```ruby
gem 'perforated'
```

And then execute:

```shell
$ bundle
```

Or install it yourself as:

```shell
$ gem install perforated
```


## Contributing

  1. Fork it
  2. Create your feature branch (`git checkout -b my-new-feature`)
  3. Commit your changes (`git commit -am 'Add some feature'`)
  4. Push to the branch (`git push origin my-new-feature`)
  5. Create a new Pull Request