
Uber Cache - Async based caching module that can support any caching engine.


Uber Cache has been built so you can have a consistent cache interface across your entire system, regardless of the caching engine. All caching engines support TTL and LRU and have an async/callback style interface. This means you can easily implement your own engines without changing the interface in your application.
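To illustrate the shape of that consistent interface, here is a minimal in-memory engine sketch. The method names and callback signatures below are assumptions for illustration, not the exact uber-cache engine contract:

```javascript
// Illustrative engine sketch: error-first callbacks, TTL on set.
// Method names (set/get/delete) are assumed, not the confirmed contract.
function MemoryEngine() {
  this.store = {}
}

MemoryEngine.prototype.set = function (key, value, ttlInSeconds, callback) {
  this.store[key] = { key: key, value: value, expires: Date.now() + ttlInSeconds * 1000 }
  callback(null, this.store[key])
}

MemoryEngine.prototype.get = function (key, callback) {
  var item = this.store[key]
  if (!item || item.expires < Date.now()) {
    return callback(null, undefined)
  }
  callback(null, item.value)
}

MemoryEngine.prototype.delete = function (key, callback) {
  delete this.store[key]
  callback(null)
}
```

Because every engine exposes the same callback-style methods, swapping one engine for another should not require changes in application code.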

Uber Cache also comes with a synchronous interface that supports TTL and LRU, but it only works in the memory of the current process: all other engines (Redis, Memcached, etc.) require evented IO.
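The following sketch (not the uber-cache API, just an illustration) shows why a synchronous TTL + LRU cache is only possible in-process: everything lives in local memory, so no evented IO is needed:

```javascript
// Illustrative synchronous TTL + LRU cache. A Map preserves insertion
// order, which makes the least recently used entry the first key.
function SyncLruCache(maxSize) {
  this.maxSize = maxSize
  this.store = new Map()
}

SyncLruCache.prototype.set = function (key, value, ttlInSeconds) {
  if (this.store.size >= this.maxSize && !this.store.has(key)) {
    // Evict the least recently used entry (first key in the Map)
    this.store.delete(this.store.keys().next().value)
  }
  this.store.delete(key)
  this.store.set(key, { value: value, expires: Date.now() + ttlInSeconds * 1000 })
}

SyncLruCache.prototype.get = function (key) {
  var item = this.store.get(key)
  if (!item) return undefined
  if (item.expires < Date.now()) {
    // TTL expired: drop the entry
    this.store.delete(key)
    return undefined
  }
  // Refresh recency by moving the key to the end of the Map
  this.store.delete(key)
  this.store.set(key, item)
  return item.value
}
```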


  npm install uber-cache


Asynchronous Interface

Most of the useful caching engines have an async interface, due to the evented IO they require, so it is necessary to use a callback style when manipulating the cache.

var UberCache = require('uber-cache') // assuming the module exports a constructor
  , cache = new UberCache()

var ttlInSeconds = 1
  , someData = { some: 'data' }

cache.set('some-key', someData, ttlInSeconds, function (error, cachedItem) {
  if (error) {
    // Handle the error
    return false
  }
  console.log('Cache written key:' + cachedItem.key + ' value:' + cachedItem.value)
})


Uber Cache engines are decoupled from the main project. Unlike other modules that force you to install dependencies for things you're not going to use, Uber Cache engines are self-contained modules that you include in your project and pass to Uber Cache on instantiation.
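A sketch of that decoupling idea: the cache front-end depends only on the callback interface, so any self-contained engine object can be passed in at instantiation. The names here are illustrative, not the exact uber-cache API:

```javascript
// Illustrative front-end that delegates to whatever engine it is given.
function Cache(engine) {
  this.engine = engine
}

Cache.prototype.set = function (key, value, ttlInSeconds, callback) {
  this.engine.set(key, value, ttlInSeconds, callback)
}

Cache.prototype.get = function (key, callback) {
  this.engine.get(key, callback)
}

// Any object with matching set/get signatures works as an engine:
var inMemoryEngine = {
  store: {},
  set: function (key, value, ttlInSeconds, callback) {
    this.store[key] = value
    callback(null, { key: key, value: value })
  },
  get: function (key, callback) {
    callback(null, this.store[key])
  }
}

var cache = new Cache(inMemoryEngine)
```

Swapping `inMemoryEngine` for, say, a Redis-backed engine would leave every `cache.set`/`cache.get` call site untouched.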

Currently the following engines are available:


Paul Serby, follow me on Twitter


Licensed under the New BSD License
