Server-side response caching plugin for hapi


tacky adds a new handler named cache that can be used on any route that uses the GET method. tacky will first try to serve the value from the server cache if it is present. If the value is not in the server cache, it calls hydrate(), replies with the result, and then caches the value in the server cache for subsequent requests. tacky stores values in a hapi server cache provision; it does not just set the response cache headers.
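The cache-first flow described above can be sketched without hapi at all. This is a simplified illustration (a plain Map standing in for the hapi server cache provision, and cachedHandler is a hypothetical helper, not part of tacky's API):

// Simplified sketch of the cache-first flow tacky implements.
// Assumption: a plain Map stands in for the hapi catbox cache provision.
const cache = new Map();

function cachedHandler(key, hydrate, reply) {
    if (cache.has(key)) {
        // Cache hit: reply immediately, hydrate is never called
        return reply(null, cache.get(key));
    }
    // Cache miss: compute the value via hydrate...
    hydrate((err, value) => {
        if (err) {
            return reply(err);
        }
        cache.set(key, value);   // ...store it for subsequent requests...
        reply(null, value);      // ...and reply with it
    });
}

On the first call for a given key, hydrate runs and its result is stored; every later call for that key is answered from the cache without invoking hydrate again.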


See the API Reference


copied from examples/default.js

const Assert = require('assert');
const Http = require('http');
const Hapi = require('hapi');
const Tacky = require('tacky');

const server = new Hapi.Server();
server.connection({ port: 9001 });

server.register({ register: Tacky }, (err) => {

    Assert.ifError(err);

    server.route({
        method: 'get',
        path: '/',
        config: {
            handler: {
                cache: {
                    hydrate: (request, callback) => {
                        // Fetch the Google home page
                        Http.get('http://www.google.com', (res) => {
                            const buffers = [];
                            res.on('data', (chunk) => {
                                buffers.push(chunk);
                            });
                            res.on('end', () => {
                                // Simulate a slow upstream before replying
                                setTimeout(() => {
                                    callback(null, Buffer.concat(buffers).toString());
                                }, 1000);
                            });
                        });
                    }
                }
            }
        }
    });

    server.start(() => {
        console.log('Server started at ' + server.info.uri);
    });
});

When the first request comes in for "/", the hydrate method is called. We fetch the Google home page and, after 1000 milliseconds, call back with the result. If you make a second request to "/", you should notice the delay is gone and the response is almost instantaneous: the original response has been cached and is sent straight back to the client.

If you are testing with a browser, you should notice that the max-age value in the cache header decrements on each request. tacky sets the client cache header ("cache-control: max-age=3566, must-revalidate") based on the ttl options. The cache header ttl is randomized so that the server isn't slammed by multiple requests expiring at the same time; the goal is to stagger the cache header expiration across different clients.
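The staggering described above can be sketched as a simple calculation. This is only an illustration of the idea; the 60-second jitter window is an assumption, not tacky's actual value:

// Illustrative only: subtract a small random jitter from the configured ttl
// so different clients' cached copies expire at slightly different times.
// The 60-second jitter window is an assumption, not tacky's actual logic.
const ttlSeconds = 3600;                        // configured ttl (1 hour)
const jitter = Math.floor(Math.random() * 60);  // 0-59 seconds of jitter
const maxAge = ttlSeconds - jitter;             // e.g. 3566

const header = 'cache-control: max-age=' + maxAge + ', must-revalidate';
console.log(header);

Because each client receives a slightly different max-age, their cached copies do not all expire and re-request the resource at the same instant.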