http-cache-middleware

High-performance connect-like HTTP cache middleware for Node.js, so your latency can drop to single-digit milliseconds 🚀

Uses cache-manager as the caching layer, so multiple storage engines are supported, e.g. Memory, Redis, ... https://www.npmjs.com/package/cache-manager

Install

npm i http-cache-middleware

Usage

const middleware = require('http-cache-middleware')()
const service = require('restana')()
service.use(middleware)

service.get('/cache-on-get', (req, res) => {
  setTimeout(() => {
    // keep response in cache for 1 minute if not expired before
    res.setHeader('x-cache-timeout', '1 minute')
    res.send('this is supposed to be a cacheable response')
  }, 50)
})

service.delete('/cache', (req, res) => {
  // ... the logic here changes the cache state

  // expire the cache keys using pattern
  res.setHeader('x-cache-expire', '*/cache-on-get')
  res.end()
})

service.start(3000)

Redis cache
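
The example below assumes the ioredis-backed store for cache-manager is installed alongside this middleware (package names as published on npm):

npm i cache-manager cache-manager-ioredis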

// redis setup
const CacheManager = require('cache-manager')
const redisStore = require('cache-manager-ioredis')
const redisCache = CacheManager.caching({
  store: redisStore,
  db: 0,
  host: 'localhost',
  port: 6379,
  ttl: 30
})

// middleware instance
const middleware = require('http-cache-middleware')({
  stores: [redisCache]
})

Why cache?

Because caching is the last mile for low latency distributed systems!

Enabling proper caching strategies will drastically reduce the latency of your system, as it reduces network round-trips, database calls and CPU processing.
For our services, we are talking about improvements in response times from X ms to ~2 ms, as an example.

Enabling cache for service endpoints

Enabling a response to be cached just requires the x-cache-timeout header to be set:

res.setHeader('x-cache-timeout', '1 hour')

Here we use the ms package to convert the timeout value to seconds. Please note that the millisecond unit is not supported!
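
As a quick sketch of which header values are accepted (assuming the values are parsed with the ms package, as stated above):

const ms = require('ms')

ms('1 minute') / 1000  // 60     -> cached for 60 seconds
ms('1 hour') / 1000    // 3600
ms('2 days') / 1000    // 172800
// sub-second values such as '500 ms' are not supported by the middleware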

Example on service using restana:

service.get('/numbers', (req, res) => {
  res.setHeader('x-cache-timeout', '1 hour')

  res.send([
    1, 2, 3
  ])
})

Caching on the browser side (304 status codes)

From version 1.2.x you can also use the HTTP-compatible Cache-Control header: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cache-Control. When using the Cache-Control header, you can omit the custom x-cache-timeout header, as the timeout can be passed using the max-age directive.

Direct usage:

res.setHeader('cache-control', 'private, no-cache, max-age=300')
res.setHeader('etag', 'cvbonrw6g00')

res.end('5 minutes cacheable content here....')

Indirect usage:

When using:

res.setHeader('x-cache-timeout', '5 minutes')

The middleware will now transparently generate default Cache-Control and ETag headers as described below:

res.setHeader('cache-control', 'private, no-cache, max-age=300')
res.setHeader('etag', 'ao8onrw6gbt') // random ETag value 

This enables browser clients to keep a copy of the response on their side, while still being forced to validate the cache state on the server before using the cached response, therefore supporting gateway-based cache invalidation.

NOTE: In order to fetch the generated Cache-Control and ETag headers, there has to be at least one cache hit.
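
For illustration, here is a minimal sketch of a client revalidation round-trip using Node's built-in http module; the endpoint, port and ETag value are assumptions taken from the examples above:

const http = require('http')

// the ETag value was previously received in a cached response (illustrative value)
const etag = 'ao8onrw6gbt'

http.get({
  host: 'localhost',
  port: 3000,
  path: '/cache-on-get',
  headers: { 'if-none-match': etag }
}, (res) => {
  console.log(res.statusCode) // 304 while the cached state is still valid
})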

Invalidating caches

Services can easily expire cache entries on demand, e.g. when the data state changes. Here we use the x-cache-expire header to indicate which cache entries to expire, using a matching pattern:

res.setHeader('x-cache-expire', '*/numbers')

Here we use the matcher package to evaluate the matching patterns.
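
As an illustration, assuming matcher's isMatch helper and cache keys of the form req.method + req.url (see "Custom cache keys" below), a pattern like */numbers behaves roughly like this:

const matcher = require('matcher')

matcher.isMatch('GET/numbers', '*/numbers')      // true  -> entry is expired
matcher.isMatch('GET/letters', '*/numbers')      // false -> entry is kept
matcher.isMatch('GET/api/numbers', '*/numbers')  // true  -> pattern also spans nested paths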

Example on service using restana:

service.patch('/numbers', (req, res) => {
  // ...

  res.setHeader('x-cache-expire', '*/numbers')
  res.send(200)
})

Invalidating multiple patterns

Sometimes it is necessary to expire cache entries using multiple patterns; this is also possible using the , separator:

res.setHeader('x-cache-expire', '*/pattern1,*/pattern2')

Custom cache keys

Cache keys are generated using: req.method + req.url. However, for indexing/segmenting requirements it makes sense to allow cache key extensions.

To do this, we simply recommend using middlewares to extend the keys before the caching checks happen:

service.use((req, res, next) => {
  req.cacheAppendKey = (req) => req.user.id // here cache key will be: req.method + req.url + req.user.id  
  return next()
})

In this example we also distinguish cache entries by user.id, which is very important for authorization reasons.

Disable cache for custom endpoints

You can also disable cache checks for certain requests programmatically:

service.use((req, res, next) => {
  req.cacheDisabled = true
  return next()
})
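
For example, a small sketch that skips the cache only for authenticated requests; the authorization check is just an illustrative condition:

service.use((req, res, next) => {
  // skip cache lookups and storage for requests carrying credentials (illustrative condition)
  if (req.headers.authorization) {
    req.cacheDisabled = true
  }
  return next()
})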

Want to contribute?

This is your repo ;)

Note: We aim for 100% code coverage, please consider it in your pull requests.

Related projects
