
purgeStale skips purge due to expiration ceil (assumption) making cache use stale values #7

Closed
itsuryev opened this issue Jun 29, 2022 · 2 comments


@itsuryev

I am trying to set up an API rate limiter using ttlcache:

const requestsCountByKeyCache = new TTLCache({
  ttl: thresholdPeriodInSeconds * 1000,
  noUpdateTTL: true,
});

The idea is that requests keep coming in and I count them under some key, e.g. API_TOKEN_xxxx. I do not refresh the TTL, so entries expire on their own and the count resets automatically.

And yet I am hitting rate-limit errors. I've set the limit to 3 and am sending 2 requests per second; here is the log:

2022-06-29T06:04:41.811Z API_TOKEN_my.app 1
2022-06-29T06:04:41.812Z API_TOKEN_my.app 2
2022-06-29T06:04:42.812Z purgeStale
2022-06-29T06:04:42.812Z purge done, current value: 2 new data Map(0) {}
2022-06-29T06:04:42.815Z API_TOKEN_my.app 1
2022-06-29T06:04:42.817Z API_TOKEN_my.app 2
2022-06-29T06:04:43.809Z API_TOKEN_my.app 3
2022-06-29T06:04:43.811Z API_TOKEN_my.app 4
2022-06-29T06:04:43.814Z purgeStale
2022-06-29T06:04:43.814Z purgeStale, exp > n 8470 8468.831500053406
2022-06-29T06:04:44.809Z API_TOKEN_my.app 5
2022-06-29T06:04:44.810Z API_TOKEN_my.app 6
2022-06-29T06:04:45.810Z purgeStale
2022-06-29T06:04:45.811Z purge done, current value: 6 new data Map(0) {}

purgeStale, exp > n 8470 8468.831500053406 <- here the purge skips resetting the data, and it looks like the next purge is only queued on the next set(). No expiration check is done when getting a value from the cache either, so get() keeps returning the stale count.
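To make the suspected mechanism concrete, here's a standalone sketch (a guess at the internals, not ttlcache's actual code; the constants are chosen only to reproduce the exp/n pair from the log above):

```typescript
// Hypothetical sketch: set() stores an expiration rounded up past the raw
// deadline, while the purge timer fires at (and compares against) the
// un-rounded clock value. The entry then looks "not yet stale" and survives.
const ttl = 1000
const setAt = 7468.8315                 // performance.now() at set() time
const exp = Math.ceil(setAt + ttl) + 1  // rounded-up expiration -> 8470
const n = setAt + ttl                   // raw deadline when the timer fires
const skipped = exp > n                 // true: purgeStale keeps the entry
console.log('purgeStale, exp > n', exp, n, skipped)
```

Whether the real rounding is exactly a ceil plus padding is unknown; the point is only that any rounding-up of `exp` past the timer's `n` produces the skipped purge seen in the log.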

Thank you very much for your work.

@isaacs isaacs closed this as completed in 9edb0b9 Jun 30, 2022
@isaacs
Owner

isaacs commented Jun 30, 2022

Ah, I was able to reproduce it; this answers a puzzle in #3 that'd been bugging me for a while now. Fixed in 1.1.0.

Here's a little rate limiter I whipped up to reproduce:

import TTLCache from './'
import type {Options as TTLCacheOptions} from './'

export interface Options<K> extends TTLCacheOptions<K, number> {
  maxHits: number
}

export class RateLimiter<K> extends TTLCache<K, number> {
  readonly maxHits: number

  constructor (options: Options<K>) {
    options.updateAgeOnGet = false
    options.noUpdateTTL = true
    super(options)
    if (!options.maxHits || typeof options.maxHits !== 'number' || options.maxHits <= 0) {
      throw new TypeError('must specify a positive number of max hits allowed within the period')
    }
    this.maxHits = options.maxHits
  }

  // call limiter.hit(key) and it'll return true if it's allowed,
  // or false if it should be rejected.
  hit (key: K) {
    const value = (this.get(key) || 0) + 1
    this.set(key, value)
    // <= rather than < so that exactly maxHits hits are allowed per window
    return value <= this.maxHits
  }
}

Example:

const rl = new RateLimiter<string>({ ttl: 100, maxHits: 10 })
const run = () => {
  const interval = setInterval(() => {
    const allowed = rl.hit('test')
    console.log(Date.now(), allowed, rl.get('test'))
    if (!allowed) {
      console.error('> > > > > hit rate limit')
      clearInterval(interval)
      setTimeout(run, 1)
    }
  }, 10)
}
run()

@isaacs
Owner

isaacs commented Jun 30, 2022

By the way, though, this isn't a very clever rate limiter, so I wouldn't make it too load-bearing. You could easily do stuff like this, assuming a rate limit of 100 hits every minute:

t=0 hit once (schedule purge for t=60)
...
t=59 send 99 hits
t=60 purge
t=61 send 100 hits (made 199 hits in 2 seconds!)
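The timeline above can be simulated with a toy fixed-window counter (illustrative only; the window/reset logic here is not ttlcache's):

```typescript
// Toy fixed-window counter: the count resets when the window that began at
// the first hit expires, so a burst straddling the reset nearly doubles
// the effective limit.
const windowMs = 60_000
const max = 100
let windowStart = -Infinity
let count = 0
const hit = (now: number): boolean => {
  if (now - windowStart >= windowMs) {  // window expired: start a new one
    windowStart = now
    count = 0
  }
  if (count >= max) return false        // over the limit for this window
  count++
  return true
}

let accepted = 0
if (hit(0)) accepted++                                    // t=0s: 1 hit
for (let i = 0; i < 99; i++) if (hit(59_000)) accepted++  // t=59s: 99 hits
for (let i = 0; i < 100; i++) if (hit(61_000)) accepted++ // t=61s: fresh window
// all 200 hits are accepted, 199 of them within two seconds of each other
```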

Here's one that uses TTLCache's auto-purge to keep a time series of hits, by keeping a TTLCache for each key the limiter knows about. A bit more CPU and memory usage, but probably still not too bad.

import type {Options as TTLCacheOptions} from './'
import TTLCache from './'

export interface Options {
  window: number
  max: number
}

interface RLEntryOptions extends TTLCacheOptions<number, boolean> {
  onEmpty: () => any
}

class RLEntry extends TTLCache<number, boolean> {
  onEmpty: () => any
  constructor(options: RLEntryOptions) {
    super(options)
    this.onEmpty = options.onEmpty
  }
  purgeStale() {
    const ret = super.purgeStale()
    if (this.size === 0 && ret) {
      this.onEmpty()
    }
    return ret
  }
}

class RateLimiter<K> extends Map<K, TTLCache<number, boolean>> {
  window: number
  max: number
  constructor(options: Options) {
    super()
    this.window = options.window
    this.max = options.max
  }
  hit(key: K) {
    const c = super.get(key) || new RLEntry({
      ttl: this.window,
      onEmpty: () => this.delete(key),
    })

    this.set(key, c)

    // >= rather than > so no more than this.max hits land within the window
    if (c.size >= this.max) {
      // rejected, too many hits within window
      return false
    }
    c.set(performance.now(), true)
    return true
  }

  count (key: K) {
    const c = super.get(key)
    return c ? c.size : 0
  }
}
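Stripped of the cache machinery, the TTLCache-per-key limiter above amounts to a sliding-window counter; a minimal standalone version (names illustrative, not part of ttlcache) behaves like this under the boundary burst from the previous comment:

```typescript
// Minimal sliding-window counter: keep raw hit timestamps and drop the ones
// that have aged out of the window before each decision.
class SlidingWindow {
  private hits: number[] = []
  constructor(private windowMs: number, private max: number) {}
  hit(now: number): boolean {
    // discard timestamps older than the window
    this.hits = this.hits.filter(t => now - t < this.windowMs)
    if (this.hits.length >= this.max) return false  // window is full
    this.hits.push(now)
    return true
  }
}

const sw = new SlidingWindow(60_000, 100)
for (let i = 0; i < 99; i++) sw.hit(59_000)  // 99 hits late in the window
let allowedLate = 0
for (let i = 0; i < 100; i++) if (sw.hit(61_000)) allowedLate++
// only 1 of the t=61s burst is admitted: the 99 hits from t=59s are still
// inside the 60-second window, so the boundary burst is capped at max total
```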

All these examples are licensed under the same ISC license as this repo, but truly don't expect any support if they have bugs, lol
