Queue rate limiting #7

Closed
jasrusable opened this issue Jun 9, 2020 · 3 comments
Comments

@jasrusable (Contributor)

Description

This issue covers implementing rate limiting in Conveyor MQ and assessing the various implementation options and their trade-offs.

Implementations

Queues can either be rate-limited on the producer side or the consumer side.

Producer side

Rate limiting on the producer side requires that tasks be scheduled according to the rate-limit constraints and then enqueued by an orchestrator (see the sketch after the pros/cons below).

Pros

  • Task rate is centrally constrained in the queue
  • No complexity in consumers/workers

Cons

  • Stateful
  • Complexity in enqueuing/scheduling tasks
  • Bound by orchestrator interval
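
For concreteness, here is a rough sketch of what a producer-side orchestrator could look like. All names here (`scheduleTask`, `enqueueTask`, the interval values) are hypothetical and do not reflect Conveyor MQ's actual API; this just illustrates the shape of the approach.

```ts
// Rough sketch only: producers add to a scheduled buffer, and a single
// orchestrator releases tasks into the queue at the allowed rate.
type Task = { id: string; data?: unknown };

const scheduledTasks: Task[] = [];   // tasks waiting to be released (the "stateful" part)
const maxTasksPerTick = 10;          // rate limit: at most 10 tasks per tick
const orchestratorIntervalMs = 1000; // release granularity is bound by this interval

async function enqueueTask(task: Task): Promise<void> {
  // Placeholder for the real enqueue call into the queue backend.
  console.log(`enqueued ${task.id}`);
}

// Producers never enqueue directly; they only add to the scheduled buffer.
function scheduleTask(task: Task): void {
  scheduledTasks.push(task);
}

// The orchestrator is the single component that enforces the rate.
setInterval(() => {
  const batch = scheduledTasks.splice(0, maxTasksPerTick);
  for (const task of batch) {
    void enqueueTask(task);
  }
}, orchestratorIntervalMs);

scheduleTask({ id: 'task-1' });
```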

Consumer side

Rate limiting on the consumer side requires that tasks are only taken from the queue for processing at the required rate, which means the consumers/workers must coordinate when they can and cannot take a task to process (see the sketch after the pros/cons below).

Pros

  • Not stateful
  • No complexity when enqueuing/scheduling tasks or orchestrating the queue

Cons

  • Extra overhead is added to consumers/workers to coordinate when they can and cannot take a task to process, plus the state needed for that coordination.
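
And a rough sketch of the consumer-side approach, where each worker acquires from a limiter before taking a task. `takeTask` and `processTask` are hypothetical stand-ins for the worker's real internals, and the limiter below is in-process only; coordinating the rate across multiple worker processes would need the same logic backed by a shared store such as Redis.

```ts
// Rough sketch only: a worker loop gated by a simple fixed-window limiter.
type Task = { id: string; data?: unknown };

async function takeTask(): Promise<Task | null> {
  // Placeholder: pop the next task from the queue backend, or null if the queue is empty.
  return null;
}

async function processTask(task: Task): Promise<void> {
  console.log(`processing ${task.id}`);
}

// The extra coordination state the con above refers to.
function createLimiter(maxPerSecond: number) {
  let windowStart = Date.now();
  let taken = 0;
  return async function acquire(): Promise<void> {
    for (;;) {
      if (Date.now() - windowStart >= 1000) {
        windowStart = Date.now();
        taken = 0;
      }
      if (taken < maxPerSecond) {
        taken += 1;
        return;
      }
      await new Promise((resolve) => setTimeout(resolve, 50)); // back off until the window resets
    }
  };
}

async function workerLoop(): Promise<void> {
  const acquire = createLimiter(10); // take at most 10 tasks per second
  for (;;) {
    await acquire();                 // coordinate before taking a task
    const task = await takeTask();
    if (task) {
      await processTask(task);
    } else {
      await new Promise((resolve) => setTimeout(resolve, 200)); // queue empty, poll again shortly
    }
  }
}

workerLoop();
```
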
@jasrusable jasrusable self-assigned this Jun 9, 2020
@jasrusable jasrusable added the enhancement New feature or request label Jun 9, 2020
@jasrusable jasrusable mentioned this issue Jun 10, 2020
@eugene1g commented Jun 12, 2020

Might be related: in Node, I'm happily using https://github.com/vercel/async-sema for rate-limiting and https://github.com/Vincit/tarn.js for resource pools.

(Not suggesting that this is a user-space problem: just mentioning libs that might help with adding this feature to Conveyor)
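
To illustrate how a library like async-sema could slot into the consumer-side option: its `RateLimit` helper can gate a worker loop, though on its own it only limits a single process and would not coordinate a shared rate across multiple workers. The `takeAndProcessTask` function below is a hypothetical stand-in.

```ts
// Illustration only: async-sema's RateLimit gating a single worker process.
import { RateLimit } from 'async-sema';

const limit = RateLimit(5); // allow roughly 5 calls per second in this process

async function takeAndProcessTask(): Promise<void> {
  // Placeholder for taking a task from the queue and processing it.
  console.log('took and processed one task');
}

async function workerLoop(): Promise<void> {
  for (;;) {
    await limit();             // wait until the limiter allows another task
    await takeAndProcessTask();
  }
}

workerLoop();
```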

@jasrusable (Contributor, Author)

> Might be related: in Node, I'm happily using https://github.com/vercel/async-sema for rate-limiting and https://github.com/Vincit/tarn.js for resource pools.
>
> (Not suggesting that this is a user-space problem: just mentioning libs that might help with adding this feature to Conveyor)

Hi @eugene1g :) Thanks for mentioning these. It is certainly useful to know about Tarn.js and async-sema, and I'll keep them in mind when thinking about how to solve rate limiting in Conveyor.

I have also been reading the following articles on designing and building a distributed rate limiter for Conveyor; they have been quite useful so far:
https://engineering.linecorp.com/en/blog/high-throughput-distributed-rate-limiter/
https://dzone.com/articles/building-a-distributed-rate-limiter-that-scales-ho
https://tyk.io/docs/basic-config-and-security/control-limit-traffic/rate-limiting/

I am going to apply my mind over the weekend and hopefully have a prototype ready in the next couple of days.

@jasrusable (Contributor, Author)

This has been implemented using the consumer-side option, making use of node-rate-limiter-flexible.
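
For readers unfamiliar with that library, here is a minimal sketch of how a Redis-backed limiter from rate-limiter-flexible can gate consumer-side task processing. This is not Conveyor MQ's actual code: the queue call is a placeholder and the key/option values are examples.

```ts
// Sketch of a consumer-side limiter shared across workers via Redis.
import Redis from 'ioredis';
import { RateLimiterRedis, RateLimiterRes } from 'rate-limiter-flexible';

const limiter = new RateLimiterRedis({
  storeClient: new Redis(),
  keyPrefix: 'queue-rate-limit',
  points: 10,  // allow 10 tasks...
  duration: 1, // ...per second, shared by every worker using this Redis instance
});

async function takeAndProcessNextTask(): Promise<void> {
  try {
    await limiter.consume('my-queue'); // spend one point per task taken
  } catch (rejection) {
    // Limit reached: wait until the current window allows another task.
    const msBeforeNext = (rejection as RateLimiterRes).msBeforeNext;
    await new Promise((resolve) => setTimeout(resolve, msBeforeNext));
    return;
  }
  // Placeholder for the real "take a task from the queue and process it" logic.
  console.log('took and processed one task');
}
```

Because the counter lives in Redis, every worker draws from the same budget, which is what lets the consumer-side approach enforce a single rate across processes without a central orchestrator.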
