transfer rate limit #797

Closed
jkhunter opened this issue Oct 18, 2018 · 4 comments
Labels
question A question (converts to discussion)

Comments

@jkhunter

Hello, is there an option to limit the transfer rate of a response? I read the documentation about streams and thought it would be possible to implement a custom read interface to limit the rate, but I'm not sure this would be the best solution, because the read implementation would have to block in order to enforce the limit.
Is there a better solution?
Thanks in advance

@jebrosen
Collaborator

Limiting the transfer rate is not usually desirable, especially not when using worker threads. This (might) be easier / less horrible under async, but I'm still not sure why this is desirable in the first place. Do you have a particular motivation for doing so?

@jkhunter
Author

Thanks for the response.

Imagine a streaming service like Netflix. They probably limit the transfer rate so that a user downloads only as much video as is currently needed, plus some buffer. It's not desirable to download the whole video in a second, only the parts that are needed plus some buffer.

A second example: a quality-of-service guarantee on transfer rate. To achieve this, all workers need to be limited to a lower rate so that no single worker can use the full bandwidth.

And third: Look here

Limiting the transfer rate is a standard piece of functionality for QoS.

@jebrosen
Collaborator

Streaming video and similar use cases should use Range requests; Range requests are not facilitated directly by Rocket, but they are not very difficult to implement manually where needed.
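
For illustration, here is a rough sketch of such a manual Range response, assuming Rocket 0.4's synchronous `Response` builder. The function name `range_response` and the parsing logic are invented for the example; it handles only a single "bytes=start-end" range and does no validation:

```rust
use std::fs::File;
use std::io::{self, Read, Seek, SeekFrom};

use rocket::http::Status;
use rocket::response::Response;

// Hypothetical helper: build a 206 Partial Content response for one
// "bytes=start-end" range. A real implementation must validate the header,
// handle open-ended and multiple ranges, and return 416 on invalid ranges.
fn range_response(path: &str, range: &str) -> io::Result<Response<'static>> {
    let mut file = File::open(path)?;
    let total = file.metadata()?.len();

    // Naive parse of "bytes=start-end" (or "bytes=start-").
    let spec = range.trim_start_matches("bytes=");
    let mut parts = spec.splitn(2, '-');
    let start: u64 = parts.next().and_then(|s| s.parse().ok()).unwrap_or(0);
    let end: u64 = parts
        .next()
        .and_then(|s| s.parse().ok())
        .unwrap_or(total.saturating_sub(1))
        .min(total.saturating_sub(1));

    // Seek to the requested offset and stream only the requested slice.
    file.seek(SeekFrom::Start(start))?;

    Ok(Response::build()
        .status(Status::PartialContent)
        .raw_header("Accept-Ranges", "bytes")
        .raw_header("Content-Range", format!("bytes {}-{}/{}", start, end, total))
        .streamed_body(file.take(end - start + 1))
        .finalize())
}
```

Wiring this into a route would additionally require reading the incoming Range header, for example via a request guard.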

Limiting transfer rate for QoS and throttling purposes is a compelling use case. I think it's specialized enough that it lies outside the scope of Rocket itself, but it could easily fit into rocket_contrib if a high-quality implementation existed.

Rocket will support async at some point in the future, and it will almost certainly be possible to implement a proper Throttled Responder type then. Your best options today (that I've thought of so far) are: 1) implement a custom read interface or 2) place Rocket behind something like nginx or Apache and configure throttling there. With option 1 you would need to work around the thread-blocking problem, perhaps by adding more worker thread capacity, but it's not a very good solution.
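
To make option 1 concrete, a minimal sketch follows. `ThrottledReader` and `throttled_file` are invented names, not part of Rocket; the only Rocket piece used is the synchronous `Stream` responder from Rocket 0.3/0.4. The sleep inside `read` is exactly the thread-blocking problem mentioned above:

```rust
use std::fs::File;
use std::io::{self, Read};
use std::thread;
use std::time::{Duration, Instant};

use rocket::response::Stream;

// Invented for this example: a Read adapter that caps the average rate
// at roughly `bytes_per_sec`.
struct ThrottledReader<R: Read> {
    inner: R,
    bytes_per_sec: u64,
    started: Instant,
    bytes_read: u64,
}

impl<R: Read> ThrottledReader<R> {
    fn new(inner: R, bytes_per_sec: u64) -> Self {
        ThrottledReader { inner, bytes_per_sec, started: Instant::now(), bytes_read: 0 }
    }
}

impl<R: Read> Read for ThrottledReader<R> {
    fn read(&mut self, buf: &mut [u8]) -> io::Result<usize> {
        let n = self.inner.read(buf)?;
        self.bytes_read += n as u64;

        // Earliest time by which this many bytes are "allowed" to have been sent.
        let allowed = Duration::from_millis(self.bytes_read * 1000 / self.bytes_per_sec);
        let elapsed = self.started.elapsed();
        if allowed > elapsed {
            // This sleep blocks the worker thread, which is the drawback above.
            thread::sleep(allowed - elapsed);
        }
        Ok(n)
    }
}

// Hypothetical handler body: stream a file at roughly 64 KiB/s.
fn throttled_file(path: &str) -> io::Result<Stream<ThrottledReader<File>>> {
    Ok(Stream::from(ThrottledReader::new(File::open(path)?, 64 * 1024)))
}
```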

jebrosen added the question label on Oct 22, 2018
@SergioBenitez
Member

Indeed, as @jebrosen notes, currently, you must implement a custom Reader or use something like nginx or Apache in front of Rocket to handle this. Let's keep this in mind for #17.
