Download Rate Limiting & Concurrent Downloads #89
romen-h opened this issue:

I was wondering if it would be possible to add arguments to control the following behaviour:

- Download rate limit: something like -ratelimit x, which would ensure the download rate of any individual worker does not exceed x MB/s.
- Concurrent downloads (workers): something like -workers x, which would ensure x workers are spawned to do the downloads in parallel.

These two arguments would allow me to use this script as a scheduled task or service that does not consume all of my bandwidth in the background. Downloading one file at a time also helps with resuming if the script is terminated, since there will be only one partial file that needs to be re-downloaded instead of 4 partial files from cancelling.
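For illustration, here is a minimal sketch of how the two proposed flags could be declared with argparse; the flag names come from the request above, while the defaults and help text are assumptions, not project code:

```python
# Sketch only: flag names from the feature request; defaults are hypothetical.
import argparse

parser = argparse.ArgumentParser(description="downloader (hypothetical wiring)")
parser.add_argument("-ratelimit", type=float, default=0.0,
                    help="per-worker download cap in MB/s (0 = unlimited)")
parser.add_argument("-workers", type=int, default=4,
                    help="number of parallel download workers")
args = parser.parse_args()
print(args.ratelimit, args.workers)
```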
Maintainer: Unfortunately no. I can expose the maximum number of simultaneous downloads (each of which gets one worker), but the module used doesn't allow rate limiting, and adding that capability to the requests module is out of scope.
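In practice, a worker cap like that usually comes down to a bounded thread pool. Here is a minimal sketch, where download_file and the URL list are hypothetical stand-ins rather than names from this project:

```python
from concurrent.futures import ThreadPoolExecutor

import requests

def download_file(url: str) -> None:
    # Hypothetical worker: stream one file to disk.
    resp = requests.get(url, stream=True, timeout=30)
    resp.raise_for_status()
    with open(url.rsplit("/", 1)[-1], "wb") as fh:
        for chunk in resp.iter_content(chunk_size=1 << 20):
            fh.write(chunk)

urls = ["https://example.com/a.bin", "https://example.com/b.bin"]  # placeholders

# max_workers caps the number of simultaneous downloads; each URL gets
# one worker, matching the "one worker each" behaviour described above.
with ThreadPoolExecutor(max_workers=2) as pool:
    list(pool.map(download_file, urls))
```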
romen-h: Thanks for responding. I will try to find some external way to rate limit just this script, maybe with domain-based QoS or something. I'm still interested in the argument to set the number of workers, though even that might not be necessary if download resuming works: downloading one file at a time would mean only one part has to be restarted, and more parts may have downloaded successfully before an interruption.
romen-h: Also, if you point me to the lines where the HTTP requests for the "chunks" are happening, I might be able to implement this and open a PR if it works. Apparently it's easy to implement rate limiting with the Python requests module: you just put a sleep between GET requests for chunks so that your overall rate will not exceed chunk_size / sleep_time.
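As a rough sketch of that idea (the URL, filename, and chunk/limit values below are placeholders): sleeping a fixed interval after each chunk bounds the long-run average rate by chunk_size / sleep_time:

```python
import time

import requests

CHUNK_SIZE = 1 << 20                    # 1 MiB per chunk (placeholder value)
RATE_LIMIT = 10 * (1 << 20)             # target cap: 10 MiB/s (placeholder)
SLEEP_TIME = CHUNK_SIZE / RATE_LIMIT    # seconds to pause after each chunk

resp = requests.get("https://example.com/big.bin", stream=True, timeout=30)
resp.raise_for_status()
with open("big.bin", "wb") as fh:
    for chunk in resp.iter_content(chunk_size=CHUNK_SIZE):
        fh.write(chunk)
        # Fixed pause per chunk: average rate <= CHUNK_SIZE / SLEEP_TIME,
        # since each chunk takes at least SLEEP_TIME seconds of wall time.
        time.sleep(SLEEP_TIME)
```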
Maintainer: If you're okay with that, I can implement it, but it's a very coarse rate limit, so you should expect to go over your set limit, significantly so, at times.
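The coarseness comes from the fixed sleep ignoring how long each chunk actually took to arrive: every chunk still transfers at full link speed, and only the long-run average is bounded. One common refinement, shown as a sketch under the same placeholder assumptions (paced_chunks is hypothetical, not anything in this repository), is to sleep only the unused part of each chunk's time budget:

```python
import time

def paced_chunks(resp, chunk_size, rate_limit):
    """Yield chunks of a streamed requests response, keeping the average
    rate at or below rate_limit bytes/second (illustrative sketch)."""
    budget = chunk_size / rate_limit          # seconds each chunk may take
    chunks = resp.iter_content(chunk_size=chunk_size)
    while True:
        start = time.monotonic()
        chunk = next(chunks, None)            # the network read happens here
        if chunk is None:
            break
        yield chunk
        spent = time.monotonic() - start
        if spent < budget:
            time.sleep(budget - spent)        # sleep only the unused budget
```

Even then, each chunk still arrives as a burst at link speed, which is why brief excursions above the cap are unavoidable with this kind of approach.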