Please add support for multiple concurrent requests #5774
Comments
Thanks, but this description sounds as if you're asking for a new feature/change. We use this tracker for bugs and issues only; we put ideas to work on in the future in the TODO document. We basically drown in good ideas, so they don't do much good in our tracker. If you really want to see this happen, start working on an implementation and submit a PR for it, or join the mailing list and drum up more interest for it and see what help you can get from others!
I can't find the mailing list address. I see that curl supports range requests, so I can manually issue multiple concurrent requests for segments of a file and then merge them, getting a higher transfer rate through more connections.
I agree, and I don't think it is very difficult. My ideal vision of an implementation would also not use a fixed number of parallel transfers; instead, curl could add further ranged connections adaptively. This way, if transfer B fails (for example, because the server does not serve ranges), the transfer can still complete, and N and M could be set to some sensible defaults. Still: this is not in my short-term plan to work on personally.
Your idea sounds more advanced, and both friendlier and smarter.
I have been using aria2 recently for concurrent downloads, but it becomes unstable once the bandwidth exceeds 200 Mbps.
Now that curl has parallel download support, is it possible to give each concurrent download a different byte-range header in the config file? For example, if I put this in my config.txt config file:

```
url = "http://www.example.com"
header = "Range: bytes=0-1024"
output = "chunk1"
url = "http://www.example.com"
header = "Range: bytes=1024-2048"
output = "chunk2"
url = "http://www.example.com"
header = "Range: bytes=2048-3072"
output = "chunk3"
```

and run, I get 3 chunks downloaded concurrently, but always with the first byte range (0-1024) instead of the specific byte range for each file. Is this impossible, or is my config file incorrect? Thank you for your help.
You probably need the "next" option between requests to make them use separate headers.
@dfandrich You are my hero :) I put --next between each request in the config file and it worked!!!
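For reference, the fix described above can be sketched as a config file with `next` separating the three ranged requests (URL, ranges, and output names taken from the earlier comment):

```
url = "http://www.example.com"
header = "Range: bytes=0-1024"
output = "chunk1"
next
url = "http://www.example.com"
header = "Range: bytes=1024-2048"
output = "chunk2"
next
url = "http://www.example.com"
header = "Range: bytes=2048-3072"
output = "chunk3"
```

A config file like this is passed with `curl -K config.txt`; whether `-Z` still parallelizes across `next` boundaries is discussed later in the thread.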
However, I think this is challenging when random-access storage performance is the bottleneck, e.g. on a hard disk or external storage. A regular single-connection transfer guarantees sequential writes, but a parallel transfer does not. aria2 has many options to tune how data is written to storage, but I could not find an optimal configuration; aria2 cannot provide a satisfactory balance among storage performance, reconnection overhead, and memory usage (see its piece-selection algorithm).
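To make the trade-off concrete: a segmented download writes each part at its own byte offset, which is exactly the non-sequential write pattern described above. A minimal sketch, assuming GNU dd and using curl's `file://` support so it runs offline (the file, ranges, and names are all illustrative, standing in for a real remote server):

```shell
# Each segment is written directly at its byte offset in out.bin, so no
# final merge pass is needed -- but the writes land non-sequentially.
printf 'abcdefghijklmnopqrstuvwxyz' > src.bin   # stand-in for the remote file
url="file://$PWD/src.bin"
seg() {  # seg <byte-range> <offset-in-10-byte-blocks>
  curl -s -r "$1" "$url" | dd of=out.bin bs=10 seek="$2" conv=notrunc status=none
}
seg 0-9   0 &   # first 10 bytes at offset 0
seg 10-19 1 &   # next 10 bytes at offset 10
seg 20-   2 &   # remainder at offset 20
wait
cmp src.bin out.bin && echo "offsets ok"
```

`conv=notrunc` keeps the concurrent writers from truncating each other's output; `seek=` only sets the starting offset, so each writer's own writes remain sequential within its segment.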
But then it won't run in parallel, will it? "If you're doing parallel transfers (with -Z), curl will do URLs in parallel only until the --next separator." (https://daniel.haxx.se/blog/2020/03/02/curl-ootw-next/)
I think what @AndriusCTR means is segmented downloads; the following is a good description of them and of how they differ from something like multithreading.
Well no, I don't mean segmented downloads. I mean parallel downloads: fetch 10 URLs at once, where those URLs require 10 different cookies (in the request headers).
curl now supports segmented requests through --range, but everything must be done manually. I can issue one ranged request per segment, then merge the parts:

cat part1 part2 part3 >> logo.png

This way I can use three TCP connections to download the file at the same time and push for more bandwidth.
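The manual workflow above can be sketched end to end. This version uses a local `file://` URL so it runs without network access; the file contents, byte ranges, and part names are illustrative stand-ins for a real download:

```shell
# Three ranged requests in parallel, then a merge -- the manual segmented
# download described above, against a local file so it runs offline.
printf 'abcdefghijklmnopqrstuvwxyz' > src.bin   # stand-in for the remote file
url="file://$PWD/src.bin"
curl -s -r 0-9   "$url" -o part1 &   # bytes 0-9
curl -s -r 10-19 "$url" -o part2 &   # bytes 10-19
curl -s -r 20-   "$url" -o part3 &   # bytes 20 to end
wait
cat part1 part2 part3 > merged.bin
cmp src.bin merged.bin && echo "merge ok"
```

Against an HTTP server the same `-r`/`--range` syntax applies, provided the server honors `Range` requests.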
It would be great if all of this were automatic, with something like:

curl -X 3 https://www.baidu.com/img/bd_logo1.png -o logo.png

That would make me very happy.