
Please add support for multiple concurrent requests #5774

Closed
f4nff opened this issue Aug 4, 2020 · 5 comments

Comments

@f4nff

f4nff commented Aug 4, 2020

Right now curl supports segmented requests through --range (or a Range header), but all of this has to be done manually. I can use

curl -D "dump1.txt" -H "Range: bytes=0-2000" https://www.baidu.com/img/bd_logo1.png -o part1
curl -D "dump2.txt" -H "Range: bytes=2001-4000" https://www.baidu.com/img/bd_logo1.png -o part2
curl -D "dump3.txt" -H "Range: bytes=4001-" https://www.baidu.com/img/bd_logo1.png -o part3

to request the segments, then merge them:

cat part1 part2 part3 > logo.png

so that I can download the file over three TCP connections at the same time and get more bandwidth. It would be great if all of this were automatic,

For example:

curl -X 3 https://www.baidu.com/img/bd_logo1.png -o logo.png
That would make me very happy.
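The manual split-and-merge above can be sketched as a small script. This is only an illustration of what such a mode might do, not curl functionality (in real curl, -X sets the HTTP request method; there is no built-in split flag): given the total file size and a part count, it prints one ranged curl command per part.

```shell
#!/bin/sh
# Sketch: given TOTAL_BYTES, PARTS, and URL, emit one curl command per
# byte range. The last part uses an open-ended range (bytes=START-) so
# rounding in the division cannot lose the tail of the file.
split_cmds() {
    total=$1; parts=$2; url=$3
    chunk=$(( total / parts ))
    i=0
    while [ "$i" -lt "$parts" ]; do
        start=$(( i * chunk ))
        if [ "$i" -eq $(( parts - 1 )) ]; then
            end=""                          # last part: read to EOF
        else
            end=$(( start + chunk - 1 ))    # inclusive range end
        fi
        echo "curl -H \"Range: bytes=${start}-${end}\" \"$url\" -o part$(( i + 1 ))"
        i=$(( i + 1 ))
    done
    # once all parts finish: cat part1 .. partN > output
}

split_cmds 6000 3 https://www.baidu.com/img/bd_logo1.png
```

Running the commands it prints in parallel and concatenating part1..part3 reproduces the manual three-connection download above.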

@bagder
Member

bagder commented Aug 4, 2020

Thanks, but this description sounds as if you're asking for a new feature/change. We use this tracker for bugs and issues only; ideas to work on in the future go in the TODO document. We basically drown in good ideas, so they don't do much good sitting in our tracker.

If you really want to see this happen, start working on an implementation and submit a PR for it or join the mailing list and talk up more interest for it and see what help from others you can get!

@f4nff
Author

f4nff commented Aug 4, 2020

I couldn't find the mailing list address. Since curl already supports Range requests, I can manually issue multiple concurrent requests for parts of a file and then merge them, so more connections give a higher transfer rate.
On that basis, implementing multiple concurrent requests automatically doesn't seem difficult. I would very much like to see this feature realized.

@bagder
Member

bagder commented Aug 4, 2020

it seems not difficult to implement multiple concurrent requests

I agree. I don't think it is very difficult. My ideal vision of an implementation would also not use a specified number of parallel transfers, but curl could:

  • Get the full file as transfer A
  • If after N seconds have passed and the transfer is expected to continue for >M seconds more, add a new transfer (B) that asks for the second half of A's content (and stop A at the middle).
  • If splitting up the work seemed to improve the transfer rate, it could be done again, then again, etc., up to a limit.

This way, if transfer B fails (because Range: isn't supported) it will let transfer A remain the single one.

N and M could be set to some sensible defaults.
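The heuristic in the steps above can be sketched as a small decision function. This is only a sketch of the described idea, not curl code; the function name, the rate estimate, and the default values of N and M are all assumptions.

```shell
#!/bin/sh
# Split when the transfer has run at least N seconds AND the rate so far
# suggests more than M seconds remain. (Defaults are assumptions.)
N=5     # seconds before we trust the rate estimate
M=10    # minimum estimated seconds remaining to make a split worthwhile

# should_split ELAPSED_SECONDS BYTES_DONE BYTES_TOTAL -> exit 0 if worth it
should_split() {
    elapsed=$1; done_b=$2; total_b=$3
    [ "$elapsed" -lt "$N" ] && return 1          # too early to judge
    [ "$done_b" -eq 0 ] && return 1              # no rate estimate yet
    rate=$(( done_b / elapsed ))                 # bytes per second so far
    [ "$rate" -eq 0 ] && return 1
    remaining=$(( (total_b - done_b) / rate ))   # estimated seconds left
    [ "$remaining" -gt "$M" ]                    # split only if worth it
}

# 8 s elapsed, 1 MiB of 100 MiB done: a long tail remains, so split.
if should_split 8 1048576 104857600; then echo split; else echo keep; fi
# prints "split"
```

If the new transfer then fails because the server ignores Range, the caller simply keeps the original single transfer, matching the fallback described above.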

Still: this is not in my short term plan to work on personally.

@f4nff
Author

f4nff commented Aug 4, 2020

Your idea seems more advanced, and friendlier and smarter.
One thing is very practical in Internet transfers:
if you download a file over multiple concurrent connections, you can get more bandwidth.
Many service providers limit the speed of a single connection, so multiple connections lead to shorter, more efficient transfers.
I look forward to seeing you perfect it.

@bagder bagder closed this as completed in 532dfa3 Aug 4, 2020
@f4nff
Author

f4nff commented Sep 27, 2020

I have recently been using aria2 for concurrent downloads, but once the bandwidth exceeds 200 Mbps it becomes unstable.
curl's single-connection transfers are very stable; the only thing missing is support for multiple concurrent connections, which I look forward to.
