Curl is not handling uploads with large number of URLs (100,000+) #1959
I tried this on Windows with several recent builds, and it takes about 20 seconds to parse the config with the upload options (200,000 lines) and 10 seconds to parse the config with the download options (100,000 lines). If you build with -DDEBUG_CONFIG it will show you the parsing in real time; for example, I see this repeating without delay: GOT: upload-file. Check for a delay in parsing the entries. (edit: note I'm using debug builds of curl/libcurl; release builds would obviously be faster.)
I'm not sure whether Windows curl would behave any differently. But curl on macOS (the one that comes with the OS) and the latest build of curl from GitHub that I tried on Debian do process the config, yet it takes forever and never finishes in a reasonable time, so uploading never starts. I can see (with strace) that it is processing entries, just rather slowly: read(4, "st/upload/34235\nupload-file /tmp"..., 4096) = 4096
I've also compiled it with the "-pg" option on Linux, and I see that it spends most of its time in the getparameter function; this is the part of tool_getparam.c where I think all that time goes. If I read it correctly, it scans the list of URLs from beginning to end on every new entry, so the more URLs that get added, the longer each scan takes.
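To make the reported pattern concrete, here is a minimal sketch (with illustrative names only, not curl's actual code) of the quadratic behavior described above: every parsed option re-scans the list from the head to find the first node that has not been filled in yet, so parsing n options costs on the order of n^2 list steps in total.

    struct node {
      struct node *next;
      int filled;              /* already assigned an upload file? */
    };

    /* Called once per parsed option: it restarts from the head every
       time, so the nth call walks past n-1 filled nodes before it
       finds any work to do. */
    static struct node *find_empty(struct node *head)
    {
      while(head && head->filled)
        head = head->next;
      return head;
    }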
Yeps, that's exactly what I suspected and spotted as well. We should make the logic store the last used pointer so that it can start at that point the next time to avoid a lot of looping...
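A minimal sketch of that idea, using the same illustrative names as above (not the actual patch): remember where the previous scan stopped and resume from there, so the whole parse stays linear.

    /* Resume from the remembered position instead of the list head;
       'last' is the extra bookkeeping kept alongside the head. If the
       previous scan exhausted the list, fall back to the head. */
    static struct node *find_empty_fast(struct node *head,
                                        struct node **last)
    {
      struct node *p = *last ? *last : head;
      while(p && p->filled)
        p = p->next;
      *last = p;               /* the next call starts here */
      return p;
    }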
So I tried this: replacing tool_getparam.c:1916 from
to
and it seems to have fixed the issue :) I'm not a developer and I don't understand the logic with the GETOUT_UPLOAD flags at all, but hey - it worked :)
Lovely @arainchik and thanks! Someone just needs to verify that it actually is a proper fix and we can land that. This is a pretty tricky issue to add a test case for, though; I'd rather not have a hundred thousand transfers as a test, so I think we can skip it this time.
@badger - you don't have to test 100,000 actual transfers. If you use some random, non-existent domain name or an invalid IP address like http://999.999.999.999/upload, you'll be able to test parsing of the config file without any actual transfers.
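For reference, a config file for such a parse-only test could look like the excerpt below (the file names are made up and the address is intentionally unroutable, so no transfer ever starts); curl reads such a file with -K/--config:

    upload-file /tmp/file1
    url "http://999.999.999.999/upload/1"
    upload-file /tmp/file2
    url "http://999.999.999.999/upload/2"

...and so on for 100,000 entries. Timing that parse before and after the change shows the speedup without performing a single transfer.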
By properly keeping track of the last entry in the list of URLs/uploads to handle, curl now avoids many meaningless traverses of the list, which speeds up many-URL handling *MASSIVELY* (several orders of magnitude on 100K URLs).

Added test 1291 to verify that it doesn't take ages - though we don't have any detection of a "too slow" command in the test suite.

Reported-by: arainchik on github
Fixes #1959
I did this
I'm trying to upload a large number of small files (100,000 or 1,000,000) using a single HTTPS connection
I expected the following
I expected curl to start the upload process in a reasonable amount of time, but the upload process never starts.
If I try to download files instead of uploading, curl starts processing right away, see below.
It looks like curl is not processing a large number of uploads/URLs in an optimal way.
curl/libcurl version
curl 7.52.1 (x86_64-apple-darwin13.4.0) libcurl/7.52.1 OpenSSL/1.0.2l zlib/1.2.8
Protocols: dict file ftp ftps gopher http https imap imaps pop3 pop3s rtsp smb smbs smtp smtps telnet tftp
Features: IPv6 Largefile NTLM NTLM_WB SSL libz TLS-SRP UnixSockets HTTPS-proxy
operating system
macOS Sierra 10.12.6