The connection to the host is kept open for subsequent requests. The problem is that the API, like most APIs these days, is request-rate limited, and the --limit-rate option does not help because the files returned from the API range from under 10 KB to several MB in size. If an API call results in several small files in a row, curl fetches them fast enough that the next API call is rejected by the rate limit (in this case, no more than 2 requests per second). On the other hand, if I set --limit-rate to an artificially low value to accommodate these small files, curl's performance on the larger files is lousy.
This situation is not new and has been raised in the past with no resolution that I can find. From the curl mailing list archives, September 2017:
The curl project's GitHub wiki shows the style of script that exists everywhere as a result of this limitation:
for i in $(seq 1 100000); do
    curl "https://example.com/api?input=$i" yadayada
    sleep X
done
I'd like to avoid writing and maintaining such an external script just to control curl's request rate. Instead, I'd like to see an option added to curl that limits the number of requests to a given host to no more than X requests in a given interval. Both the number of requests and the interval should be user-settable, since API rate limits vary widely in both request counts and time windows.
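For reference, the external-script workaround being argued against typically wraps every request in a delay. A minimal sketch, assuming a fixed interval; the function name, interval value, and endpoint are illustrative placeholders, not anything from curl itself:

```shell
#!/bin/sh
# Run any command, then pause, so successive calls stay under the rate limit.
# INTERVAL and rate_limited are made-up names for this sketch.
INTERVAL=0.5   # seconds between requests: at most two per second

rate_limited() {
  "$@"
  sleep "$INTERVAL"
}

# Example usage (hypothetical endpoint):
# for i in $(seq 1 100000); do
#   rate_limited curl -s "https://example.com/api?input=$i"
# done
```

Note that this paces requests by wall-clock delay between invocations, which is exactly the per-host pacing the feature request asks curl to handle internally.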
A simpler-to-use approach would be to ask for at least NNN milliseconds between each new request. Then you could, for example, specify two requests per second with something like (pretending the option is called --after):
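For example, the proposed invocation might look like the following sketch. Note that --after is the pretend option named above, not a real curl flag, and the endpoint is illustrative; the [1-100000] range uses curl's real URL globbing syntax:

```
# Pretend option: wait at least 500 ms between requests (two per second).
curl --after 500 "https://example.com/api?input=[1-100000]"
```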
I agree that different options per host name would be difficult... and potentially disruptive. Your proposed approach would fix the most common use case: calling a single API endpoint whose rate limits are known ahead of time. Hopefully this simpler solution would be easier to implement.
Changed the title to "Request limit enhancement - limit number of requests per host for x interval" on May 22, 2019.