With request_multiple it is possible to send multiple requests simultaneously.
Is it possible to set a limit on how many requests will be executed simultaneously?
If the limit is too high, requests are more likely to fail, or automated software may perceive it as a DoS attack and automatically block further requests.
For example, I have an array with 1000 URLs:
$requests = array_fill(0, 1000, array('url' => 'http://urlofsitetotest.com/'));
// Set up a callback
function my_callback(&$request, $id) {
    var_dump($id, $request);
}
// Tell Requests to use the callback
$options = array('complete' => 'my_callback');
// Limit execution to 5 simultaneous requests
// Requests::setSimultaneousLimit(5) => Does this option exist?
// Send the requests!
$responses = Requests::request_multiple($requests, $options);
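Until something like this is supported, a common user-land workaround is to cap concurrency yourself by chunking the request list and sending one batch at a time. This is a sketch, not a library feature; it assumes the Requests library is loaded and `my_callback` is defined as above, and the batch size 5 matches the example:

```php
<?php
// User-land workaround: cap concurrency at 5 by chunking the request list
// and sending one batch at a time with request_multiple().
// Assumes the Requests library is loaded and my_callback() is defined.

$requests = array_fill(0, 1000, array('url' => 'http://urlofsitetotest.com/'));
$options  = array('complete' => 'my_callback');

$responses = array();
foreach (array_chunk($requests, 5, true) as $batch) {
    // Blocks until all 5 requests in this batch have completed.
    $responses += Requests::request_multiple($batch, $options);
}
```

The drawback is that each batch waits for its slowest request before the next batch starts, so it is coarser than a true rolling window.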
This isn't currently supported, but would make sense to do so.
rmccue changed the title from "How to set the limit for how many requests will be executed simultaneously" to "Allow setting multi-request concurrency" on Jun 9, 2017
For cURL we should be using curl_multi_select and adding in new handles as the others are freed. This should be fairly straightforward to implement, but hard to test. I'd like to give it a shot.
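A hedged sketch of that rolling-window idea, written against raw cURL rather than the Requests internals; `roll_requests()` and the default limit of 5 are illustrative, not an existing API:

```php
<?php
// Rolling-window sketch: keep at most $limit transfers in flight, and add
// a new handle from the queue each time one finishes.

function roll_requests(array $urls, $limit = 5) {
    $mh = curl_multi_init();
    $responses = array();
    $inFlight = 0;

    $add = function ($url) use ($mh, &$inFlight) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $inFlight++;
    };

    // Prime the window with at most $limit handles.
    foreach (array_splice($urls, 0, $limit) as $url) {
        $add($url);
    }

    while ($inFlight > 0) {
        curl_multi_exec($mh, $running);

        // Harvest finished transfers and refill the window from the queue.
        while ($info = curl_multi_info_read($mh)) {
            $ch = $info['handle'];
            $url = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);
            $responses[$url] = curl_multi_getcontent($ch);
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
            $inFlight--;

            if ($urls) {
                $add(array_shift($urls));
            }
        }

        // Wait for activity instead of busy-looping.
        if ($inFlight > 0 && curl_multi_select($mh) === -1) {
            usleep(1000); // select failed; avoid spinning at 100% CPU
        }
    }

    curl_multi_close($mh);
    return $responses;
}
```

Unlike the batching workaround, a new transfer starts the moment any slot frees up, so the window stays full for the whole run.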
Any ideas, preferences on naming the option? concurrent, threads (not technically), limit?
For fsockopen, stream_select would actually work, but we need to rewrite request_multiple in a better way and split out the read and parse parts. Definitely possible, but out of scope of this issue, since the fsockopen request_multiple is just a serial wrapper around the regular request. TBD in #317
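For reference, a minimal stream_select() loop over plain sockets, showing the multiplexing mechanism that approach would build on; the hosts and the hand-written HTTP/1.0 requests are placeholders, and the real read/parse split is what #317 would cover:

```php
<?php
// Minimal stream_select() multiplexing sketch: open several sockets,
// then service whichever one has data, rather than reading them in series.

$hosts = array('example.com', 'example.org');
$sockets = array();
foreach ($hosts as $host) {
    $s = stream_socket_client("tcp://$host:80", $errno, $errstr, 5);
    // Switch to non-blocking mode so a single slow socket never stalls the loop.
    stream_set_blocking($s, false);
    fwrite($s, "GET / HTTP/1.0\r\nHost: $host\r\n\r\n");
    $sockets[$host] = $s;
}

$data = array_fill_keys($hosts, '');
while ($sockets) {
    $read = array_values($sockets);
    $write = $except = null;
    // Block until at least one socket is readable (1s timeout).
    if (stream_select($read, $write, $except, 1) === false) {
        break;
    }
    foreach ($read as $s) {
        $host = array_search($s, $sockets, true);
        $chunk = fread($s, 8192);
        if ($chunk === '' || $chunk === false) {
            if (feof($s)) {
                fclose($s);
                unset($sockets[$host]);
            }
            continue;
        }
        $data[$host] .= $chunk;
    }
}
```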