Using `GuzzleHttp\Pool` with a huge number of requests eventually exhausts available memory #1932
Batching a large number of requests using `GuzzleHttp\Pool` eventually exhausts available memory.

Expected behaviour: memory use rises initially, but then levels out at some point, and all the requests are executed.
Steps to Reproduce
See the gist https://gist.github.com/garbetjie/d9ef3eb95fc5db33316d4b6799ddc07a for screenshots and the source of a test script written to demonstrate the problem.
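For reference, a minimal sketch of the kind of batching that triggers this symptom. The URL and request count are placeholders, and this is an assumption about what the gist's script looks like, not a copy of it:

```php
<?php
require 'vendor/autoload.php';

use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

$client = new Client();

// Build a huge number of requests up front (URL is a placeholder).
$requests = [];
for ($i = 0; $i < 100000; $i++) {
    $requests[] = new Request('GET', 'https://example.com/?n=' . $i);
}

// Pool::batch() waits for every request and returns ALL responses in a
// single array, so peak memory grows with the number of requests.
$responses = Pool::batch($client, $requests, ['concurrency' => 25]);
```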
It's not a bug.
You are using `Pool::batch()`, which collects all the responses into the result array, so memory grows with the number of requests.

If you don't want to collect all the responses, you should use the pool's promise directly and handle each response in a callback instead of keeping it.
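For readers who land here later, driving the pool's promise with callbacks generally looks like the sketch below. The `$client` and `$requests` variables, the concurrency value, and the callback bodies are placeholders, not the code from this thread:

```php
<?php
use GuzzleHttp\Pool;

$pool = new Pool($client, $requests, [
    'concurrency' => 25,
    // Each response is processed here and then discarded, so it can be
    // garbage-collected and memory stays roughly flat.
    'fulfilled' => function ($response, $index) {
        // process $response; do not store it in an array
    },
    'rejected' => function ($reason, $index) {
        // log or otherwise handle the failure
    },
]);

// Initiate the transfers and wait for the pool to complete.
$pool->promise()->wait();
```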
Yes, @alexeyshockov, you are 100% correct: collecting all the responses was what exhausted memory.

As a side note, I tried using your suggested solution, and it resolved the problem.
Thanks so much for the help. I'll close this issue now.
I have the same problem when running the script as a long-running singleton process: it eventually exhausts available memory, even though my script is essentially the same as @garbetjie's.

Please take a look at this gist: https://gist.github.com/ekojs/e112a89aaf1c342d3f06115b9e14a534
Sorry to bother; I've solved the problem using `yield`.

Please take a look at revision 1: https://gist.github.com/ekojs/e112a89aaf1c342d3f06115b9e14a534/revisions
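For anyone who can't open the gist, the `yield` approach typically means generating requests lazily so they are never all held in memory at once. A sketch with placeholder URLs and counts, assuming a `$client` is already constructed:

```php
<?php
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

// A generator: requests are created one at a time, only as the pool
// consumes them, instead of being pre-built into a giant array.
$requestGenerator = function () {
    for ($i = 0; $i < 100000; $i++) {
        yield new Request('GET', 'https://example.com/?n=' . $i);
    }
};

$pool = new Pool($client, $requestGenerator(), [
    'concurrency' => 25,
    'fulfilled' => function ($response, $index) {
        // handle $response and let it go out of scope
    },
]);
$pool->promise()->wait();
```

Combined with per-response callbacks, this keeps both the request objects and the responses from accumulating.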