My server program runs on Windows (32-bit) with Waitress + Pyramid.
When I load-tested an HTTP API with Apache ab and the response size was under 1 MB, I found that memory kept increasing until a MemoryError was finally thrown.
The ab command looks like this:
ab -c 10 -n 100000 -k http://apiurl
The issue only happens in keep-alive mode, and only when the response size is smaller than 1 MB.
After debugging, I found the cause:
outbuf = self.outbufs[0]
# use outbuf.__len__ rather than len(outbuf) FBO of not getting
# OverflowError on Python 2
outbuflen = outbuf.__len__()
if outbuflen <= 0:
    # self.outbufs[-1] must always be a writable outbuf
    if len(self.outbufs) > 1:
        toclose = self.outbufs.pop(0)
        try:
            toclose.close()
        except Exception:
            self.logger.exception(
                'Unexpected error when closing an outbuf')
        continue  # pragma: no cover (coverage bug, it is hit)
    else:
        # issue here: when outbuflen <= 0 and only one outbuf is
        # left, that outbuf should at least be cleared
        dobreak = True
I added one line of code in the marked place, which works around the issue. Please check whether it is a reasonable fix:
outbuf.prune()
dobreak = True
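To illustrate why the prune() matters, here is a minimal toy model of an overflowable buffer. All names and thresholds here are illustrative stand-ins, not Waitress's actual OverflowableBuffer implementation: once a buffer overflows into a BytesIO backing store, the store is retained even after being fully drained, unless something like prune() releases it.

```python
from io import BytesIO


class ToyBuffer:
    """Illustrative stand-in for an overflowable output buffer
    (simplified; not the real Waitress implementation)."""

    OVERFLOW = 1024  # toy overflow threshold; the real one is much larger

    def __init__(self):
        self.small = b''   # cheap bytes-based storage
        self.big = None    # BytesIO storage once the buffer overflows
        self.pending = 0   # bytes not yet written to the socket

    def append(self, data):
        self.pending += len(data)
        if self.big is None and self.pending > self.OVERFLOW:
            # switch to the large backing store
            self.big = BytesIO()
            self.big.write(self.small)
            self.small = b''
        if self.big is not None:
            self.big.write(data)
        else:
            self.small += data

    def consume_all(self):
        # pretend the socket drained everything
        self.pending = 0

    def prune(self):
        # if fully drained, drop the large backing store and fall
        # back to the cheap bytes-based buffer
        if self.pending == 0 and self.big is not None:
            self.big = None
            self.small = b''

    def backing_size(self):
        return len(self.big.getvalue()) if self.big is not None else len(self.small)


buf = ToyBuffer()
buf.append(b'x' * 2048)       # overflows into the BytesIO store
buf.consume_all()             # response fully written out
before = buf.backing_size()   # storage still retained after drain
buf.prune()
after = buf.backing_size()    # storage released
```

On a keep-alive connection the same channel keeps serving requests, so without the prune() each drained-but-retained buffer's memory accumulates across requests.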
Create a PR for this. prune() here should not cause issues.
The only potential improvement I see here:
Instead of prune() in OverflowableBuffer, we could add a .clear() that keeps the existing BytesIO or temporary-file storage in the backend and does not reset back to the string-based buffer.
Although that would penalize all other responses with possibly slower buffers, due to one response being larger than 1 MB.
Either way, adding the prune() is a good idea.
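A hedged sketch of the .clear() idea mentioned above, using hypothetical names (ClearableBuffer is not a Waitress class): clear() empties the buffer but keeps the same BytesIO object as the backend, so the buffer does not reset to string-based storage.

```python
from io import BytesIO


class ClearableBuffer:
    """Hypothetical buffer sketching the proposed .clear() behavior."""

    def __init__(self):
        self.backend = BytesIO()  # kept for the lifetime of the buffer

    def append(self, data):
        self.backend.write(data)

    def __len__(self):
        return self.backend.tell()

    def clear(self):
        # discard the contents but reuse the same backend object,
        # rather than falling back to a string-based buffer
        self.backend.seek(0)
        self.backend.truncate()


buf = ClearableBuffer()
backend_before = buf.backend
buf.append(b'x' * (2 * 1024 * 1024))  # a >1 MB response
size_after_append = len(buf)
buf.clear()
same_backend = buf.backend is backend_before  # backend object reused
size_after_clear = len(buf)
```

The trade-off noted above: once a connection has served one large response, every later response on it goes through the (possibly slower) overflowed backend.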
@bertjwregeer Thanks for your confirmation. A PR has been created.
Merge branch 'bugfix/prune_buffer'
Closes #111 #113 #115