leaking file descriptors in s3 GetObject #408
Labels

documentation: This is a problem with documentation.
feature-request: A feature should be added or improved.
This is how I can reproduce:

1. `ulimit -n N`
2. Call `.S3.GetObject()` for N different keys that exist.
3. Call `.S3.GetObject()` for a different object, and an error about "too many open files" will appear.

Are we supposed to free up resources or reuse connections manually? I don't see it anywhere in the docs.