Sending a request with a large request body (e.g. a large file being multipart-encoded) consumes a large amount of memory.
It would be very helpful if one could pass a file-like object (with the content length specified separately) and have requests and its helper libraries read only the chunks needed to continue sending the request.
This may be related to shazow/urllib3#51
The plan is to support generators. Work in progress :)
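As a minimal sketch of what the generator approach could look like, the sender could consume the body one chunk at a time so only a single chunk is ever held in memory. `read_in_chunks` here is a hypothetical helper for illustration, not part of requests or urllib3:

```python
import io

def read_in_chunks(fp, chunk_size=8192):
    """Yield successive chunks from a file-like object, so only
    one chunk at a time is held in memory."""
    while True:
        chunk = fp.read(chunk_size)
        if not chunk:
            break
        yield chunk

# Hypothetical usage once generators are supported as request bodies:
#   requests.post(url, data=read_in_chunks(open("big.bin", "rb")))

# Demonstration with an in-memory file standing in for a large upload:
body = io.BytesIO(b"x" * 20000)
chunks = [len(c) for c in read_in_chunks(body)]
print(chunks)  # [8192, 8192, 3616]
```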
Thanks for letting me know. Just to confirm, did you close the ticket because you don't track features as issues, or because you're currently working on it? Is it appropriate to raise new feature requests as issues in the future?
Oh, I should have mentioned #295 :)