Add HttpFileWrapper as a separate module #84
Conversation
This just provides a basic way to wrap a file object so you can force a specific amount of data to be returned at a minimum. So far, basic tests at the interactive console have shown it to work as expected. This still needs real tests, however. Closes #75
""" | ||
force_size = self._force_read_size | ||
read = self._file_object.read | ||
if size == -1 or size == 0 or size > force_size: |
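To make the diff hunk above concrete, here is a minimal, self-contained sketch of what such a wrapper could look like. The attribute names (`_file_object`, `_force_read_size`) mirror the diff; everything around them is an assumption for illustration, not the PR's actual implementation.

```python
import io

class HttpFileWrapper(object):
    """Sketch of a file wrapper that forces read() to return at least
    force_read_size bytes. Names follow the diff above; the surrounding
    logic is assumed, not taken from the PR."""

    def __init__(self, file_object, force_read_size=8192):
        self._file_object = file_object
        self._force_read_size = force_read_size

    def read(self, size=-1):
        force_size = self._force_read_size
        read = self._file_object.read
        # -1 and 0 keep the file object's default behaviour, and sizes
        # larger than the forced minimum pass through untouched.
        if size == -1 or size == 0 or size > force_size:
            return read(size)
        # Otherwise pad the request up to the forced minimum.
        return read(force_size)

wrapped = HttpFileWrapper(io.BytesIO(b"x" * 100), force_read_size=32)
print(len(wrapped.read(8)))  # 32: the small read was padded up to the minimum
```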
I prefer the `if size in (-1, 0) or size > force_size:` idiom here.
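A quick sanity check that the two spellings agree (hypothetical sizes, not from the PR):

```python
# The membership test `size in (-1, 0)` is equivalent to the chained
# comparison `size == -1 or size == 0`, but reads as a single condition.
def original(size, force_size):
    return size == -1 or size == 0 or size > force_size

def idiomatic(size, force_size):
    return size in (-1, 0) or size > force_size

# Both spellings agree across a spread of hypothetical sizes.
for size in (-1, 0, 1, 8192, 65536):
    assert original(size, 8192) == idiomatic(size, 8192)
print("equivalent")
```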
Proposed alternative names:
Or we could just make this into a documentation chapter instead of bothering to write it for someone and distribute it.
More names
Hey sigmavirus, thanks for pointing this out. I had a quick test with this, but it seemed to make performance somewhat to significantly worse, depending on the chunk size. Would I need any changes on the Django backend to take advantage of this?
@lifeofdave What chunk sizes were you using?
@Lukasa After retesting today it turns out I was incorrect: the file wrapper doesn't make transfer speeds worse (I don't know why that seemed to be the case yesterday), but it doesn't improve them either. I've been testing with a 5MB file, and the code looks like this:
Results:
Using HttpFileWrapper:
chunk_size = 16384
chunk_size = 32768
chunk_size = 65536
Using curl
Given that our speeds are about the same as curl's with HTTP, the problem is unlikely to be in the read() logic in httplib. Instead, the problem seems to be in the interface with OpenSSL. What platform are you on?
Windows 7, Python 2.7.10 64-bit
So the first thing to note is that your curl probably isn't using OpenSSL at all: what's the output of `curl --version`?
$ curl --version

Thanks
Ohh, but it's a Cygwin thing, so it is linked against OpenSSL. That's tough.
Uh... where are you getting your copy of requests from?
I'm only using Cygwin for curl; I've installed requests using the PyCharm package manager, which I assume is the same as pip install requests. The requests version is 2.9.1, requests-toolbelt is 0.6.0. Sorry if that wasn't clear!
Hmmm. Well, in that case you're using the standard library's SSL module, which is presumably the source of the majority of your slowdown. If you'd like to use something like
Here's the openssl version (if it helps):
Is it simple to use a different SSL module?
You can try using PyOpenSSL instead: if you install
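For requests 2.x of that era, the usual way to pull in PyOpenSSL and its SNI helpers was the `security` extra; the individual package names below are taken from the requests documentation of the time, so treat the exact list as an assumption:

```shell
# Installs pyOpenSSL plus helper packages; requests 2.x injects
# PyOpenSSL into its bundled urllib3 automatically when it is present.
pip install "requests[security]"

# Equivalently, the individual packages:
pip install pyopenssl ndg-httpsclient pyasn1
```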
Same results with PyOpenSSL; I'll have a look into cProfile. Thanks so much for your help!
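Since cProfile came up: here is a minimal, stdlib-only sketch of profiling an upload-shaped call to see where the time goes. The `upload` function is a placeholder workload, not the poster's actual code.

```python
import cProfile
import io
import pstats

def upload():
    # Placeholder workload standing in for the real requests.post() call.
    return sum(i * i for i in range(100000))

profiler = cProfile.Profile()
profiler.enable()
upload()
profiler.disable()

# Sort by cumulative time and print the top entries; in a real run the
# hot frames would point at the SSL layer if that is where time is spent.
buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(10)
print(buf.getvalue())
```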