ResponseTooBig: how to push to file rather than memory? #18
Comments
The lower-level streaming API should be used in this case. But yes, writing to a file would also be a good feature.
@SergejJurecko On a related note, my Is there a trick to resolve this speed issue?
Debug builds are unfortunately crazy slow. I'm not sure what to do about that.
@SergejJurecko So you're saying it's these two which need removing, or hiding behind a compiler flag: https://github.com/offscale/offregisters-lib/blob/134d216/src/download.rs#L50-L58
No, what I meant was that running mio_httpc without compiling with --release will be pretty slow.
Okay, well I'm looking at moving to hyper then; I'll see if I get the expected performance improvement.
@SamuelMarks I thought @SergejJurecko meant compiling with --release should be fast? |
@SergejJurecko thanks for the library! Do you have an idea of the performance of mio_httpc vs. libraries like hyper? I am trying to find a lightweight HTTP client with low latency at ~100 concurrent requests.
My main goals were:
I have not measured performance against other libraries, as I don't really care all that much. It is more than fast enough for anything I need it for. It is not really doing anything egregious that would be problematic. Allocations and data copying are both kept to a minimum. Individual calls are stored in a slab and accessed by index (as opposed to a hash table).
100 concurrent requests is nothing |
Great to know, thank you!
I'm getting a `ResponseTooBig` error. I could adjust `max_response`, but then my memory usage would increase. Could I instead 'stream' to a file, in chunks sized relative to speed (speed of disk IO, speed of network IO)?