
Fixed StreamReader._read_nowait #1297

Merged
merged 2 commits into aio-libs:1.0 on Oct 11, 2016

Conversation

@dalazx (Contributor) commented Oct 9, 2016

What do these changes do?

StreamReader._read_nowait returned only a single chunk from self._buffer even when a much larger amount of data was requested and already available in the buffer. This behavior caused performance regressions.
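The idea behind the fix, sketched below in simplified form, is to keep draining the internal buffer instead of returning only its first chunk. This is an illustration of the approach, not the actual aiohttp implementation; the function name is made up.

# Simplified illustration of coalescing reads: pop chunks from the buffer
# and join them until `n` bytes are collected or the buffer is exhausted,
# instead of returning just the first buffered chunk.
import collections

def read_nowait_coalescing(buffer: collections.deque, n: int = -1) -> bytes:
    chunks = []
    size = 0
    while buffer and (n < 0 or size < n):
        chunk = buffer.popleft()
        if 0 <= n < size + len(chunk):
            # Take only the bytes still needed; push the remainder back.
            needed = n - size
            chunks.append(chunk[:needed])
            buffer.appendleft(chunk[needed:])
            size = n
        else:
            chunks.append(chunk)
            size += len(chunk)
    return b''.join(chunks)

# Example: a 5-byte read spans chunk boundaries and leaves the rest buffered.
buf = collections.deque([b'aaa', b'bbb', b'ccc'])
assert read_nowait_coalescing(buf, 5) == b'aaabb'
assert list(buf) == [b'b', b'ccc']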

Simple test:

import asyncio
import time

import aiohttp


async def measure(chunk_size, timeout):
    """Read /garbage in chunks of up to chunk_size bytes, sleeping
    `timeout` seconds between reads to simulate a slow consumer."""
    start_time = time.time()

    file_size = 0
    chunks_num = 0

    async with aiohttp.ClientSession() as session:
        async with session.get('http://localhost:8000/garbage') as response:
            # Raise the flow-control buffer limit (a private attribute of
            # FlowControlStreamReader, 64 KB by default) so more data can be
            # buffered in the background while the consumer sleeps.
            response.content._b_limit = 20 * 1024 * 1024
            while True:
                chunk = await response.content.read(chunk_size)
                if not chunk:
                    break
                file_size += len(chunk)
                chunks_num += 1
                await asyncio.sleep(timeout)

    avg_chunk_size = file_size // chunks_num
    total_time = time.time() - start_time
    print(
        'req chunk size:', chunk_size,
        'avg chunk size:', avg_chunk_size,
        'timeout:', timeout,
        'total time:', total_time)


asyncio.get_event_loop().run_until_complete(measure(10 * 1024 * 1024, 0.1))
asyncio.get_event_loop().run_until_complete(measure(10 * 1024 * 1024, 0.5))
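The test assumes a local endpoint that serves a large response body. It is not part of the PR, but a minimal stand-in could look roughly like this (the /garbage route and the 100 MB payload size are assumptions for illustration):

# Minimal stand-in for the local test server (a sketch, not part of the PR).
import os

from aiohttp import web

PAYLOAD = os.urandom(100 * 1024 * 1024)  # pre-generated random body

async def garbage(request):
    return web.Response(body=PAYLOAD)

app = web.Application()
app.router.add_route('GET', '/garbage', garbage)
web.run_app(app, host='localhost', port=8000)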

Before:

req chunk size: 10485760 avg chunk size: 202427 timeout: 0.1 total time: 53.54 s
req chunk size: 10485760 avg chunk size: 171335 timeout: 0.5 total time: 308.13 s

After:

req chunk size: 10485760 avg chunk size: 9532509 timeout: 0.1 total time: 1.18 s
req chunk size: 10485760 avg chunk size: 9532509 timeout: 0.5 total time: 5.58 s
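A rough sanity check on these numbers: each read is followed by a sleep, so the total time is dominated by chunks_num * timeout. Before the fix, 53.54 s at a 0.1 s sleep implies roughly 500 reads of about 200 KB each (about 100 MB in total); after the fix the same body arrives in about a dozen reads of roughly 9.5 MB, which is why the total time collapses to roughly a second.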

Are there changes in behavior for the user?

Related issue number

Checklist

  • I think the code is well written
  • Unit tests for the changes exist
  • Documentation reflects the changes
  • Add yourself to CONTRIBUTORS.txt
  • Add a new entry to CHANGES.rst
    • Choose any open position to avoid merge conflicts with other PRs.
    • Add a link to the issue you are fixing (if any) using the #issue_number format at the end of the changelog message. Use the pull request number if there is no issue for the PR, or if the PR covers the issue only partially.
dalazx added 2 commits Oct 9, 2016
@codecov-io commented Oct 9, 2016

Current coverage is 98.53% (diff: 100%)

Merging #1297 into 1.0 will increase coverage by <.01%

@@                1.0      #1297   diff @@
==========================================
  Files            29         29          
  Lines          6545       6553     +8   
  Methods           0          0          
  Messages          0          0          
  Branches       1095       1097     +2   
==========================================
+ Hits           6449       6457     +8   
  Misses           45         45          
  Partials         51         51          

Powered by Codecov. Last update 3ea78fb...a951513

@dalazx (Contributor, Author) commented Oct 9, 2016

@jettify I think I found a way to speed things up further in our particular use case. In the example above, the line response.content._b_limit = 20 * 1024 * 1024 changes the limit in FlowControlStreamReader, which makes a big difference in how much data is buffered in the background (64 KB by default, which is too little). The downside is that this limit is not exposed publicly (I have not found a way to set it through the public API). The next step is to expose it somehow in a follow-up PR.
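For reference, a sketch of that workaround is below. It relies on _b_limit, a private attribute of FlowControlStreamReader in aiohttp 1.x that may change without notice; the helper name and the 1 MB read size are made up for illustration.

# Sketch of the workaround: bump the private flow-control buffer limit before
# streaming. _b_limit is an internal attribute, not a public API.
async def fetch_with_larger_buffer(session, url, buffer_limit=20 * 1024 * 1024):
    async with session.get(url) as response:
        response.content._b_limit = buffer_limit  # private attribute, use with care
        data = bytearray()
        while True:
            chunk = await response.content.read(1024 * 1024)
            if not chunk:
                break
            data.extend(chunk)
        return bytes(data)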

@asvetlov asvetlov merged commit 869e133 into aio-libs:1.0 on Oct 11, 2016

4 checks passed:
  • codecov/patch: 100% of diff hit (target 98.53%)
  • codecov/project: 98.53% (+<.01%) compared to 3ea78fb
  • continuous-integration/appveyor/pr: AppVeyor build succeeded
  • continuous-integration/travis-ci/pr: The Travis CI build passed
@asvetlov (Member) commented Oct 11, 2016

Thanks!

@lock (bot) commented Oct 29, 2019

This thread has been automatically locked since there has not been
any recent activity after it was closed. Please open a new issue for
related bugs.

If you feel there are important points made in this discussion,
please include those excerpts in that new issue.

@lock lock bot added the outdated label Oct 29, 2019
@lock lock bot locked as resolved and limited conversation to collaborators Oct 29, 2019