
Increase test coverage for stream worker #44161

Merged — 15 commits merged into home-assistant:dev from stream_tests on Feb 1, 2021

Conversation

@allenporter (Contributor) commented Dec 13, 2020

Proposed change

Increase test coverage of stream worker from ~80% to 92% in preparation for adding support for expiring nest urls.

Exercise the stream worker functionality by mocking av.open calls to return a
fake media container as well as a fake decoded stream in the form of a series of
packets. Each test creates a packet sequence, with a mocked output buffer to capture
the segments pushed to the output streams. The packet sequence can be used to
exercise failure modes or corner cases, such as how out-of-order packets are handled.
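The mocking approach described above can be sketched roughly as follows (FakePacket, make_packet_sequence, and make_fake_container are hypothetical names for illustration, not the actual helpers in test_worker.py):

```python
from unittest.mock import MagicMock


class FakePacket:
    """Minimal stand-in for an av.Packet, carrying only the fields the worker reads."""

    def __init__(self, dts, pts, is_keyframe=False):
        self.dts = dts
        self.pts = pts
        self.is_keyframe = is_keyframe


def make_packet_sequence(count, keyframe_interval=4):
    """Build an in-order packet sequence with a keyframe every few packets."""
    return [
        FakePacket(dts=i, pts=i, is_keyframe=(i % keyframe_interval == 0))
        for i in range(count)
    ]


def make_fake_container(packets):
    """Mock the container returned by av.open so demux() yields the fake packets."""
    container = MagicMock()
    container.demux.return_value = iter(packets)
    return container
```

A test would then patch av.open to return such a fake container and assert on the segments captured by the mocked output buffer.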

This is in preparation for changing the stream behavior to support expiring urls, an
area with little existing test coverage. These tests add coverage for corner cases that cannot be reached from test_hls.py, because this harness can create invalid streams that pyav cannot encode.

Type of change

  • Dependency upgrade
  • Bugfix (non-breaking change which fixes an issue)
  • New integration (thank you!)
  • New feature (which adds functionality to an existing integration)
  • Breaking change (fix/feature causing existing functionality to break)
  • Code quality improvements to existing code or addition of tests

Additional information

Checklist

  • The code change is tested and works locally.
  • Local tests pass. Your PR cannot be merged unless tests pass
  • There is no commented out code in this PR.
  • I have followed the development checklist
  • The code has been formatted using Black (black --fast homeassistant tests)
  • Tests have been added to verify that the new code works.

If user exposed functionality or configuration variables are added/changed:

If the code communicates with devices, web services, or third-party tools:

  • The manifest file has all fields filled out correctly.
    Updated and included derived files by running: python3 -m script.hassfest.
  • New or updated dependencies have been added to requirements_all.txt.
    Updated by running python3 -m script.gen_requirements_all.
  • Untested files have been added to .coveragerc.

The integration reached or maintains the following Integration Quality Scale:

  • No score or internal
  • 🥈 Silver
  • 🥇 Gold
  • 🏆 Platinum

@probot-home-assistant

Hey there @hunterjm, @uvjustin, mind taking a look at this pull request as it's been labeled with an integration (stream) you are listed as a codeowner for? Thanks!
(message by CodeOwnersMention)

(Resolved review thread on tests/components/stream/test_worker.py)
@allenporter (Contributor Author)

Based on discussion in home-assistant/architecture#482 it seems like these tests will still be useful. However, I think it might be a mistake to test these in terms of the Stream class directly rather than in terms of the "public" interfaces of the component. This will also help if we need to do any refactoring when introducing new abstractions.

@allenporter (Contributor Author)

Thinking about it more, I think this approach may still have some value as (1) testing via the Stream class is close enough to the top level interface, and matches other tests and (2) I considered trying to exercise these corner cases using a similar strategy as test_hls.py however pyav won't let me encode videos in the format that these corner cases are exercising (e.g. out of order pts/dts, etc). Another approach to consider could be to refactor the code into simpler abstractions to let us test these corner cases independent of pyav (e.g. a code that just operates in terms of processing packets) however I think we'd want tests in place before considering that. So, I think this makes sense to review still.

@allenporter (Contributor Author)

Swapping reviewers based on conversation in #45431

@uvjustin I'd like your take on whether this is helpful. I worry about these subtle corner cases not being fully tested, and I'm pretty bad at manually testing things.

My only worry is that this gets deep into the pyav details, so the test harness may be more brittle than would be ideal. Another thought is that we could split the responsibilities of the worker into a few components to make unit testing in terms of packets easier. That is, imagine the worker only decodes the stream and pushes packets into another class that handles all the breaking into segments, verifying ordering, etc -- then we write these tests in terms of that interface that accepts streams of packets. (Though to refactor that, I'd want this test coverage in place first to verify it worked.)
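A rough sketch of that hypothetical split (the class names and exact behavior here are illustrative, not the actual worker code):

```python
class Packet:
    """Illustrative packet with just a decode timestamp and keyframe flag."""

    def __init__(self, dts, is_keyframe=False):
        self.dts = dts
        self.is_keyframe = is_keyframe


class PacketSegmenter:
    """Consumes a stream of packets independently of pyav: drops packets whose
    dts goes backwards, and starts a new segment at each keyframe."""

    def __init__(self):
        self._last_dts = None
        self._current = []
        self.segments = []

    def push(self, packet):
        if self._last_dts is not None and packet.dts <= self._last_dts:
            return  # out-of-order packet; discard
        self._last_dts = packet.dts
        if packet.is_keyframe and self._current:
            # Keyframe boundary: flush the current segment
            self.segments.append(self._current)
            self._current = []
        self._current.append(packet)

    def finish(self):
        """Flush any remaining packets as a final segment."""
        if self._current:
            self.segments.append(self._current)
            self._current = []
```

Tests for ordering and segmentation corner cases could then drive push() directly with crafted packet sequences, with no pyav involved.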

@uvjustin (Contributor)

@allenporter I think this is very good - the tests capture/codify most of the problem workarounds we put in the worker (did you dig through the old issues to get these or were you able to deduce them from the code?) Even if we need to change these workarounds at least this harness will make clear that we are only changing what we intend to change.
We could refactor the worker as long as it doesn't add too much overhead. I don't think it's overly complex as is but maybe your refactor will still be cleaner.

@allenporter (Contributor Author)

> @allenporter I think this is very good - the tests capture/codify most of the problem workarounds we put in the worker (did you dig through the old issues to get these or were you able to deduce them from the code?) Even if we need to change these workarounds at least this harness will make clear that we are only changing what we intend to change.
> We could refactor the worker as long as it doesn't add too much overhead. I don't think it's overly complex as is but maybe your refactor will still be cleaner.

Ok thank you for taking a look. I got these from the test coverage gaps / code, looking at the PRs and reading the previous issues, yes.

@uvjustin (Contributor)

Maybe add one more test where we leave the dts in order but scramble up the pts to ensure that we are only enforcing proper ordering on the dts side?
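Such a test might shuffle pts while leaving dts monotonic, for example (a sketch using a hypothetical FakePacket helper, not the one actually used in the test):

```python
import random


class FakePacket:
    """Illustrative packet with separate decode and presentation timestamps."""

    def __init__(self, dts, pts):
        self.dts = dts
        self.pts = pts


def scramble_pts(packets, seed=0):
    """Return a copy of the packets with dts order preserved but pts shuffled."""
    rng = random.Random(seed)
    pts_values = [p.pts for p in packets]
    rng.shuffle(pts_values)
    return [FakePacket(p.dts, new_pts) for p, new_pts in zip(packets, pts_values)]
```

The worker should accept such a sequence unchanged if, per the suggestion, ordering is only enforced on the dts side.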

@balloob (Member) commented Jan 29, 2021

@uvjustin don't forget to hit that approve button if you think it's ok 👍

@allenporter allenporter requested review from uvjustin and removed request for hunterjm January 30, 2021 18:28
@allenporter (Contributor Author)

Looking good, I have a few more ideas to simplify further, building on what you've done.

@uvjustin (Contributor) commented Feb 1, 2021

> Looking good, I have a few more ideas to simplify further, building on what you've done.

Ok, I'll wait for those.

@allenporter (Contributor Author)

OK, this is ready for another look.

@uvjustin (Contributor) commented Feb 1, 2021

Just made a minor change to use some more realistic values for time_base. Does that look ok to you?
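For context, a realistic video time_base is the 90 kHz clock used by MPEG-TS and RTP; the values below illustrate that convention and are not necessarily the ones chosen in the test:

```python
from fractions import Fraction

# MPEG-TS/RTP video commonly uses a 90 kHz clock; at ~30 fps,
# successive frames are 3000 ticks apart.
VIDEO_TIME_BASE = Fraction(1, 90000)
TICKS_PER_FRAME = 3000


def pts_to_seconds(pts, time_base=VIDEO_TIME_BASE):
    """Convert a pts value expressed in time_base units to seconds."""
    return float(pts * time_base)
```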

@allenporter (Contributor Author)

Yeah, that looks good to me.

@uvjustin (Contributor) left a comment

Looks good!

@uvjustin (Contributor) commented Feb 1, 2021

I had to add back PACKETS_PER_SEGMENT to fix the modulo checks as 1ddb138 ended up changing those inadvertently. I renamed PACKETS_TO_SEGMENTS just for consistency.

@allenporter (Contributor Author)

OK, looks good. Want to also approve?

@balloob balloob merged commit 2136b30 into home-assistant:dev Feb 1, 2021
@allenporter allenporter deleted the stream_tests branch February 1, 2021 17:39
raman325 pushed a commit to raman325/home-assistant that referenced this pull request Feb 1, 2021
Co-authored-by: Justin Wong <46082645+uvjustin@users.noreply.github.com>
c0fec0de pushed a commit to c0fec0de/home-assistant that referenced this pull request Feb 1, 2021
Co-authored-by: Justin Wong <46082645+uvjustin@users.noreply.github.com>
@github-actions github-actions bot locked and limited conversation to collaborators Feb 2, 2021
6 participants