
Support for python3 #332

Merged
merged 12 commits into Netflix:master on Jun 11, 2019

Conversation

Contributor

@zsimic zsimic commented Jun 3, 2019

Adapted code for python3, exercise tests in both python2 and python3

Contributor Author

@zsimic zsimic left a comment

Should be ready to go.

  • equivalent behavior under 2.7
  • probable bugs left to address under 3.6 (especially around pickled resources and YuvReader)

@@ -1,22 +1,20 @@
language: python
python:
- "2.7"
- "3.6"
Contributor Author

Travis now exercises the tests under both 2.7 and 3.6.
Under 2.7, all tests pass as they did before this change.
Under 3.6, a few tests are currently skipped because I'm not sure how to fix them.

So end result after this commit should be:

  • 2.7 should behave the same as before
  • 3.6 probably has a few bugs to iron out; no tests should remain skipped because of 3.6 before we can call it really ready for python3

All tests pass with both python2 and python3; however, a few test cases are currently disabled for python3. Spot them with:

```bash
grep -r 'reason="TODO python3' .
```
Contributor Author

Search for TODO python3 to see what still needs to be examined for python3. All relevant parts of the code were marked with this; everything else I'm pretty sure is good to go.

:target: https://ci.appveyor.com/project/li-zhi/vmaf
:alt: AppVeyor Build Status

VMAF is a perceptual video quality assessment algorithm developed by Netflix.
Contributor Author

This is just a placeholder README. There was no README in this folder before; I added one to bring setup.py closer to what it would eventually need to be if we one day publish this to pypi.

Contributor

Sounds good.

@@ -0,0 +1,9 @@
numpy
# TODO python3: scipy 1.3.0 removed some previously deprecated functions, still used by vmaf
scipy>=0.17.1,<1.3.0
Contributor Author

The scipy change is indirectly related to py3.
Their latest version, 1.3.0, is python3-only (they dropped support for python2).
See https://github.com/scipy/scipy/blob/master/setup.py#L490 (python_requires='>=3.5')

As a result, under python2 vmaf picks up scipy 1.2.1 (the latest version it can install), but under python3 it would end up with 1.3.0, where some previously deprecated functions that vmaf still uses were removed; hence the <1.3.0 cap.
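A quick way to confirm, in a given environment, that the cap took effect (a minimal, purely illustrative check, not part of the PR):

```python
# Purely illustrative: confirm the installed scipy honors the <1.3.0 cap
# described above.
import scipy
from distutils.version import LooseVersion

assert LooseVersion(scipy.__version__) < LooseVersion("1.3.0"), scipy.__version__
```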



if __name__ == "__main__":
main()
Contributor Author

This is not directly related to py3, but it creates confusion (in the doctest module, actually) because code was executed immediately at import time.

I changed nothing here except to make sure the code doesn't get executed at import time.
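In other words, the change is the usual entry-point guard (a minimal sketch; the module contents are illustrative, not the actual vmaf script):

```python
# Illustrative only: the work lives in main(), and the guard keeps it from
# running when doctest/pytest merely imports the module.
def main():
    print("doing the actual work")


if __name__ == "__main__":
    main()
```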

Contributor

Thanks. Good catch.

Curious: how did you find this file? Do you have some analysis tool, or did you just spot-check?

Contributor Author

pytest --doctest-modules found it, actually: it hit this because pytest simply imports each module (in order to scour it for doctests), and this one was trying to run immediately at import time, and failing.
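For illustration, this is the kind of doctest that collection picks up; the import that collection performs is what exposed the import-time execution (the module contents here are made up):

```python
# Illustrative module: `pytest --doctest-modules` imports it in order to
# collect the doctest below, which is how import-time side effects surface.
def double(x):
    """Double a number.

    >>> double(2)
    4
    """
    return 2 * x
```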

@@ -8,13 +8,15 @@
__copyright__ = "Copyright 2016-2018, Netflix, Inc."
__license__ = "Apache, Version 2.0"

class TestLibRunner(VmafossExecQualityRunner):

class LibRunner(VmafossExecQualityRunner):
Contributor Author

Renamed to LibRunner because otherwise pytest issues a warning that TestLibRunner is not a valid test case (which it isn't supposed to be; it's just that the Test class prefix implies, by convention, that it is).
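Roughly, the pytest convention at play (a sketch with made-up class bodies, not the actual vmaf classes):

```python
# pytest collects classes named Test* by default, and warns when it cannot
# collect one (e.g. because the class defines __init__, as runner classes do);
# a name without the Test prefix is simply ignored by collection.
class TestLibRunner:      # collection tries this and warns
    def __init__(self, config):
        self.config = config

class LibRunner:          # not collected, no warning
    def __init__(self, config):
        self.config = config
```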

@@ -28,7 +28,7 @@ def test_with(self):
height=324,
yuv_type='yuv420p'
) as yuv_reader:
self.assertEquals(yuv_reader.file.__class__, file)
assert hasattr(yuv_reader.file, "read")
Contributor Author

The file type no longer exists in python3.
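A sketch of what the check turns into (the opened path is just illustrative):

```python
# python3 has no builtin `file` type (open() returns io objects), so rather
# than comparing classes, the test checks for the behavior it needs.
with open(__file__, "rb") as f:   # any readable file will do
    assert hasattr(f, "read")
```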

@@ -113,7 +113,7 @@ def test_to_score_str(self):
self.result.set_score_aggregate_method(ListStats.total_variation)
self.assertAlmostEquals(self.result.get_result('VMAF_legacy_score'), 6.5901873052628375, places=4)
self.result.set_score_aggregate_method(partial(ListStats.moving_average, n=2))
self.assertItemsEqual(self.result.get_result('VMAF_legacy_score'),
self.assertEqual(list(self.result.get_result('VMAF_legacy_score')),
Contributor Author

A few of those assertSomething methods were removed in python3's unittest...

The consensus is roughly that there are too many of them, they're rather java-like, and they don't really provide much value.

With a modern test runner (like pytest), you don't need them at all.
A simple assert a == b will usually be sufficient (you get a diff of the difference in the failure output too, so there's no need for elaborate functions); see the sketch below.
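A minimal sketch of that style (the helper is made up):

```python
# With pytest's assertion rewriting, a plain `assert a == b` reports a
# readable diff on failure; no unittest helper method is needed.
def compute_scores():                      # hypothetical stand-in
    return (x * 0.5 for x in range(4))

def test_scores_match():
    assert list(compute_scores()) == [0.0, 0.5, 1.0, 1.5]
```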

-rtest/requirements.txt
# TODO: way too many warnings in tests, remove `-p no:warnings` and get rid of them all
commands = pytest {posargs:-vv -p no:warnings -m {env:TEST_MARKER:main} --doctest-modules --cov-report term-missing --cov=src/}
# python -m unittest discover -s test/ -p '*_test.py'
Contributor Author

One can simply comment out the pytest line here and uncomment the unittest one to exercise the tests with unittest instead.

markers =
main: Main tests (excludes 'extra' and 'lib')
extra: Exercise test/extra/ (requires ffmpeg installed)
lib: Exercise test/lib/ (requires testlib installed).
Contributor Author

These just define all the markers used; pytest --markers would show those 3 (with their descriptions). It's a relatively common practice to declare them like this.

pytest looks at the following files for configuration (in that order): pytest.ini, tox.ini, setup.cfg
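For reference, tagging a test with one of these markers looks roughly like this (the test itself is made up):

```python
import pytest

# Illustrative test tagged with the `extra` marker declared above:
# `pytest -m extra` selects it, `pytest -m main` leaves it out.
@pytest.mark.extra
def test_ffmpeg_roundtrip():
    assert True
```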

@li-zhi li-zhi merged commit d54e026 into Netflix:master Jun 11, 2019