
Improve VideoClip testing by inspecting numpy frame arrays #1124

Merged · 23 commits merged into Zulko:master on May 3, 2020

Conversation

@tburrows13 (Collaborator) commented Apr 1, 2020

This initial post is outdated. I have changed the approach of this PR since.

This is an attempt to improve the test suite, by addressing #1114.

Inside tests/test_helper.py, I have included four functions, though only two of them, bitmap_to_clip and clip_frames_equal, should actually be needed by the test suite; each of those depends on one of the others.

This allows easy testing of the actual numpy frames that a VideoClip contains. For example, here is part of the test for the fx margin(), which adds a border around a clip:

    # 2 frames, each one 3x2 pixels
    clip = bitmap_to_clip([["RRR", 
                            "RRR"], 
                           ["RRB", 
                            "RRB"]])    
    # 1 pixel black margin ("O" means black)
    clip2 = margin(clip, mar=1)
    
    # 2 frames, each one 5x4 pixels
    target = bitmap_to_clip([["OOOOO", 
                              "ORRRO", 
                              "ORRRO", 
                              "OOOOO", ],
                             ["OOOOO", 
                              "ORRBO", 
                              "ORRBO", 
                              "OOOOO", ],
                             ])
    assert clip_frames_equal(target, clip2)

Unfortunately black auto-formats the bitmap lists, so in the actual code it isn't quite as obvious what a given frame looks like.
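(As an aside, black can be told to leave such literals alone with # fmt: off / # fmt: on markers, which would preserve the hand-aligned layout. A sketch of that workaround, not something this PR uses:)

    # black leaves everything between these two markers untouched
    # fmt: off
    clip = bitmap_to_clip([["RRR",
                            "RRR"],
                           ["RRB",
                            "RRB"]])
    # fmt: on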

This PR is a work-in-progress. I'll keep adding in tests.

Points for discussion:

  • I think that the test_helper functions could be improved. Does the implementation make sense? Is the usage obvious?

  • The clip_frames_equal function could be turned into a general-purpose __eq__() method on VideoClip (see the sketch after this list). It isn't very performant, because it iterates through every frame in the VideoClip, but it could still be useful.

  • This approach will not work for all fx, particularly ones that rely on a gradient of colours (e.g. headblur()). However, it can also be applied to tests outside of fx.
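For concreteness, here is a minimal sketch of what such a frame-by-frame comparison could look like (hypothetical code, not what this PR ships; it assumes both clips expose duration, fps and get_frame()):

    import numpy as np

    def clips_equal(clip1, clip2):
        """Frame-by-frame equality check; slow, as it decodes every frame."""
        if clip1.duration != clip2.duration:
            return False
        # Sample both clips at clip1's frame rate and compare the raw arrays.
        for t in np.arange(0, clip1.duration, 1.0 / clip1.fps):
            if not np.array_equal(clip1.get_frame(t), clip2.get_frame(t)):
                return False
        return True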

  • I have added suitable tests to the test suite in tests/

  • I have properly documented new or changed features in the documentation or in the docstrings

  • I have properly documented unusual changes to the code in the comments around it

  • I formatted my code using black -t py36

@tburrows13 added the tests and WIP labels on Apr 1, 2020
@coveralls commented Apr 1, 2020

Coverage Status

Coverage increased (+1.3%) to 66.14% when pulling 4a09124 on tburrows13:test-coverage into 850a40e on Zulko:master.

:return:
"""
frame_list = bitmap_to_frame_list(bitmap_frames, color_dict=color_dict)
make_frame = lambda t: frame_list[t]
Collaborator:

As a rule of thumb, don't assign lambdas to variable names, because their whole spirit is to be "anonymous". Pass it directly as the make_frame argument or, better, define it properly with def.

By the way, this would be a warning in pycodestyle / flake8 linting. Shouldn't we use it in our CI?
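The def version being suggested would be, roughly:

    def make_frame(t):
        return frame_list[t]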

Collaborator Author:

Will do, thanks for catching that one. They are caught by flake8 in the Travis build, but not surfaced anywhere because there are 500 or so warnings (https://travis-ci.org/github/Zulko/moviepy/jobs/669677296#L381). Perhaps something to look at, deciding which warnings to alert on and which to ignore?
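A hypothetical starting point for that triage, as a setup.cfg snippet (the selected codes are just examples; E731 is the assigned-lambda warning raised above):

    [flake8]
    max-line-length = 88
    # Alert only on a curated set of codes to begin with.
    # E731 = assigned lambda, F401 = unused import, F811 = redefinition.
    select = E731, F401, F811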

tests/test_helper.py (outdated review thread, resolved)
return li


def clip_frames_equal(clip1, clip2):
Collaborator:

Two observations here. Firstly, I guess this is basically what numpy's testing assertions do:

https://docs.scipy.org/doc/numpy/reference/generated/numpy.testing.assert_equal.html
and
https://docs.scipy.org/doc/numpy/reference/generated/numpy.testing.assert_allclose.html#numpy.testing.assert_allclose

Anyway, an interesting approach would be to define __eq__ in the Clip class to make this frame-by-frame comparison, and to implement the verbose output at the pytest level:
https://docs.pytest.org/en/latest/assert.html#defining-your-own-explanation-for-failed-assertions
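A sketch of that pytest hook, placed in conftest.py (illustrative only; it assumes Clip.__eq__ performs the frame comparison as suggested):

    from moviepy.Clip import Clip

    def pytest_assertrepr_compare(config, op, left, right):
        """Tell pytest how to explain a failed Clip == Clip assertion."""
        if op == "==" and isinstance(left, Clip) and isinstance(right, Clip):
            return [
                "Clip frame-by-frame comparison failed",
                "   durations: {} vs {}".format(left.duration, right.duration),
            ]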

tests/test_helper.py (outdated review thread, resolved)
@tburrows13 (Collaborator Author)

I've just pushed a rather major rewrite of this PR, partially based on @mgaitan's feedback. The main feature is a new BitmapClip(), which takes over the role of bitmap_to_clip that I had in the first iteration of this PR. Hopefully the new API makes sense to others, and it might be useful in general usage as well.

It also includes Clip.__eq__() instead of my clip_frames_equal(), which also seems like a better API.

There is also a small bug fix for the fx freeze when t="end" is used: it was previously off by one frame.

Finally, more tests have been updated to the new system, and they run a lot faster (e.g. test_crop() previously wrote six 1-second videos to disk, taking around 16 seconds on my machine; now it runs in less than a second and actually checks that it is cropping the right things).
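For anyone skimming, a usage sketch of the new class (the exact construction may differ slightly from the merged code; per the discussion point below, set_fps() also fills in duration):

    from moviepy.video.VideoClip import BitmapClip
    from moviepy.video.fx.mirror_x import mirror_x

    # Two 1-frame clips; each frame is 2x2 pixels ("R" = red, "G" = green).
    clip = BitmapClip([["RG",
                        "RG"]]).set_fps(1)
    target = BitmapClip([["GR",
                          "GR"]]).set_fps(1)

    # mirror_x flips each frame horizontally; Clip.__eq__ compares the frames.
    assert mirror_x(clip) == target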

Points for discussion:

  • BitmapClip API and implementation. Does it contain everything that it should? What about the set_fps() and set_duration() methods each setting the other attribute as well? Is this intuitive, or is there a better way?
  • Clip equality implementation.

This is still a WIP, I'd like to add further tests.

@tburrows13 added the feature label on Apr 4, 2020
@tburrows13 added this to the Release v2.0.0 milestone on Apr 5, 2020
tburrows13 and others added 15 commits April 24, 2020 02:09
Fix bitmap_to_clip duration
Change test_aspect_ratio to use new helpers
* Extend scroll docstring [WIP]
* Add BitmapClip to VideoClip.py (and import into editor.py)
* Add clip.__eq__() to allow frame by frame equality comparisons
* New tests for crop, invert_colors, mirror_x, mirror_y
Rename t -> duration
@tburrows13 (Collaborator Author) commented Apr 24, 2020

There are always more tests that could be written, but I've done enough for this PR, so I'm happy for it to be merged. In particular, I've not included any tests for masks.

With this PR, the time taken for test_fx.py to run on my computer has gone from 4 minutes to 1 minute (it saves Travis about 50 seconds). On top of that, most effects are actually being checked to see if they perform correctly, and some effects were not being tested at all before.

Whilst creating these tests, I found a bug that caused even_size() to actually create clips of odd sizes, and an error/warning in freeze() and time_mirror() caused by not zero-indexing the effect. This was causing warnings in all of our Travis builds before.
I have also made rotate() use the more efficient numpy rotations for any multiple of 90 degrees, rather than requiring PIL and limiting the fast path to just -180, -90, 90 and 180.
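The numpy primitive involved is np.rot90, which rotates an array in 90-degree steps without touching PIL; for example:

    import numpy as np

    frame = np.arange(12).reshape(3, 4)  # stand-in for a 3x4 single-channel frame
    rotated = np.rot90(frame, k=3)       # three 90° turns counter-clockwise == -90°
    assert rotated.shape == (4, 3)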

@tburrows13 linked an issue on Apr 24, 2020 that may be closed by this pull request
@tburrows13 merged commit 90e42c1 into Zulko:master on May 3, 2020
@tburrows13 deleted the test-coverage branch on May 14, 2020
@mondeja mentioned this pull request on Jan 21, 2021
Development

Successfully merging this pull request may close these issues.

Assertions in tests are very fragile
3 participants