
Skip Decorators: fix issues when setUp or the test function would/would not be run #4528

Conversation

clebergnu
Contributor

There are two different issues with the skip decorators fixed here:

  1. the condition for executing setUp() methods was not being evaluated dynamically
  2. when no skip was to be performed, the test function was also not being run

Signed-off-by: Cleber Rosa <crosa@redhat.com>
When no condition for skipping exists, the test should *not* be
skipped, that is, it should run.  But the current decorator was not
executing the decorated function.

This was masked by the fact that the default behavior for tests that
do nothing is to pass, so by not running the decorated function (the
test) it would effectively pass.  Now the tests make sure that the
test method gets executed (and performs a cancel when run).

Signed-off-by: Cleber Rosa <crosa@redhat.com>
This adds a new parameter, which signals how the result of the
condition should be evaluated, that is, whether it should be negated.

This makes all the logic available to the decorator, and will be
required later.

Signed-off-by: Cleber Rosa <crosa@redhat.com>
There's currently a bug caused by the fact that the decorated
function is always marked (__skip_method_decorator__) to be skipped,
even when the condition is supposed to be evaluated later, during
the execution of the decorator.

This has caused the setUp() method to not be executed in certain
cases, so the test method proceeds without finding its environment
set up (a common situation when setUp() is not run: one ends up with
missing attributes in the class and cryptic errors during the
execution of the test).

This evaluates the same conditions used for skipping the test also
for skipping the setUp() method.  Unfortunately, we have to
distinguish between the condition and the decorator itself, because
if we don't, we may end up running the decorated function (usually
the test) twice.

Signed-off-by: Cleber Rosa <crosa@redhat.com>
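A minimal sketch of the intended behavior (hypothetical helper and attribute names, not the actual Avocado implementation): the skip condition is stored on the wrapper and evaluated at call time, and setUp() consults that same condition instead of a decoration-time mark, so the environment is built exactly when the test will run.

```python
def skip_if_dynamic(condition):
    """Sketch: store the condition on the wrapper and evaluate it at
    call time, so state created after decoration is visible."""
    def decorator(func):
        def wrapper(self, *args, **kwargs):
            if condition(self) if callable(condition) else condition:
                raise RuntimeError("skipped")
            return func(self, *args, **kwargs)
        wrapper.__skip_condition__ = condition  # hypothetical marker
        return wrapper
    return decorator

class FakeTest:
    """Tiny stand-in for a test class whose setUp() checks the same
    skip condition used by the test method."""
    def __init__(self, method_name):
        self._method = getattr(self, method_name)
        self.vm = None

    def setUp(self):
        cond = getattr(self._method, "__skip_condition__", None)
        will_skip = cond(self) if callable(cond) else bool(cond)
        if will_skip:
            return  # test will be skipped: don't build the environment
        self.vm = "running-vm"  # environment the test depends on

    @skip_if_dynamic(lambda self: False)
    def test_vm(self):
        # without the fix, setUp() was wrongly bypassed and self.vm
        # was left unset, producing 'has no attribute' style errors
        return self.vm

t = FakeTest("test_vm")
t.setUp()
print(t.test_vm())
```

Checking the stored condition (rather than calling the already-decorated method) is what avoids running the test body twice, as noted in the commit message.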
@clebergnu clebergnu added the bug label Apr 13, 2021
@clebergnu clebergnu added this to the #87 (Braveheart) milestone Apr 13, 2021
@clebergnu clebergnu added this to Review Requested in Avocado Kanban via automation Apr 13, 2021
@codeclimate

codeclimate bot commented Apr 13, 2021

Code Climate has analyzed commit 3c3f30d and detected 0 issues on this pull request.

The test coverage on the diff in this pull request is 33.3% (50% is the threshold).

This pull request will bring the total coverage in the repository to 68.2% (0.0% change).

View more on Code Climate.

@clebergnu
Contributor Author

This was before this fix:

 (10/36) tests/acceptance/machine_m68k_nextcube.py:NextCubeMachine.test_bootrom_framebuffer_size:  ERROR: 'NextCubeMachine' object has no attribute '_vms' (0.03 s)
 (11/36) tests/acceptance/machine_m68k_nextcube.py:NextCubeMachine.test_bootrom_framebuffer_ocr_with_tesseract_v3:  SKIP: tesseract v3 OCR tool not available
 (12/36) tests/acceptance/machine_m68k_nextcube.py:NextCubeMachine.test_bootrom_framebuffer_ocr_with_tesseract_v4:  ERROR: 'NextCubeMachine' object has no attribute '_vms' (0.03 s)

And with this fix:

 (10/36) tests/acceptance/machine_m68k_nextcube.py:NextCubeMachine.test_bootrom_framebuffer_size:  PASS (2.24 s)
 (11/36) tests/acceptance/machine_m68k_nextcube.py:NextCubeMachine.test_bootrom_framebuffer_ocr_with_tesseract_v3:  SKIP: tesseract v3 OCR tool not available
 (12/36) tests/acceptance/machine_m68k_nextcube.py:NextCubeMachine.test_bootrom_framebuffer_ocr_with_tesseract_v4:  PASS (3.55 s)

@clebergnu clebergnu merged commit 30e274d into avocado-framework:master Apr 13, 2021
Avocado Kanban automation moved this from Review Requested to #87 (Braveheart) Apr 13, 2021