[BUG] schema decorator does not play nice with pytest marks #1378

Closed · schicks opened this issue Feb 2, 2022 · 1 comment · Fixed by #1441

schicks commented Feb 2, 2022

Describe the bug
Because schemathesis can be quite slow and requires a testing version of the API to be set up, we keep it behind pytest.mark.acceptance to avoid running it with the unit tests. Unfortunately, the schema initializes when we set the schema variable, rather than when the test runs. To get around that, we use schemathesis.from_pytest_fixture to get the schema. When pytest.mark.acceptance is applied above schema.parametrize, the tests hang forever. When it is applied below schema.parametrize, the schema is evaluated anyway and errors out because we don't have a test API running.
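
For context, the eager form that triggers the original problem looks roughly like this (a sketch; the URL is only a placeholder):

import schemathesis

# Eager variant: the schema is fetched the moment the module is imported,
# i.e. during collection, even for runs that deselect the acceptance mark.
schema = schemathesis.from_uri("http://testserver/openapi.json")

@schema.parametrize()
def test_schema(case):
    ...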

To Reproduce
Steps to reproduce the behavior:

import pytest

import schemathesis


@pytest.fixture
def web_app():
    return schemathesis.from_uri(...)


schema = schemathesis.from_pytest_fixture("web_app")


@pytest.mark.acceptance
@schema.parametrize()
def test_schema(case):
    ...

When this is run, the tests hang forever. When the decorators on the test are reversed, the schema is evaluated eagerly and the tests fail whenever the test API is not running.

This does not depend on the API schema, since the whole point is that the test API is not running.

Expected behavior
When the acceptance mark is deselected in the example above, the run should pass without evaluating the schema. When the acceptance mark is selected, the tests should not stall out (and should also, eventually, pass, as long as the test API is running in that context).

Environment (please complete the following information):

  • OS: Debian docker container
  • Python version: 3.8.5
  • Schemathesis version: 3.12.3
  • Spec version: NA

Additional context
Any other workaround that allows us to skip this test with marks in a sane way would also be a welcome solution.
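
For example, a possible stopgap (an untested sketch; RUN_ACCEPTANCE is a made-up environment variable, and any marker- or option-based switch would work the same way) would be to skip inside the fixture itself, so the schema is never fetched unless acceptance tests are explicitly enabled:

import os

import pytest

import schemathesis


@pytest.fixture
def web_app():
    # RUN_ACCEPTANCE is a hypothetical opt-in switch for acceptance runs.
    if not os.environ.get("RUN_ACCEPTANCE"):
        pytest.skip("test API is not running; acceptance tests are disabled")
    return schemathesis.from_uri(...)


schema = schemathesis.from_pytest_fixture("web_app")

This only covers the "deselected" half of the problem, though; the hang with the mark above schema.parametrize would still need a fix in Schemathesis itself.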

schicks added the Status: Needs Triage and Type: Bug labels on Feb 2, 2022
Stranger6667 added a commit that referenced this issue Apr 11, 2022
…chema.parametrize` if the schema is created via `from_pytest_fixture`

Ref: #1378
Stranger6667 (Member) commented

Hi @schicks

Sorry for not getting back to you earlier. What is the pytest version you're using? I created a similar test file and can't reproduce the behavior - the web_app fixture is not evaluated, and the relevant test is deselected as expected when I use -m "not acceptance":

import pytest

import schemathesis


@pytest.fixture
def web_app():
    1 / 0


schema = schemathesis.from_pytest_fixture("web_app")


@pytest.mark.acceptance
@schema.parametrize()
def test_schema(case):
    1 / 0

Deselecting:

pytest -m "not acceptance" example.py
=================================== test session starts ===================================
platform linux -- Python 3.10.4, pytest-6.2.5, py-1.11.0, pluggy-1.0.0
rootdir: /tmp
plugins: mock-1.13.0, schemathesis-3.13.3, httpserver-1.0.3, xdist-1.34.0, forked-1.4.0, hypothesis-6.41.0, anyio-3.4.0, subtests-0.5.0, asyncio-0.11.0
collected 1 item / 1 deselected                                                           

==================================== warnings summary =====================================
example.py
  /tmp/example.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.acceptance - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
    @pytest.mark.acceptance

-- Docs: https://docs.pytest.org/en/stable/warnings.html
============================ 1 deselected, 1 warning in 0.01s =============================

Selecting:

 pytest -m "acceptance" /tmp/example.py 
=================================== test session starts ===================================
platform linux -- Python 3.10.4, pytest-6.2.5, py-1.11.0, pluggy-1.0.0
rootdir: /tmp
plugins: mock-1.13.0, schemathesis-3.13.3, httpserver-1.0.3, xdist-1.34.0, forked-1.4.0, hypothesis-6.41.0, anyio-3.4.0, subtests-0.5.0, asyncio-0.11.0
collected 1 item                                                                          

example.py E                 [100%]

========================================= ERRORS ==========================================
______________________________ ERROR at setup of test_schema ______________________________

    @pytest.fixture
    def web_app():
>       1 / 0
E       ZeroDivisionError: division by zero

/tmp/example.py:8: ZeroDivisionError
==================================== warnings summary =====================================
example.py:14
  /tmp/example.py:14: PytestUnknownMarkWarning: Unknown pytest.mark.acceptance - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/stable/mark.html
    @pytest.mark.acceptance

-- Docs: https://docs.pytest.org/en/stable/warnings.html
================================= short test summary info =================================
ERROR ...
=============================== 1 warning, 1 error in 0.08s ===============================
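
By the way, the PytestUnknownMarkWarning in both runs just means the acceptance marker isn't registered; registering it, e.g. in conftest.py, should silence it (a minimal sketch, with an assumed marker description):

# conftest.py
def pytest_configure(config):
    # Register the custom "acceptance" marker so pytest no longer warns about it.
    config.addinivalue_line("markers", "acceptance: tests that require a running test API")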

Environment:

  • Python 3.10.4
  • pytest 6.2.5
  • latest available Schemathesis version

Though, when the decorators' order is reversed, the marks are indeed ignored :( It should be solved by #1441

Am I missing something?
