
Writing tests against testdir.runpytest() and local plugin (no tox) #47

Closed · lhayhurst opened this issue Dec 26, 2020 · 6 comments

lhayhurst commented Dec 26, 2020

Hi, I found this project after reading your great Python Testing with pytest book.

After downloading the project locally, I was able to run tox and have all the tests pass. However, when I try to run the tests with a local python3 instance (I'm using an activated conda env), the "integration"-style tests that inject Python scripts via the testdir fixture all fail.

For example, I wrote this test trying to reproduce #43 and #44:

def test_mix_of_checks_and_asserts_are_reported_ok(testdir):
    testdir.makepyfile(
        """
        from pytest_check import check

        def test_mix():
            with check:
                assert 1 == 2
            assert 2 == 3    
        """
    )
    result = testdir.runpytest()
    result.assert_outcomes(failed=1, passed=0)
    # code to traverse the test report output and verify that both assertions failed 
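
One rough way to fill in that last step, assuming both failure messages end up in the captured output of the inner run (the exact wording depends on how pytest-check renders check failures, so the substrings below are placeholders):

    output = result.stdout.str()
    assert "1 == 2" in output  # failure recorded inside the check context manager
    assert "2 == 3" in output  # the plain assert after the with block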

When using a local (non-tox) pytest (in my case, from the conda env's ./envs/default/bin), for example:

PYTHONPATH=src envs/default/bin/pytest --trace-config tests/test_check_context_manager.py::test_mix

The --trace-config output shows me that the pytest-check plugin is being loaded, but it isn't being loaded in the inner run started via testdir.runpytest. Can I get any of the tests that use testdir to pass without tox, using a project-local pytest instance run from the command line instead?

Thank you!

lhayhurst (Author) commented

Just reporting a little bit of progress: the following test loads the plugin by passing args to testdir.runpytest, but, interestingly, the report shows the two inner tests passing when they should both fail.

def test_mix_of_checks_and_asserts_are_reported_ok(testdir):
    testdir.makepyfile(
        """
        from pytest_check import check

        def test_foo():
            with check:
                assert 4 == 5

        def test_mix():
            with check:
                assert 1 == 2
            # assert 2 == 3    
        """
    )
    result = testdir.runpytest('--trace-config', '-ppytest_check')
    result.assert_outcomes(failed=2, passed=0)

PYTHONPATH=src envs/default/bin/pytest tests/test_check_context_manager.py::test_mix_of_checks_and_asserts_are_reported_ok

Outputs:

collected 2 items

test_mix_of_checks_and_asserts_are_reported_ok.py ..                     [100%]

============================== 2 passed in 0.02s ===============================
==================================== short test summary info ====================================
FAILED tests/test_check_context_manager.py::test_mix_of_checks_and_asserts_are_reported_ok - A...
======================================= 1 failed in 0.06s =======================================

The output of the embedded run appears to be a clue: "2 passed in 0.02s" (should be two failed).
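
While debugging, the embedded run can be inspected directly from the outer test; stdout.str() and parseoutcomes() are part of pytest's RunResult API (the expected counts below are what the test wants, not what currently happens):

    print(result.stdout.str())                      # full output of the inner pytest run
    assert result.parseoutcomes() == {"failed": 2}  # currently comes back as {"passed": 2}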

okken commented Dec 27, 2020

@lhayhurst Could you include the pytest header output with the versions?
With Python 3.9.1 and pytest 6.2.1 on macOS, I'm able to run the suite with plain pytest:

(pytest-check) $ pytest
============================= test session starts ==============================
platform darwin -- Python 3.9.1, pytest-6.2.1, py-1.10.0, pluggy-0.13.1
rootdir: /Users/okken/projects/pytest-check
plugins: pytest_check-0.4.0
collected 37 items                                                             

tests/test_check.py .........................                            [ 67%]
tests/test_check_context_manager.py ......                               [ 83%]
tests/test_check_errors.py ..                                            [ 89%]
tests/test_check_fixture.py .                                            [ 91%]
tests/test_check_func_decorator.py ...                                   [100%]

============================== 37 passed in 0.97s ==============================

okken commented Dec 27, 2020

@lhayhurst Also, do you have your local version of pytest-check installed while running the tests?

tox will create a new virtual env, then install the local pytest-check package, then run tests.
Without tox, you have to install pytest-check yourself.

After creating a new virtual environment, from the pytest-check directory, I run pip install -e .
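
Roughly the manual equivalent of what tox does, as a sketch (a venv-based setup is assumed here; a conda user would substitute their own env tooling):

    python -m venv venv
    source venv/bin/activate
    pip install -e .    # editable install of the local pytest-check; pulls in pytest as a dependency
    pytest              # inner runs started by testdir.runpytest now see the installed plugin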

My pip list and pip freeze output look like this:

(pytest-check) $ pip list --not-required   
Package      Version
------------ -------
pip          20.3.3
pytest-check 0.4.0
setuptools   46.4.0
wheel        0.34.2
(pytest-check) $ pip freeze                
attrs==20.3.0
iniconfig==1.1.1
packaging==20.8
pluggy==0.13.1
py==1.10.0
pyparsing==2.4.7
pytest==6.2.1
pytest-check @ file:///Users/okken/projects/pytest-check
toml==0.10.2

lhayhurst commented Dec 27, 2020

Hi, I'm on pytest 6.2.1 and Python 3.7. I did not have the local version of pytest-check pip-installed when running the tests -- I'll try that now.

lhayhurst commented Dec 27, 2020

That did the trick! Thank you!

Also, when I made this change to conftest.py's pytest_runtest_makereport, the new test quoted above (the one that makes sure you get two failures) starts passing. I was just playing around, but if this looks like an OK fix (for #43 and #44), I can submit a PR.

            # Append to an existing longrepr instead of overwriting it, so the
            # original assert failure is kept alongside the check failures.
            if report.longrepr:
                report.longrepr.append("\n".join(longrepr))
            else:
                report.longrepr = "\n".join(longrepr)
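
For context, a rough sketch of where a fragment like that sits, assuming the usual pytest_runtest_makereport hookwrapper shape; the _failures list and the surrounding logic are placeholders, not pytest-check's actual code:

    import pytest

    # hypothetical stand-in for the per-test check failures pytest-check accumulates
    _failures = []

    @pytest.hookimpl(hookwrapper=True)
    def pytest_runtest_makereport(item, call):
        outcome = yield
        report = outcome.get_result()
        longrepr = list(_failures)
        if longrepr and report.when == "call":
            report.outcome = "failed"
            # the fragment above: append to an existing longrepr instead of overwriting it
            if report.longrepr:
                report.longrepr.append("\n".join(longrepr))
            else:
                report.longrepr = "\n".join(longrepr)
        _failures.clear()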

Feel free to close this ticket out; I appreciate your time (and the excellent book!).

okken commented Dec 27, 2020

The longrepr trick doesn't work on newer versions of pytest.
I'm working on a fix for this issue. It was reported twice already, so I'm closing this.

okken closed this as completed Dec 27, 2020