
Test passes but is reported as failed #59

Open
Diaoul opened this issue Nov 22, 2023 · 8 comments

Comments


Diaoul commented Nov 22, 2023

  • The output shows the test passed (with some warnings)
  • Yet the test is reported as failed

Funny thing:

  • When debugging the test, I have the right report status (success)

How can I troubleshoot this?


Diaoul commented Nov 22, 2023

pytest_runtest_makereport is never called.
I'm assuming it's because one of the numerous plugins returns a non-None value, which stops the hook call chain...
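For context, that stop-at-first-result behavior is how pluggy treats "firstresult" hooks such as pytest_runtest_makereport: implementations are called in order until one returns a non-None value, and the rest are skipped. A minimal sketch of that semantics in plain Python (not pluggy itself; the function and plugin names are made up for illustration):

```python
# Sketch of pluggy's "firstresult" hook semantics: call each registered
# implementation in order and stop as soon as one returns non-None.

def call_firstresult(hook_impls, *args):
    """Call hook implementations until one returns a non-None result."""
    for impl in hook_impls:
        result = impl(*args)
        if result is not None:
            return result  # implementations after this one never run
    return None

called = []

def plugin_a(item):
    called.append("a")
    return None  # returning None lets the next plugin run

def plugin_b(item):
    called.append("b")
    return {"report": "made by b"}  # non-None: stops the chain here

def my_plugin(item):
    called.append("mine")  # never reached

result = call_firstresult([plugin_a, plugin_b, my_plugin], "test_item")
print(called)   # ['a', 'b'] -- my_plugin was never invoked
print(result)   # {'report': 'made by b'}
```

In real pytest, registering your implementation as a hook wrapper (@pytest.hookimpl(hookwrapper=True)) makes it run around the call regardless of what other plugins return, which is one way to guarantee it is always executed.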

plugins: ddtrace-1.12.3, Flask-Dance-5.1.0, mock-3.10.0, rerunfailures-10.2, Faker-13.15.0, snapshottest-0.6.0, unordered-0.5.1, find-dependencies-0.5.2, anyio-3.7.1, typeguard-2.13.3, subtests-0.8.0, xdist-3.3.1, cov-4.0.0, socket-0.5.1, archon-0.0.5

Do you have any idea how to circumvent that?


Diaoul commented Nov 22, 2023

Actually, this seems to be caused by running pytest with -n 2.
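This would be consistent with how pytest-xdist works: with -n 2, tests and their per-test hooks (including pytest_runtest_makereport) run in worker processes, while the controller process only receives serialized reports. A hedged sketch of how a conftest.py could check which process it is in (the helper name is mine; the PYTEST_XDIST_WORKER environment variable is what pytest-xdist sets in its workers):

```python
import os

def running_on_xdist_worker(environ=os.environ):
    """Return True when executing inside a pytest-xdist worker process.

    pytest-xdist sets PYTEST_XDIST_WORKER (e.g. "gw0", "gw1") in each
    worker's environment; the controller process does not have it.
    """
    return "PYTEST_XDIST_WORKER" in environ

# The controller environment lacks the variable...
print(running_on_xdist_worker({}))                               # False
# ...while a worker spawned as "gw0" has it set.
print(running_on_xdist_worker({"PYTEST_XDIST_WORKER": "gw0"}))   # True
```

Logging this from the hook in question would show whether it fires in the workers, the controller, or neither.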


Diaoul commented Nov 22, 2023

It's documented here; I'm unsure what the fix could be...


Diaoul commented Nov 22, 2023

I've tested by writing to a file, but the hook does not seem to be executed at all, even on the worker 🤔
Maybe the other one is the cause, then.


Diaoul commented Nov 22, 2023

As a workaround, I've created a .pytest.ini that sets -n 0 and overrides the setting from pyproject.toml:

[pytest]
addopts = -n 0


martinparadiso commented Dec 15, 2023

Hello, I've encountered the same problem these days, without running in parallel. Minimal working example:

import pytest

@pytest.mark.parametrize('val', [1, 2])
def test_param(val):
    assert val in (1, 2)

The output panel shows success, but neotest.log at trace level shows the following:

TRACE | 2023-12-15T02:50:43Z+0000 | ...hare/nvim/lazy/neotest/lua/neotest/client/state/init.lua:71 | {
  ["/home/martin/test/test_param.py::test_param"] = {
    errors = {},
    output = "/tmp/nvim.martin/sfi7Rf/2",
    status = "failed"
  },
  ["/home/martin/test/test_param.py::test_param[1]"] = {
    errors = {},
    output = "/tmp/nvim.martin/sfi7Rf/2",
    short = "\27[32m\27[1m____________________________________________________ test_param[1] _____________________________________________________\27[0m\n",
    status = "passed"
  },
  ["/home/martin/test/test_param.py::test_param[2]"] = {
    errors = {},
    output = "/tmp/nvim.martin/sfi7Rf/2",
    short = "\27[32m\27[1m____________________________________________________ test_param[2] _____________________________________________________\27[0m\n",
    status = "passed"
  }
}


There is a non-parametrized element (the first one) that
is marked as failed for some reason. I dug a little into
the codebase but could not track down where it is generated.

Also, messages in the log file seem to be duplicated.

I have not bisected the git history, but the problem is not present at
81d2265, so that commit can be used in the meantime.
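The trace above hints at the cause: parametrization expands test_param into the node IDs test_param[1] and test_param[2], while the bare test_param entry is a grouping node rather than a runnable test, so a status reported against it is spurious. A rough sketch of that ID expansion (the function name is mine, not a pytest API):

```python
def parametrized_ids(base_id, values):
    """Expand a base test ID into pytest-style parametrized node IDs."""
    return [f"{base_id}[{v}]" for v in values]

print(parametrized_ids("test_param.py::test_param", [1, 2]))
# ['test_param.py::test_param[1]', 'test_param.py::test_param[2]']
```

The commit referenced below matches this reading: position IDs with parameters are only emitted when pytest discovery can resolve them.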

rcarriga added a commit that referenced this issue Dec 20, 2023
Only emits position IDs with parameters when pytest discovery is enabled

See #36 and #59
rcarriga (Collaborator) commented

@martinparadiso You hit a separate issue with parameterized tests; it has now been fixed in the latest commit.


Billuc commented Apr 29, 2024

As a workaround I've created a .pytest.ini which sets -n 0 and overrides the setting from pyproject.toml:

[pytest]
addopts = -n 0

I tested this workaround; it also works if I replace the setting in my pyproject.toml.
It's weird, though, that this parameter is linked to the issue.
