
--tb=no should remove tracebacks from xml output #80

Closed
sunshine-syz opened this issue Apr 12, 2022 · 9 comments

@sunshine-syz
Contributor

In pytest you can disable the traceback with --tb=no:
https://docs.pytest.org/en/7.0.x/how-to/output.html#modifying-python-traceback-printing

But in pytest_check we cannot do that. If there are multiple failures in one test, it is hard to read the error message with all the tracebacks.
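
A rough example of the kind of test I mean (the values and names here are just for illustration):

import pytest_check as check

def test_many_checks():
    result = {"a": 1, "b": 2, "c": 3}
    # Every failed check adds its own FAILURE block to the report,
    # each with its own traceback line, so the output gets long quickly.
    check.equal(result["a"], 10)
    check.equal(result["b"], 20)
    check.equal(result["c"], 30)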

@okken
Owner

okken commented Apr 12, 2022

This is a very good point. Are you interested in working on a fix?

@okken okken added the bug label Apr 12, 2022
@sunshine-syz
Contributor Author

I am happy to contribute to this.

pytest has many options for --tb (https://docs.pytest.org/en/7.0.x/how-to/output.html#modifying-python-traceback-printing). Is it necessary to support all of them in pytest-check, or just some of them?

@okken
Owner

okken commented Apr 12, 2022

I was just trying to reproduce this and don't see a problem.

test_foo.py

from pytest_check import check

def test_multiple_failures():
    a = (1, 2, 3)
    b = (3, 2, 1)
    with check:
        assert a == b
    with check:
        assert b == a

Normal multiple failures:

$ pytest test_foo.py        
========================= test session starts ==========================
collected 1 item                                                       

test_foo.py F                                                    [100%]

=============================== FAILURES ===============================
________________________ test_multiple_failures ________________________
FAILURE: assert (1, 2, 3) == (3, 2, 1)
  At index 0 diff: 1 != 3
  Use -v to get more diff
test_foo.py:6 in test_multiple_failures() -> with check:
FAILURE: assert (3, 2, 1) == (1, 2, 3)
  At index 0 diff: 3 != 1
  Use -v to get more diff
test_foo.py:8 in test_multiple_failures() -> with check:
------------------------------------------------------------
Failed Checks: 2
======================= short test summary info ========================
FAILED test_foo.py::test_multiple_failures
========================== 1 failed in 0.02s ===========================

With --tb=no:

$ pytest --tb=no test_foo.py
========================= test session starts ==========================
collected 1 item                                                       

test_foo.py F                                                    [100%]

======================= short test summary info ========================
FAILED test_foo.py::test_multiple_failures
========================== 1 failed in 0.02s ===========================

Isn't this what you were asking for?

@sunshine-syz
Contributor Author

sunshine-syz commented Apr 12, 2022

With --tb=no, the console output is correct. But the command I used is pytest --junitxml=output_pytest.xml --tb=no test_foo.py, because I need to export the results and use them to generate an HTML report. In the XML file, the content is wrong.

When using pytest_check, with or without --tb=no, the XML content always includes the traceback.

import pytest_check as check

def test_multiple_failures():
    a = (1, 2, 3)
    b = (3, 2, 1)

    check.equal(a, b)

XML content:

<?xml version="1.0" encoding="utf-8"?><testsuites><testsuite name="pytest" errors="0" failures="1" skipped="0" tests="1" time="0.047" timestamp="2022-04-12T16:24:07.419257" hostname="yizhous-mbp2"><testcase classname="test_foo" name="test_multiple_failures" time="0.020"><failure message="FAILURE: &#10;assert (1, 2, 3) == (3, 2, 1)&#10;  At index 0 diff: 1 != 3&#10;  Use -v to get the full diff&#10;test_foo.py:8 in test_multiple_failures() -&gt; check.equal(a, b)&#10;------------------------------------------------------------&#10;Failed Checks: 1">FAILURE: 
assert (1, 2, 3) == (3, 2, 1)
  At index 0 diff: 1 != 3
  Use -v to get the full diff
test_foo.py:8 in test_multiple_failures() -&gt; check.equal(a, b)
------------------------------------------------------------
Failed Checks: 1</failure></testcase></testsuite></testsuites>

When just using assert, with --tb=no, the XML content does not include the traceback info.

def test_multiple_failures():
    a = (1, 2, 3)
    b = (3, 2, 1)

    assert a == b

XML content:

<?xml version="1.0" encoding="utf-8"?><testsuites><testsuite name="pytest" errors="0" failures="1" skipped="0" tests="1" time="0.024" timestamp="2022-04-12T16:13:42.053330" hostname="mbp2"><testcase classname="test_foo" name="test_multiple_failures" time="0.001"><failure message="assert (1, 2, 3) == (3, 2, 1)&#10;  At index 0 diff: 1 != 3&#10;  Use -v to get the full diff">E   assert (1, 2, 3) == (3, 2, 1)
      At index 0 diff: 1 != 3
      Use -v to get the full diff</failure></testcase></testsuite></testsuites>

@okken
Owner

okken commented Apr 13, 2022

Ah. Ok.
Regarding "Is it necessary to support all of them in pytest-check or just some of them?", let's start with just supporting --tb=no and take it from there.

@sunshine-syz
Contributor Author

PR created. #81

@okken
Owner

okken commented Apr 20, 2022

Thank you. I’ll take a look later this week.

@okken okken added enhancement and removed bug labels Aug 21, 2022
@okken
Owner

okken commented Aug 21, 2022

Changing title from "Cannot disable the traceback in the error message"
to "--tb=no should remove tracebacks from xml output"

@okken okken changed the title Cannot disable the traceback in the error message --tb=no should remove tracebacks from xml output Aug 21, 2022
@okken
Owner

okken commented Sep 29, 2022

Fixed in version 1.0.10.

@okken okken closed this as completed Sep 29, 2022