
Adding pytest style custom markers to Behave code ? #1129

Closed
TheJester12121 opened this issue Aug 20, 2023 · 7 comments
@TheJester12121

When using pytest, I can make pytest aware of a special marker.

[pytest]
markers =
    test_id: The id of the test as known in the requirements management system.

This marker is then included in the JUnit output when tests are started like this:

pytest -rA --show-capture=all --junitxml=./test-run-results.xml

Now, when I create the test infrastructure using Behave, I would like to do something similar.
The JUnit output is already available, so that's nice.
But I cannot find an option in the docs to register this with Behave and make this info available in the JUnit XML file.

I found some info on doing it from the Allure report, but I'm not sure that's the way to go.
That info is here:

Did I miss anything?

@jenisys
Member

jenisys commented Aug 20, 2023

  • I assume you want the tag data as a testcase property (by splitting up each tag into name = value).
  • If you need something else, say so. This essential information, how you want your information transported, is not described above.

OTHERWISE:

  • You can easily post-process the JUnit XML output and rewrite it using various Python modules that support such functionality (examples: junitparser, junit-xml, …); a sketch follows below.
  • You can add another Formatter that provides JUnitXML output (may reuse the JUnitReporter class)
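
For illustration, a minimal sketch of such post-processing, using only the standard library's xml.etree.ElementTree (junitparser and junit-xml offer higher-level APIs for the same job). It assumes a scenario tag of the form @test_id=NN shows up in the <system-out> CDATA, as it does in newer behave versions; the file path and tag name are illustrative, not behave defaults:

# Sketch: promote a "@test_id=NN" tag found in the <system-out> CDATA
# into a <properties>/<property> element on each <testcase>.
import re
import xml.etree.ElementTree as ET

TAG_PATTERN = re.compile(r"@test_id=(\w+)")

def promote_test_id(xml_path):
    tree = ET.parse(xml_path)
    for testcase in tree.getroot().iter("testcase"):
        match = TAG_PATTERN.search(testcase.findtext("system-out") or "")
        if match is None:
            continue
        # Reuse an existing <properties> element or create one.
        properties = testcase.find("properties")
        if properties is None:
            properties = ET.Element("properties")
            testcase.insert(0, properties)  # JUnit schema puts properties first
        ET.SubElement(properties, "property",
                      name="test_id", value=match.group(1))
    tree.write(xml_path, encoding="utf-8", xml_declaration=True)

promote_test_id("reports/TESTS-tutorial.xml")  # path is an example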

@jenisys
Member

jenisys commented Aug 20, 2023

I just re-evaluated pytest markers with JUnitXML output (by using: pytest -rA --show-capture=all ...):

  • pytest only adds information to JUnit XML for known markers, like: skip, xfail, ... (in: <skipped> XML element)
  • pytest does not add any marker related information for user-defined markers for passing tests
  • pytest shows user-defined markers for failing tests in <failure> XML element (as marker-decorators)

THEREFORE:

  • Specify where you expect that information to be added in the JUnit XML report (and which info)!

NOTES:

  • behave: Scenario tags are already shown in CDATA part of JUnit XML output for newer behave versions (like: behave v1.2.7.dev5)

@TheJester12121
Author

Thanks

First of all, thanks for taking the time to check with me.

As per your second reply:

Not sure we're talking about the same thing, but what I meant is shown below.

We register the new marker by adding it to a pytest.ini that lives in the folder containing the test file and is automatically parsed by pytest on startup.

tj@tj-ubuntu:~/test$ 
tj@tj-ubuntu:~/test$ cat pytest.ini 
[pytest]
markers =
    test_id:The id of the test as known in RMS.
   
tj@tj-ubuntu:~/test$ 

Then the test code:

tj@tj-ubuntu:~/test$ cat test.py 
import pytest

@pytest.mark.test_id("839")
def test_1():
    assert 12 == 12
tj@tj-ubuntu:~/test$ 

Execution and manually formatted output:

tj@tj-ubuntu:~/test$ pytest -rA --show-capture=all --junitxml=./test-run-results.xml test.py 
== test session starts ==
platform linux -- Python 3.11.4, pytest-7.2.1, pluggy-1.0.0+repack
rootdir: /home/tj/test, configfile: pytest.ini
collected 1 item                                                                                                                                     

test.py .                                                                                                                                      [100%]

== PASSES ==
-- generated xml file: /home/tj/test/test-run-results.xml --
== short test summary info ==
PASSED test.py::test_1
== 1 passed in 0.01s ==
tj@tj-ubuntu:~/test$ 
tj@tj-ubuntu:~/test$ 

And the final result (with manually formatted XML output):

tj@tj-ubuntu:~/test$ 
tj@tj-ubuntu:~/test$ cat ./test-run-results.xml 
<?xml version="1.0" encoding="utf-8"?>
<testsuites>
<testsuite name="pytest" errors="0" failures="0" skipped="0" tests="1" time="0.017" timestamp="2023-08-20T23:03:56.208288" hostname="tj-ubuntu">
<testcase classname="test" name="test_1" time="0.000">
<properties>
<property name="test_id" value="839" />
</properties>
</testcase>
</testsuite>
</testsuites>
tj@tj-ubuntu:~/test$ 
tj@tj-ubuntu:~/test$ 

As per your first reply:

Somehow I need to know which tests failed and which passed, so that I can pass that info along to the requirements management system. In the above example, I can add the marker (in the code) to learn that, because the pass/fail results are in the XML together with the marker id. After the run, another Python script (call it my notifier) takes that XML output, goes through the list of results, and tells the RMS each result together with its id.

If the TAG can be added to the tests in the Behave-style Python code, and that TAG is included in whatever XML block inside the XML output file, then I'm all set and I just have to include parsing of that block in the notifier.

I will check how the TAG can be used and see if it ends up in the XML.

Thanks for now. Cheers!

@jenisys
Member

jenisys commented Aug 21, 2023

Thanks, but in pytest:

  • the markers list is only used for strict marker checking and CLI diagnostics (IMHO)
  • My POC with a test_example.py with a marker-decorator @pytest.mark.test_id(123) did not lead to a property in the JUnitXML file (same for a similar marker-decorator that had a string arg)
  • Only when I used record_property(), which is DEPRECATED for the xunit2 dialect, did I get a property as part of the <testcase> XML element
  • USING: pytest v7.4.0 with Python 3.10 (and test_id in the markers list param in the config-file)

RELATED TO: behave

  • To get the equivalent of what you want to achieve in behave, you need a Scenario.record_property() method (as simplified functionality)
  • The JUnit reporter can access the recorded properties and store them as testcase properties in the JUnitXML file
  • Other formatters can access the recorded properties and store/provide them in the output format in a sensible way
  • THEN: You can use the before_tag() hook to call ctx.scenario.record_property() and store the related information (for some properties); a sketch follows below
  • TAG EXAMPLES: @test_id=123 or @test_id:123 or @test_id.123 or …
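
As a sketch of that idea (note: Scenario.record_property() is the API proposed in #1112 and is not part of released behave versions, so this will not run as-is today):

# environment.py -- sketch only: record_property() is the proposed API
# from issue #1112, NOT available in released behave versions.
def before_tag(context, tag):
    # Split tags of the form "test_id=123" into a name/value pair.
    if "=" not in tag:
        return
    name, value = tag.split("=", 1)
    scenario = getattr(context, "scenario", None)  # unset for feature tags
    if scenario is not None:
        scenario.record_property(name, value)  # proposed API, see #1112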

RELATED TO: allure test reports

@jenisys
Member

jenisys commented Aug 21, 2023

HINT:
The core functionality of Scenario.record_property() (or similar) is already covered by the following feature request: #1112

@TheJester12121
Author

Your remark on record_property in pytest reminded me that you indeed need to call it for the specific properties to end up in the XML output; sorry, I forgot that.

tj@tj-ubuntu:~/test$ cat conftest.py 
import pytest

def pytest_collection_modifyitems(session, config, items):
    # Copy each test's "test_id" marker argument into user_properties,
    # which pytest's junitxml plugin writes out as <property> elements.
    for item in items:
        for marker in item.iter_markers(name="test_id"):
            test_id = marker.args[0]
            item.user_properties.append(("test_id", test_id))
tj@tj-ubuntu:~/test$ 
tj@tj-ubuntu:~/test$ 

@TheJester12121
Author

Ok, doing this:

Feature: showing off behave

  @test_id=13
  Scenario: run a simple test on an ENC
     Given we have a new test board as ENC
      When we implement a test
      Then behave will test it for us!

results in

<testsuite name="tutorial.showing off behave" tests="1" errors="0" failures="0" skipped="0" time="0.000384" timestamp="2023-08-22T08:47:30.168057" hostname="tj-ubuntu">

<testcase classname="tutorial.showing off behave" name="run a simple test on an ENC" status="passed" time="0.000189">
<system-out>
<![CDATA[
@scenario.begin

  @test_id=13
  Scenario: run a simple test on an ENC
    Given we have a new test board as ENC ... passed in 0.000s
    When we implement a test ... passed in 0.000s
    Then behave will test it for us! ... passed in 0.000s

@scenario.end
--------------------------------------------------------------------------------

Captured stdout:
before step
after step
before step
after step
before step
after step

]]>
</system-out>
</testcase>
</testsuite>

Is that what you meant? I can add parsing of this CDATA structure in my Notifier.
Or did you suggest something else?
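
For reference, a minimal sketch of what that parsing in the notifier could look like, using only the standard library; the regex, file path, and the printed output (standing in for the real RMS call) are illustrative:

# Sketch: pair each testcase's "@test_id=NN" tag (taken from the
# <system-out> CDATA) with its pass/fail status from behave's JUnit XML.
import re
import xml.etree.ElementTree as ET

TAG_PATTERN = re.compile(r"@test_id=(\w+)")

def collect_results(xml_path):
    tree = ET.parse(xml_path)
    for testcase in tree.getroot().iter("testcase"):
        match = TAG_PATTERN.search(testcase.findtext("system-out") or "")
        if match:
            # behave sets status="passed"/"failed" on <testcase>;
            # a <failure> child element is an equivalent failure signal.
            status = testcase.get("status", "unknown")
            yield match.group(1), status

for test_id, status in collect_results("reports/TESTS-tutorial.xml"):
    print(test_id, status)  # replace with the actual RMS notification call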

@jenisys jenisys closed this as completed Aug 22, 2023