
Junit XML reporter #5

Closed
philsquared opened this issue Nov 19, 2010 · 76 comments
@philsquared
Collaborator

It should be possible to obtain XML reports that follow the JUnit schema, for consumption by third-party tools.

@philsquared
Collaborator Author

A first cut of this has been committed, but it's not fully tested yet, and more examples of real JUnit output would be useful.

@wichert
Contributor

wichert commented Apr 24, 2011

Jenkins does not like the current output:

ERROR: Publisher hudson.tasks.junit.JUnitResultArchiver aborted due to exception
java.lang.NullPointerException
    at hudson.tasks.junit.CaseResult.getPackageName(CaseResult.java:266)
    at hudson.tasks.junit.TestResult.tally(TestResult.java:500)
    at hudson.tasks.junit.JUnitParser$ParseResultCallable.invoke(JUnitParser.java:115)
    at hudson.tasks.junit.JUnitParser$ParseResultCallable.invoke(JUnitParser.java:87)
    at hudson.FilePath.act(FilePath.java:757)
    at hudson.FilePath.act(FilePath.java:739)
    at hudson.tasks.junit.JUnitParser.parse(JUnitParser.java:83)
    at hudson.tasks.junit.JUnitResultArchiver.parse(JUnitResultArchiver.java:123)
    at hudson.tasks.junit.JUnitResultArchiver.perform(JUnitResultArchiver.java:135)
    at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:19)
    at hudson.model.AbstractBuild$AbstractRunner.perform(AbstractBuild.java:649)
    at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:625)
    at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:603)
    at hudson.model.Build$RunnerImpl.post2(Build.java:161)
    at hudson.model.AbstractBuild$AbstractRunner.post(AbstractBuild.java:572)
    at hudson.model.Run.run(Run.java:1386)
    at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
    at hudson.model.ResourceController.execute(ResourceController.java:88)
    at hudson.model.Executor.run(Executor.java:145)
Finished: FAILURE

Here is an example of valid JUnit output as generated by nose:

<?xml version="1.0" encoding="UTF-8"?>
<testsuite name="nosetests" tests="175" errors="0" failures="0" skip="0">
  <testcase classname="pyrad.tests.testBidict.BiDictTests" name="testBackwardAccess" time="0"/>
  <testcase classname="pyrad.tests.testBidict.BiDictTests" name="testBackwardDeletion" time="0"/>
  <testcase classname="pyrad.tests.testBidict.BiDictTests" name="testDeletion" time="0"/>
  <testcase classname="pyrad.tests.testBidict.BiDictTests" name="testForwardAccess" time="0"/>
  <testcase classname="pyrad.tests.testBidict.BiDictTests" name="testItemAccessor" time="0"/>
  <testcase classname="pyrad.tests.testBidict.BiDictTests" name="testLength" time="0"/>
  <testcase classname="pyrad.tests.testBidict.BiDictTests" name="testStartEmpty" time="0"/>
  <testcase classname="pyrad.tests.testClient.ConstructionTests" name="testNamedParameters" time="0"/>
  <testcase classname="pyrad.tests.testClient.ConstructionTests" name="testParameterOrder" time="0"/>
  <testcase classname="pyrad.tests.testClient.ConstructionTests" name="testSimpleConstruction" time="0"/>
  <testcase classname="pyrad.tests.testClient.OtherTests" name="testCreateAcctPacket" time="0"/>
  <testcase classname="pyrad.tests.testClient.OtherTests" name="testCreateAuthPacket" time="0"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testAuthDelay" time="2"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testBind" time="0"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testBindClosesSocket" time="0"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testDoubleAccountDelay" time="3"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testDoubleRetry" time="0"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testIgnorePacketError" time="1"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testInvalidReply" time="1"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testNoRetries" time="0"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testReopen" time="0"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testSendPacket" time="0"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testSingleAccountDelay" time="2"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testSingleRetry" time="0"/>
  <testcase classname="pyrad.tests.testClient.SocketTests" name="testValidReply" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.AttributeTests" name="testConstructionParameters" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.AttributeTests" name="testInvalidDataType" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.AttributeTests" name="testNamedConstructionParameters" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.AttributeTests" name="testValues" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryInterfaceTests" name="testContainment" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryInterfaceTests" name="testEmptyDictionary" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryInterfaceTests" name="testReadonlyContainer" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testAttributeEncryptionError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testAttributeOptions" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testAttributeTooFewColumnsError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testAttributeUnknownTypeError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testAttributeUnknownVendorError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testBeginVendorParsing" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testBeginVendorTooFewColumns" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testBeginVendorUnknownVendor" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testDictFileParseError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testDictFilePostParse" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testEndVendorParsing" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testEndVendorUnbalanced" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testEndVendorUnknownVendor" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testInclude" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testIntegerValueParsing" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testParseEmptyDictionary" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testParseMultipleDictionaries" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testParseSimpleDictionary" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testStringValueParsing" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testValueForUnknownAttributeError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testValueTooFewColumnsError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testVenderTooFewColumnsError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testVendorFormatError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testVendorFormatSyntaxError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testVendorOptionError" time="0"/>
  <testcase classname="pyrad.tests.testDictionary.DictionaryParsingTests" name="testVendorParsing" time="0"/>
  <testcase classname="pyrad.tests.testHost.ConstructionTests" name="testNamedParameters" time="0"/>
  <testcase classname="pyrad.tests.testHost.ConstructionTests" name="testParameterOrder" time="0"/>
  <testcase classname="pyrad.tests.testHost.ConstructionTests" name="testSimpleConstruction" time="0"/>
  <testcase classname="pyrad.tests.testHost.PacketCreationTests" name="testCreateAcctPacket" time="0"/>
  <testcase classname="pyrad.tests.testHost.PacketCreationTests" name="testCreateAuthPacket" time="0"/>
  <testcase classname="pyrad.tests.testHost.PacketCreationTests" name="testCreatePacket" time="0"/>
  <testcase classname="pyrad.tests.testHost.PacketSendTest" name="testSendPacket" time="0"/>
  <testcase classname="pyrad.tests.testHost.PacketSendTest" name="testSendReplyPacket" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketConstructionTests" name="testBasicConstructor" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketConstructionTests" name="testConstructWithDictionary" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketConstructionTests" name="testConstructorDefaults" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketConstructionTests" name="testConstructorIgnoredParameters" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketConstructionTests" name="testConstructorRawPacket" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketConstructionTests" name="testConstructorWithAttributes" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketConstructionTests" name="testNamedConstructor" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketTests" name="testCreateReply" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketTests" name="testRequestPacket" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketTests" name="testRequestPacketSetsId" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AcctPacketTests" name="testVerifyAcctRequest" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketConstructionTests" name="testBasicConstructor" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketConstructionTests" name="testConstructWithDictionary" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketConstructionTests" name="testConstructorDefaults" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketConstructionTests" name="testConstructorIgnoredParameters" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketConstructionTests" name="testConstructorWithAttributes" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketConstructionTests" name="testNamedConstructor" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketTests" name="testCreateReply" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketTests" name="testPwCryptEmptyPassword" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketTests" name="testPwCryptPassword" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketTests" name="testPwCryptSetsAuthenticator" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketTests" name="testPwDecryptEmptyPassword" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketTests" name="testPwDecryptPassword" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketTests" name="testRequestPacket" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketTests" name="testRequestPacketCreatesAuthenticator" time="0"/>
  <testcase classname="pyrad.tests.testPacket.AuthPacketTests" name="testRequestPacketCreatesID" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketConstructionTests" name="testBasicConstructor" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketConstructionTests" name="testConstructWithDictionary" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketConstructionTests" name="testConstructorIgnoredParameters" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketConstructionTests" name="testConstructorWithAttributes" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketConstructionTests" name="testNamedConstructor" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testAddAttribute" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testAttributeAccess" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testAttributeValueAccess" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testCreateAuthenticator" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testCreateReply" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithAttribute" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithBadAttribute" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithEmptyAttribute" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithEmptyPacket" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithInvalidLength" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithMultiValuedAttribute" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithPartialAttributes" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithTooBigPacket" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithTwoAttributes" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithVendorAttribute" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDecodePacketWithoutAttributes" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testDelItem" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testEncodeKey" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testEncodeKeyValues" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testGenerateID" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testHasKey" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testHasKeyWithUnknownKey" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testKeys" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testPktDecodeVendorAttribute" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testPktEncodeAttribute" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testPktEncodeAttributes" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testRawAttributeAccess" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testReplyPacket" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testVendorAttributeAccess" time="0"/>
  <testcase classname="pyrad.tests.testPacket.PacketTests" name="testVerifyReply" time="0"/>
  <testcase classname="pyrad.tests.testPacket.UtilityTests" name="testGenerateID" time="0"/>
  <testcase classname="pyrad.tests.testProxy.OtherTests" name="testProcessInput" time="0"/>
  <testcase classname="pyrad.tests.testProxy.OtherTests" name="testProcessInputNonProxyPort" time="0"/>
  <testcase classname="pyrad.tests.testProxy.ProxyPacketHandlingTests" name="testHHandleProxyPacketHandlesWrongPacket" time="0"/>
  <testcase classname="pyrad.tests.testProxy.ProxyPacketHandlingTests" name="testHandleProxyPacketSetsSecret" time="0"/>
  <testcase classname="pyrad.tests.testProxy.ProxyPacketHandlingTests" name="testHandleProxyPacketUnknownHost" time="0"/>
  <testcase classname="pyrad.tests.testProxy.SocketTests" name="testProxyFd" time="0"/>
  <testcase classname="pyrad.tests.testServer.AcctPacketHandlingTests" name="testHandleAcctPacket" time="0"/>
  <testcase classname="pyrad.tests.testServer.AcctPacketHandlingTests" name="testHandleAcctPacketUnknownHost" time="0"/>
  <testcase classname="pyrad.tests.testServer.AcctPacketHandlingTests" name="testHandleAcctPacketWrongPort" time="0"/>
  <testcase classname="pyrad.tests.testServer.AuthPacketHandlingTests" name="testHandleAuthPacket" time="0"/>
  <testcase classname="pyrad.tests.testServer.AuthPacketHandlingTests" name="testHandleAuthPacketUnknownHost" time="0"/>
  <testcase classname="pyrad.tests.testServer.AuthPacketHandlingTests" name="testHandleAuthPacketWrongPort" time="0"/>
  <testcase classname="pyrad.tests.testServer.OtherTests" name="testAcctProcessInput" time="0"/>
  <testcase classname="pyrad.tests.testServer.OtherTests" name="testAuthProcessInput" time="0"/>
  <testcase classname="pyrad.tests.testServer.OtherTests" name="testCreateReplyPacket" time="0"/>
  <testcase classname="pyrad.tests.testServer.RemoteHostTests" name="testNamedConstruction" time="0"/>
  <testcase classname="pyrad.tests.testServer.RemoteHostTests" name="testSimpleConstruction" time="0"/>
  <testcase classname="pyrad.tests.testServer.ServerConstructiontests" name="testBindDuringConstruction" time="0"/>
  <testcase classname="pyrad.tests.testServer.ServerConstructiontests" name="testParameterOrder" time="0"/>
  <testcase classname="pyrad.tests.testServer.ServerConstructiontests" name="testSimpleConstruction" time="0"/>
  <testcase classname="pyrad.tests.testServer.ServerRunTests" name="testRunIgnoresPacketErrors" time="0"/>
  <testcase classname="pyrad.tests.testServer.ServerRunTests" name="testRunIgnoresPollErrors" time="0"/>
  <testcase classname="pyrad.tests.testServer.ServerRunTests" name="testRunIgnoresServerPacketErrors" time="0"/>
  <testcase classname="pyrad.tests.testServer.ServerRunTests" name="testRunInitializes" time="0"/>
  <testcase classname="pyrad.tests.testServer.ServerRunTests" name="testRunRunsProcessInput" time="0"/>
  <testcase classname="pyrad.tests.testServer.SocketTests" name="testBind" time="0"/>
  <testcase classname="pyrad.tests.testServer.SocketTests" name="testGrabPacket" time="0"/>
  <testcase classname="pyrad.tests.testServer.SocketTests" name="testPrepareSocketAcctFds" time="0"/>
  <testcase classname="pyrad.tests.testServer.SocketTests" name="testPrepareSocketAuthFds" time="0"/>
  <testcase classname="pyrad.tests.testServer.SocketTests" name="testPrepareSocketNoFds" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testAddressDecoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testAddressEncoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testDateDecoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testDateEncoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testDecodeFunction" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testEncodeFunction" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testIntegerDecoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testIntegerEncoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testInvalidAddressEncodingRaisesTypeError" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testInvalidDataEncodingRaisesTypeError" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testInvalidIntegerEncodingRaisesTypeError" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testInvalidStringEncodingRaisesTypeError" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testStringDecoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testStringEncoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testUnknownTypeDecoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testUnknownTypeEncoding" time="0"/>
  <testcase classname="pyrad.tests.testTools.EncodingTests" name="testUnsignedIntegerEncoding" time="0"/>
</testsuite>
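The stack trace above fails inside CaseResult.getPackageName, which suggests Hudson derives a package from the classname attribute; note that the nose output uses dotted, package-style classnames throughout. A minimal sketch of emitting one such element (illustrative only, not Catch's reporter code; the helper name is made up):

```cpp
#include <sstream>
#include <string>

// Hypothetical helper: emit a single JUnit <testcase> element with a
// dotted, package-style classname, matching the nose output above.
// Real reporter code would also need to XML-escape attribute values.
std::string makeTestcaseXml(const std::string& classname,
                            const std::string& name,
                            double timeSeconds) {
    std::ostringstream os;
    os << "<testcase classname=\"" << classname
       << "\" name=\"" << name
       << "\" time=\"" << timeSeconds << "\"/>";
    return os.str();
}
```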

@wichert
Contributor

wichert commented Apr 24, 2011

This Stack Overflow discussion might have useful information.

@philsquared
Collaborator Author

Thanks @wichert. I'm going to have to have another look at this.
I did see that Stack Overflow question (and all that it links to) when I was originally looking into it.
I think the only way I'm going to get this working is to install an instance of Jenkins and/or Hudson myself.

@wichert
Contributor

wichert commented Apr 26, 2011

If you need another example: I put some zope testrunner output online as well. One thing to note is that testrunner creates a separate file for each source file containing tests, so you end up with lots of XML files.

@wichert
Contributor

wichert commented Sep 21, 2011

Can you provide a status update for this ticket?

@philsquared
Collaborator Author

I'm really sorry, Wichert. I managed to drop this somewhere along the line.
I'm pretty snowed at the moment so I don't know when I will be able to get back to it.
Will do my best.
If you fancy looking into it yourself you'd be very welcome, of course :-)

@wichert
Contributor

wichert commented Sep 22, 2011

I'll give it a try. I was still using an older version of Catch, so I'll have to test upgrading first. The first thing I see is lots of new warnings produced by Catch and a compile error, so it doesn't appear to be a trivial upgrade.

@wichert
Contributor

wichert commented Sep 23, 2011

I think we need to think a bit about how to structure the output. As an example, let's assume tests that are set up like this:

TEST_CASE("ClassA", "Short description of class A") {
    SECTION("methodOne", "Tests for method one") {
        SECTION("situation-1", "What happens if XYZ") {
            REQUIRE(...);
            REQUIRE(...);
            REQUIRE(...);
        }
        SECTION("situation-2", "What happens if XYZ") {
            REQUIRE(...);
            REQUIRE(...);
            REQUIRE(...);
        }
    }

    SECTION("methodTwo", "Tests for method two") {
        SECTION("situation-1", "What happens if XYZ") {
            REQUIRE(...);
            REQUIRE(...);
            REQUIRE(...);
        }
        SECTION("situation-2", "What happens if XYZ") {
            REQUIRE(...);
            REQUIRE(...);
            REQUIRE(...);
        }
    }
}

TEST_CASE("ClassB", "Short description of class B") {
    SECTION("methodOne", "Tests for method one") {
        // Repeat similar structure as for ClassA
    }
}

When reporting results for these tests in JUnit format we run into one problem: Catch supports arbitrary nesting of sections, while JUnit only supports two levels (testsuite -> testcase). I suggest that the simplest thing to do here is to take only the two top levels from Catch (TEST_CASE and top-level SECTION). That results in this JUnit structure:

<testsuites>
  <testsuite name="ClassA">
    <testcase name="Tests for method one" classname="methodOne" />
    <testcase name="Tests for method two" classname="methodTwo" />
  </testsuite>
  <testsuite name="ClassB">
    <testcase name="Tests for method one" classname="methodOne" />
    <testcase name="Tests for method two" classname="methodTwo" />
  </testsuite>
</testsuites>

(ignoring the mandatory attributes such as time, tests, failures, etc.). Looking at the data passed to a Catch reporter, this does not match up correctly: the top level there is a group, but I never see more than one group being generated. I'm not sure whether that is because I don't know how to create groups, because groups are a not-fully-implemented feature in Catch, or because of a design flaw (an unused group level).
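The two-level truncation suggested above amounts to something like this sketch (a hypothetical helper, assuming the full nesting path is available as a list of names):

```cpp
#include <string>
#include <utility>
#include <vector>

// Hypothetical helper: keep only the two top levels of a Catch nesting
// path. The TEST_CASE name becomes the JUnit testsuite and the first
// SECTION name becomes the testcase; deeper sections are dropped.
std::pair<std::string, std::string>
toJUnitLevels(const std::vector<std::string>& path) {
    std::string suite = path.empty() ? std::string() : path[0];
    std::string testcase = path.size() > 1 ? path[1] : suite;
    return std::make_pair(suite, testcase);
}
```

So {"ClassA", "methodOne", "situation-1"} and {"ClassA", "methodOne", "situation-2"} would both fall under testsuite "ClassA", testcase "methodOne".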

Can you provide some guidance on how I should proceed with this?

@wichert
Contributor

wichert commented Nov 10, 2011

Hi Phil. Can you spare a few minutes to give me a few tips so I can try to fix this?

@wichert
Contributor

wichert commented Mar 13, 2012

Hi Phil. Is this still on your radar?

@philsquared
Collaborator Author

Sorry Wichert, I've still not really caught up.
Hoping to do a big push soon…


@wichert
Contributor

wichert commented Aug 28, 2012

Hi Phil. I'm still willing to poke at this, but need some input from you. Is this issue still on your radar?

@philsquared
Collaborator Author

Hey Wichert,

Thanks for your patience. I'm really sorry I've not been getting back to this - and thanks for your help.
This has not dropped off my radar - but I'm trying to get through some work that will impact the reporter interface before I do too much with the JUnit reporter. I'm aiming to get that done in the next week or so.

As for the question you raised before (sorry I didn't respond sooner - inbox syndrome):

The way I have been approaching it is that each isolated test case run (one for each leaf section) is a JUnit "test". The name can be constructed from the test case name + section name(s) (I need to formalise that naming scheme too).

JUnit test suites are then each set of tests that are matched by the filters provided by a single -t switch on the command line (if no -t is provided then there is just a single suite). Bear in mind that -t can have multiple filters. So, e.g:

-t abc/def* ghi/jkl* -t random/1 random/2

Gives us two suites - one named, "abc/def* ghi/jkl*", and one named, "random/1 random/2".
I have support for naming suites coming too - and this is all being captured in the TestCaseFilters class and will be passed on to the reporter (it's not, yet).
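As an illustration of that naming idea (the scheme is not yet formalised, so this is only a sketch with a hypothetical helper): the JUnit test name for one leaf-section run would be the test case name followed by the section names, joined with '/':

```cpp
#include <cstddef>
#include <string>
#include <vector>

// Hypothetical helper: build the JUnit test name for one leaf-section
// run from the test case name plus the enclosing section names.
std::string junitTestName(const std::vector<std::string>& parts) {
    std::string name;
    for (std::size_t i = 0; i < parts.size(); ++i) {
        if (i != 0)
            name += '/';
        name += parts[i];
    }
    return name;
}
```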

Does that make sense?

@wichert
Contributor

wichert commented Aug 30, 2012

This matches the naming scheme, which I admit I have never understood in Catch. I always struggle with two things:

  • why should I have to provide a description for test cases and sections? I already use descriptive names, so the description is often the exact same string. Having to provide it is just annoying.
  • how do test case names map to the command-line option for the test runner? I would expect that if I have a test case named "foo" with a nested test case named "bar", which contains a section named "buz", I could use options like -t foo, -t foo/bar, or -t foo/bar/buz. That never worked, at which point I gave up trying to use the -t option.

That aside, there is a problem with JUnit XML support: Catch supports nested test cases, which does not map onto the JUnit scheme, which only supports three levels: package, class, and test. Trying to force Catch's structure into that, I would expect something like this:

  • each top level test case is a package
  • each second level test case is a class. If no second level is used insert a dummy level?
  • each section is a test

or alternatively:

  • each source file is a package
  • each top level test case is a class
  • each section is a test

The latter matches Python (and, I'm guessing, Java) better, but I am not sure if you can get the source filename in a useful way.
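On the filename question: Catch's assertion macros already capture __FILE__, so a package name could plausibly be derived from the source path. A sketch (the helper itself is hypothetical):

```cpp
#include <cstddef>
#include <string>

// Hypothetical helper: derive a JUnit "package" name from a source file
// path by stripping directories and the extension.
std::string packageFromFile(const std::string& file) {
    std::size_t slash = file.find_last_of("/\\");
    std::string base =
        (slash == std::string::npos) ? file : file.substr(slash + 1);
    std::size_t dot = base.find_last_of('.');
    return (dot == std::string::npos) ? base : base.substr(0, dot);
}
```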

@philsquared
Collaborator Author

The idea is that the test name is a short, hierarchical, name - something like: "stuff/sub stuff/details" - so all "stuff" tests can be grouped, and all "stuff/sub stuff" tests can be grouped.
You are not forced to work that way (and I don't always) but it's a useful convention.
If you do that then it's nice to be able to supply a more detailed description string alongside it too. But I have to admit I use the description string a lot less than I expected to - a lot of the time I leave it as "" (you don't have to put anything in the string - but, due to C++98 not having variadic macros, you must provide at least the empty string).

Same with sections, although the hierarchical part is less common as they are already in a hierarchy - so I use descriptions even less there.

If you have used hierarchical test case naming (ignore sections for a moment) then you might have tests like:

"a/b/c"
"a/b/d"
"a/e/f"
"g"

Now you can run the first three with "a*", the first two with "a/b*", or just the first one with "a/b/c".

The idea has always been that this ability would seamlessly extend to sections too - so a section, "1" in the first test case, could be run with "a/b/c/1".
But I have not yet implemented that - and there are some subtleties to it that make me wonder if it would work that way.

Nonetheless, the ability to create groups on-the-fly using wildcards (especially with hierarchical naming) has been useful - and has got even more useful just recently as I have added prefix wildcards (*foo*) and exclusions (exclude:foo - or ~foo) - which can also be mixed (~*foo*). Additionally you can supply a series of these filters and the group will be the union of the inclusions, less the union of the exclusions.

So a command line like:

-t foo/bar* -t a* ~*b*

Would run two groups. The first is "foo/bar*" and will match anything that starts with foo/bar.
The second is "a* ~*b*" and will match anything that starts with a, except anything that contains b.
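The matching rules described here can be sketched as follows (a simplified illustration of the semantics, not Catch's actual implementation):

```cpp
#include <cstddef>
#include <string>
#include <vector>

// Simplified sketch of the filter semantics described above: a leading
// '*' matches any prefix, a trailing '*' matches any suffix, and "*x*"
// matches any name containing x.
bool matchesFilter(const std::string& name, std::string pattern) {
    bool wildBefore = !pattern.empty() && pattern[0] == '*';
    bool wildAfter = !pattern.empty() && pattern[pattern.size() - 1] == '*';
    if (wildBefore)
        pattern.erase(0, 1);
    if (wildAfter && !pattern.empty())
        pattern.erase(pattern.size() - 1);
    if (wildBefore && wildAfter)
        return name.find(pattern) != std::string::npos;
    if (wildAfter) // prefix match, e.g. "foo/bar*"
        return name.compare(0, pattern.size(), pattern) == 0;
    if (wildBefore) // suffix match, e.g. "*foo"
        return name.size() >= pattern.size() &&
               name.compare(name.size() - pattern.size(),
                            pattern.size(), pattern) == 0;
    return name == pattern;
}

// A group is the union of its inclusions, less the union of its
// exclusions (filters prefixed with '~').
bool inGroup(const std::string& name,
             const std::vector<std::string>& filters) {
    bool included = false;
    for (std::size_t i = 0; i < filters.size(); ++i) {
        const std::string& f = filters[i];
        if (!f.empty() && f[0] == '~') {
            if (matchesFilter(name, f.substr(1)))
                return false;
        } else if (matchesFilter(name, f)) {
            included = true;
        }
    }
    return included;
}
```

Under these rules, the second group above ("a* ~*b*") would include a name like "acd" but exclude "abc".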

Two features that are coming - named groups and tags (non-hierarchical matching) - should make this approach more powerful.

So my scheme was to map groups to JUnit suites.

@wichert
Contributor

wichert commented Sep 1, 2012

What I am struggling with is the need to prefix things manually. To illustrate, this is how I usually structure my tests:

TEST_CASE("MyClass") {
    TEST_CASE("SomeMethod") {
        SECTION("situation-one") {
            ....
        }
        SECTION("situation-two") {
            ....
        }
    }
}

I would expect to be able to tell Catch to run the tests for MyClass using "-t MyClass", or just the tests for MyClass::SomeMethod using "-t MyClass/SomeMethod", or even a specific situation using "-t MyClass/SomeMethod/situation-one". I do not see why I should be forced to manually repeat the hierarchy in the test case name, as Catch currently forces me to do:

TEST_CASE("MyClass") {
    TEST_CASE("MyClass/SomeMethod") {
        SECTION("MyClass/SomeMethod/situation-one") {
            ....
        }
        SECTION("MyClass/SomeMethod/situation-two") {
            ....
        }
    }
}

@philsquared
Collaborator Author

I'm not sure where the outer TEST_CASE is coming from.
TEST_CASEs can appear at only one level (they are implemented as free functions).
SECTIONs may be arbitrarily nested within TEST_CASEs (they are implemented as if statements with scoped objects).

It is true that test cases which logically belong together require a common prefix for the hierarchical matching to work. But it's precisely that prefixing that logically groups them together.

Sections do not require prefixes, however. So I believe your example would be written something like:

TEST_CASE("MyClass/SomeMethod", "") {
    SECTION("situation-one", "") {
        ....
    }
    SECTION("situation-two", "") {
        ....
    }
}

Which is not so bad.

Now you can run all MyClass tests with:

-t MyClass*

And everything in that first test case with:

-t MyClass/SomeMethod

Unfortunately, at time of writing, you cannot selectively run Sections within a test case. There are some issues around this that make it not straightforward, but I believe I should be able to get it working to some approximation (the main issue is that discovery of sections only occurs as the test case is running).

Does that clarify anything at all? Is it the section selection that you are particularly missing?

@wichert
Copy link
Contributor

wichert commented Sep 3, 2012

I misremembered my code - I was nesting SECTIONs instead of TEST_CASEs. That makes my code look like this:

TEST_CASE("MyClass") {
     SECTION("SomeMethod") {
         SECTION("situation-one") {
             ....
         }
         SECTION("situation-two") {
             ....
         }
     }
}

I still do not see why I need to repeat a prefix in the section name when I already create an explicit nesting level in the code. Is there no way to avoid that?

@wichert
Copy link
Contributor

wichert commented Sep 3, 2012

Section selection is certainly something that I am missing. Especially when debugging, I often want to run a single (leaf) section so I can test for unintended side-effects and set breakpoints easily, without having to worry about other tests triggering them.

@TypicalFooBar
Copy link

Hey, I just wanted to chime in here on this conversation of two years :)

I experienced the same error in Jenkins that you were experiencing, Wichert: the NullPointerException caused by the output format of the JUnit reporter.

I was able to figure out why Jenkins is crashing, and I have an ugly workaround that lets Jenkins continue to run, but it doesn't solve this issue.

Currently, in the JunitReporter class, there is a function that looks like this:

void OutputTestCases( XmlWriter& xml, const Stats& stats ) {
            std::vector<TestCaseStats>::const_iterator it = stats.m_testCaseStats.begin();
            std::vector<TestCaseStats>::const_iterator itEnd = stats.m_testCaseStats.end();
            for(; it != itEnd; ++it ) {
                xml.writeBlankLine();
                xml.writeComment( "Test case" );

                XmlWriter::ScopedElement e = xml.scopedElement( "testcase" );
                xml.writeAttribute( "classname", it->m_className );
                xml.writeAttribute( "name", it->m_name );
                xml.writeAttribute( "time", "tbd" );

                OutputTestResult( xml, *it );
            }
        }

Using the above, the output from the JunitReporter looks like the following:

<testsuites>
  <testsuite errors="0" failures="0" tests="1" hostname="tbd" time="tbd" timestamp="tbd">

    <!--Test case-->
    <testcase name="SimpleTest" time="tbd"/>
  </testsuite>
  <system-out/>
  <system-err/>
</testsuites>

After some trial and error, I found that what was making Jenkins crash was the fact that the testcase element did NOT have the classname attribute. This is the line of code in the OutputTestCases() function of the JunitReporter class that causes the trouble:

xml.writeAttribute( "classname", it->m_className );

In my case, this line was not actually printing the classname. Perhaps it->m_className is null? When I switched out that line for a random string of "foo", it printed. The modified OutputTestCases() function looks like this:

void OutputTestCases( XmlWriter& xml, const Stats& stats ) {
            std::vector<TestCaseStats>::const_iterator it = stats.m_testCaseStats.begin();
            std::vector<TestCaseStats>::const_iterator itEnd = stats.m_testCaseStats.end();
            for(; it != itEnd; ++it ) {
                xml.writeBlankLine();
                xml.writeComment( "Test case" );

                XmlWriter::ScopedElement e = xml.scopedElement( "testcase" );
                xml.writeAttribute( "classname", "foo" );
                xml.writeAttribute( "name", it->m_name );
                xml.writeAttribute( "time", "tbd" );

                OutputTestResult( xml, *it );
            }
        }

With the above change the JunitReporter output looks like this:

<testsuites>
  <testsuite errors="0" failures="0" tests="1" hostname="tbd" time="tbd" timestamp="tbd">

    <!--Test case-->
    <testcase classname="foo" name="SimpleTest" time="tbd"/>
  </testsuite>
  <system-out/>
  <system-err/>
</testsuites>

Jenkins seems to be okay with this, though it displays the SimpleTest result as if it were a part of the class foo, which is the ugly part I was talking about (but at least it didn't crash!).

I hope this helps you debug the JunitReporter. Great work on Catch Phil! It is VERY easy to use, which I really appreciate! :)

@philsquared
Copy link
Collaborator Author

I hadn't realised the missing class name attribute was the only thing (or, at least, the key thing) stopping this from working! (I've still not had a chance to get a Jenkins/ Hudson set-up going to try it for myself).

I've plumbed that attribute in now (it was always using an empty string before).
For test cases that are actually based on methods it will use the class name. For free standing test cases it just uses the string, "global". I'm open to suggestions on making that more useful. But, from the sounds of it, that should unblock you from needing your workaround and at least be a smoother experience.
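The fallback described above amounts to something like the following sketch (hypothetical helper name; not the actual Catch code): method-based test cases report their class name, while free-standing test cases get "global" so the attribute is never empty - the emptiness being what tripped up Jenkins.

```cpp
#include <string>

// Ensure the JUnit "classname" attribute is never empty: use the real
// class name when the test case is based on a method, otherwise fall
// back to the pseudo-class "global" for free-standing test cases.
inline std::string junitClassName( const std::string& className ) {
    return className.empty() ? std::string( "global" ) : className;
}
```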

I realise there are some other attributes that are still set to "tbd". I don't know how important they are to getting this working too.

I've committed those changes to the Integration branch. I'd appreciate it if you (both) could let me know how that works for you.

@philsquared
Copy link
Collaborator Author

@wichert I noticed, while replying above, that our previous discussion never fully concluded. But I was a bit confused about this:

"I still do not see why I need to repeat a prefix in the section name when I already create an explicit nesting level in the code. Is there no way to avoid that?"

I don't see any repeated prefix in the last code you posted.

@wichert
Copy link
Contributor

wichert commented Nov 5, 2012

@philsquared are you mixing up this discussion and another issue where we talked about how -s behaves?

@philsquared
Copy link
Collaborator Author

No, it was part of this thread:

Hard link here: #5 (comment)

@wichert
Copy link
Contributor

wichert commented Nov 5, 2012

I have to admit I don't remember exactly what my thoughts were at the time. Some current thoughts:

  • there is a prefix involved here: you use MyClass as a prefix (through the MyClass* glob).
  • being able to run individual sections within a test case is extremely useful
  • this should also work if those sections are nested
  • perhaps filtering on sections should be a separate command-line option (Python's zope.testrunner uses the -t parameter to do that; nosetests uses --tests). I'm not sure how that would work with nested sections; you may need to define a separating character and advise people not to use it in test names

@TypicalFooBar
Copy link

Phil,

I just used the catch.hpp file in the Integration branch and Jenkins was quite happy with the output! I tested it with different successful and unsuccessful test cases, and it seems to be working just fine.

Thank you for taking time to fix this problem! The other tbd values still need work, but at least Jenkins can read and display the output from the JunitReporter.

Thanks again for your help, and again great job on Catch!

@philsquared
Copy link
Collaborator Author

That's great! Thanks for letting me know, Derek.
At this point, since we have at least seen it running now, and due to this thread going on a bit - mostly on peripheral issues - I'm going to close this issue.

@wichert - if it still doesn't work at all for you please reopen.

For any other JUnit-related issues, please raise a new issue.

@wichert
Copy link
Contributor

wichert commented Nov 22, 2012

Sorry about that. I've adjusted the permissions so you should be able to access it now.

@SebDyn
Copy link

SebDyn commented Dec 14, 2012

Hello Phil,
For me, the current implementation works perfectly! Thanks for the effort!
Best regards
Sebastian

@SebDyn
Copy link

SebDyn commented Dec 17, 2012

May I add another point to my wish list again?

Test failures may lead to several results:

  • some test cases fail
  • the program unexpectedly terminates

The first case works rather well now. To catch the second case, we introduced a script that checks the exit code: the Main() routine of CATCH was wrapped such that a zero exit code is returned even if test cases failed, which is different from what happens if an uncaught exception or even a segmentation fault occurs. The script scans the exit code and, for crashes, generates its own XML for JUnit/Jenkins containing the error information on 'calling the executable'.

With the new strategy of capturing the std output, however, I cannot track the output in case of a segfault, since it is swallowed by CATCH. I do not know if it is possible to 'move' the std output if no segfault happens and to 'copy' it if a segfault happens. If this is not possible, maybe you could simply add an option to send output to both the XML and the console? It would also be helpful if the console output separated the std output of the individual test cases and sections somehow (e.g. "Now starting test case 'xy/z'").

Best regards
Sebastian

@SebDyn
Copy link

SebDyn commented Feb 11, 2013

Hi Phil!

Today I encountered a small bug in the JUnit reporter: on some occasions (e.g. if CHECK_THROWS fails) invalid XML is written, where the element name is empty - e.g. < message=""> is written instead of <failure message="">. I've seen that in the switch-case block you simply do nothing for certain outcomes, so I changed it to:

                case ResultWas::Unknown:
                case ResultWas::FailureBit:
                case ResultWas::Exception:
                case ResultWas::DidntThrowException:
                    stats.m_element = "failure";
                    break;

Best regards
Sebastian

@philsquared
Copy link
Collaborator Author

Hmm... Sorry @SebDyn, I don't believe I saw your last two comments before (I suspect because of the way my mail client does threading).

For your segfault issue I intend a more comprehensive handling, as mentioned recently in: #160.

For the CHECK_THROWS issue I can confirm that DidntThrowException should have been handled (the others not, as they are flags that are just there to stop warnings).
I've added the equivalent of your fix (on integration) and will push those changes shortly.

philsquared added a commit that referenced this issue Apr 8, 2013
- As mentioned by @SebDyn in GitHub issue #5
@wichert
Copy link
Contributor

wichert commented May 13, 2013

I tested this again today with the current (v0.9 build 38) single include from the integration branch. It works, but the created structure is not optimal:

  • the output has a single top-level package called (root)
  • underneath the (root) package is a single class called global
  • underneath the global class all TEST_CASEs are listed as test names

This completely wastes two levels of test structure which would be very useful to use, and if you use the same test case name in multiple source files (for example TEST_CASE("Constructor")) it is impossible to tell them apart.

I would suggest changing the output as follows:

  • use the filename as the package name. So instead of a single (root) package you would have file1.cc, file2.cc, etc. It would be incredibly nice to have a relative path from the top-level build directory included, but I can imagine that is not possible.
  • move TEST_CASE to the second (class) level.
  • expose the top-level SECTION inside a test case as the test name.
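The suggested mapping onto JUnit's three-level hierarchy could be sketched like this (hypothetical helper names, not actual Catch code):

```cpp
#include <string>

// Hypothetical mapping of Catch's structure onto JUnit's three levels,
// per the suggestion above: package <- source file, classname <- test
// case, name <- section. Falls back to the test case name when there
// is no section, since JUnit has no deeper nesting to express it.
struct JUnitTestId {
    std::string package;   // e.g. "file1.cc"
    std::string classname; // e.g. "MyClass/SomeMethod"
    std::string name;      // e.g. "situation-one"
};

inline JUnitTestId makeJUnitId( const std::string& sourceFile,
                                const std::string& testCase,
                                const std::string& sectionPath ) {
    JUnitTestId id;
    id.package = sourceFile;
    id.classname = testCase;
    id.name = sectionPath.empty() ? testCase : sectionPath;
    return id;
}
```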

@philsquared
Copy link
Collaborator Author

@wichert thanks for the continuing feedback.

I recently (finally!) got Jenkins installed on my laptop (and I'm told it's now installed on the server where I run my CI builds (thanks Paul) - but I've not had a chance to set that up yet).
My first impression was, "wow it's much worse than I thought". Mostly in unfixable ways, due to limitations in the JUnit/Ant format itself. It's a real shame that it has become the de-facto standard - or at least the closest we have to one. Might have to see if we can change that.

For a while I was thinking it's so bad that the feedback is much nicer if you just run the console reporter and capture that!

But I will go back and improve things. As for your specific suggestions:

"use the filename as the package name"

  • sounds entirely sensible. ISTR thinking the same thing myself. Not sure about the path - will look into that.

"move TEST_CASE to the second (class) level"

  • this feels wrong, but in some ways it's appropriate. The "global" pseudo-class feels wrong too, so we're probably better off going for the more useful wrongness ;-) I'll look at that too.

"expose the top level SECTION inside a test case as the test name."

  • seems to follow on - although rather than the "top level SECTION" I'd concatenate the whole section path (much as I do in the console reporter now).

@wichert
Copy link
Contributor

wichert commented May 13, 2013

While junit is the most popular output format it is of course not the only one. Looking at the list of Jenkins plugins there are a bunch of alternatives: Cpptest, Gallio/MbUnit, JavaTest, JSUnit, NUnit and xUnit which supports a whole bunch of formats. Perhaps one of those will be a better fit.

@wichert
Copy link
Contributor

wichert commented May 13, 2013

It looks like all those plugins just convert various format to junit internally, so you are still bound to the three-level package/class/test structure.

@philsquared
Copy link
Collaborator Author

Therein, as they say, lies the rub.
Furthermore we're not just talking about Jenkins. Most CI servers support a range of formats but JUnit/Ant is usually the lowest-common-denominator.
So until someone comes up with a compelling alternative that they can encourage broad adoption of we're stuck with it.
It's not so much the three-tier structure that's the issue.
There's a general, "built for Java" air to it (which is natural, given its origins - it wasn't devised as a general purpose format). It's quite limited in what can be reported when and, perhaps most significantly, the failure counts are given as attributes in top level elements - which means you can't even start writing it until all tests have been run!

I wouldn't hold Catch's own XML format up as a shining paradigm of cross framework/ language support either (but maybe we should do something about that). But it has a few characteristics that are in the right direction already - especially that error counts are given in separate elements and at the end - so the report can be streamed as it is being written to.
But for CI use the richness of information that is captured is more important. Having places to put file/ line info and assertion expansions are especially welcome.
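The streaming limitation described above can be illustrated with a minimal sketch (hypothetical types; XML escaping omitted for brevity): because JUnit puts the failure count in an attribute of the enclosing <testsuite> element, the opening tag cannot be written until every result is known.

```cpp
#include <sstream>
#include <string>
#include <vector>

struct Result { std::string name; bool failed; };

// The entire run must be buffered before the first byte of the
// <testsuite> element can be emitted, since its attributes carry
// aggregate counts. This is what prevents the report being streamed.
inline std::string writeJUnitSuite( const std::vector<Result>& results ) {
    int failures = 0;
    for( const Result& r : results )
        if( r.failed ) ++failures;

    std::ostringstream xml;
    // Only now, after all results are collected, can this be written:
    xml << "<testsuite tests=\"" << results.size()
        << "\" failures=\"" << failures << "\">\n";
    for( const Result& r : results ) {
        xml << "  <testcase name=\"" << r.name << "\"";
        if( r.failed )
            xml << "><failure/></testcase>\n";
        else
            xml << "/>\n";
    }
    xml << "</testsuite>\n";
    return xml.str();
}
```

A format that reports counts in trailing elements, as Catch's own XML does, avoids this buffering entirely.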

@SebDyn
Copy link

SebDyn commented May 14, 2013

Hello,

here are my 5 cents, since I do not fully agree with Wichert's ideas.

I do not recommend to use the source file name for the class name. In my case I only have a single test case per source file…

In my environment I post-process the JUnit XML and replace the class name by the name of the test executable (without path). Each executable collects a set of test cases grouping them semantically in a single executable. So far I could live with "package name=(root)".

Here is my recommendation:
the "top level package" should be configurable (by command line). The same for the "class name". In this case I would use the following ordering:

"top level package" = name of executable
"class name" = either "test_cases" collecting the tests of CATCH
or "exit_code" collecting information on the exit status of the executable, additional info on crashes (e.g. X11's DISPLAY variable or segfaults), timeouts (e.g. maximum allowed time of 20 minutes exceeded), debugger output, etc.

I agree with having the SECTION definitions as part of the test names, eg.:

TEST_CASE("name_1/name_2", "description") {
    CHECK(1 == 0);
    SECTION("section_1")
    {
        CHECK(1 == 0);
    }
}

could result in failed tests
(test_executable_name).test_cases.name_1/name_2
(test_executable_name).test_cases.name_1/name_2/SECTION:section_1

Then an important note on a bug (feature?) in Jenkins: even if multiple tests fail within a single test case, Jenkins only displays ONE of them! As a workaround, all failed tests should be concatenated into a single section.

It would be extremely useful to have the time taken by each test case available in the JUnit output. It would be nice if CATCH detected whether any "boost" time headers were included and used the boost time functions if available:

#include <boost/date_time/posix_time/posix_time.hpp> //  before including catch.hpp

in catch.hpp:

#ifdef POSIX_TIME_HPP__
boost::posix_time::ptime t1,t2;
t1 = boost::posix_time::microsec_clock::universal_time();
…
t2 = boost::posix_time::microsec_clock::universal_time();
std::cout << (t2-t1).total_seconds();
#endif 
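For compilers with C++11 support, the same measurement is possible without Boost - a sketch using std::chrono (illustrative only; not Catch's actual timer):

```cpp
#include <chrono>

// Portable timing with std::chrono, as an alternative to the Boost
// approach suggested above. steady_clock is monotonic, so elapsed
// durations are never negative even if the wall clock is adjusted.
class Timer {
public:
    void start() { m_start = std::chrono::steady_clock::now(); }
    double elapsedSeconds() const {
        std::chrono::duration<double> d =
            std::chrono::steady_clock::now() - m_start;
        return d.count(); // suitable for the JUnit "time" attribute
    }
private:
    std::chrono::steady_clock::time_point m_start;
};
```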

Best regards
Sebastian

PS: I am sorry for not replying, Phil. I use an older revision of CATCH in a production environment and currently I do not want to upgrade.


@martinmoene
Copy link
Collaborator

Comment by SebDyn above reformatted:

Hello,

here are my 5 cents since I do not fully agree with Wichert's ideas.

I do not recommend to use the source file name for the class name. In my case I only have a single test case per source file…

In my environment I post-process the JUnit XML and replace the class name by the name of the test executable (without path). Each executable collects a set of test cases grouping them semantically in a single executable. So far I could live with "package name=(root)".

Here is my recommendation:
the "top level package" should be configurable (by command line). The same for the "class name". In this case I would use the following ordering:

"top level package" = name of executable
"class name" = either "test_cases" collecting the tests of CATCH
or "exit_code" collecting information on the exit status of the executable, additional info on crashes (eg. X11' DISPLAY variable or segfaults), or time out (eg. maximum allowed time of 20 minutes exceeded), or debugger output, etc.

I agree with having the SECTION definitions as part of the test names, eg.:

TEST_CASE("name_1/name_2", "description"){
CHECK(1==0)
SECTION("section_1")
{
CHECK(1==0)
}

could result in failed tests
(test_executable_name).test_cases.name_1/name_2
(test_executable_name).test_cases.name_1/name_2/SECTION:section_1

Then an important note on a bug (feature?) in Jenkins: Even if multiple tests failed within a single test case, Jenkins only displays ONE of them! As a workaround all failed tests should be concatenated into a single section.

It would be extremely useful to have the times used by a test case available in the JUnit output. It would be nice if CATCH detects if any "boost" time headers were included and use the boost time functions if available

#include <boost/date_time/posix_time/posix_time.hpp> //  before including catch.hpp

in catch.hpp:

#ifdef POSIX_TIME_HPP__
boost::posix_time::ptime t1,t2;
t1 = boost::posix_time::microsec_clock::universal_time();
…
t2 = boost::posix_time::microsec_clock::universal_time();
std::cout << (t2-t1).total_seconds();
#endif 

Best regards
Sebastian

PS: I am sorry for not replying, Phil. I use an older revision of CATCH in a production environment and currently I do not want to upgrade.

On May 14, 2013, at 8:46 AM, Phil Nash wrote:

Therein, as they say, lies the rub.
Furthermore we're not just talking about Jenkins. Most CI servers support a range of formats but JUnit/Ant is usually the lowest-common-denominator.
So until someone comes up with a compelling alternative that they can encourage broad adoption of we're stuck with it.
It's not so much the three-tier structure that's the issue.
There's a general, "built for Java" air to it (which is natural, given its origins - it wasn't devised as a general purpose format). It's quite limited in what can be reported when and, perhaps most significantly, the failure counts are given as attributes in top level elements - which means you can't even start writing it until all tests have been run!

I wouldn't hold Catch's own XML format up as a shining paradigm of cross framework/ language support either (but maybe we should do something about that). But it has a few characteristics that are in the right direction already - especially that error counts are given in separate elements and at the end - so the report can be streamed as it is being written to.
But for CI use the richness of information that is captured is more important. Having places to put file/ line info and assertion expansions are especially welcome.


Reply to this email directly or view it on GitHub.

@wichert
Copy link
Contributor

wichert commented May 14, 2013

While I see some of the points from @SebDyn I have to disagree with others.

I indeed also tend to have very few - often just one - test cases per source file, so using the test case name only and ignoring the file name would be fine for me.

My current codebases are not large enough to warrant multiple CATCH test runners (I do have others, but those are Python-based, for Python wrappers), so using the executable name did not occur to me. For large codebases I can see that it could be useful; I wonder what percentage of code bases is large enough to warrant that. For my situations it would be a shame to lose that hierarchy level.

@philsquared
Copy link
Collaborator Author

The code base I'm currently working on by day has about 12 Catch test executables (plus a few NUnit executables).
It's not so much about the volume as the partitioning. Each of our executables tests a specific component in our system. Some are quite large. Some have just a handful of tests in.

I think process name is the logical fit for the package name. There's possibly scope for making that configurable, though. When I get back to looking at this again I'll take that into consideration.

Thanks for your additional comments @SebDyn - I'll ruminate on them more when I'm back in the right context to think about it.

@m-mcgowan
Copy link

Totally awesome library - so novel and unique and usable!

I just hit a little snag with the junit reporter causing UnitTH to crash - there was a missing name attribute in the testsuite element. The name comes from the testGroup which is set to empty in

context.testGroupStarting( "", 1, 1 ); // deprecated?

It would be great if we could group tests into some higher-level grouping, e.g.

TEST_SUITE("Chugger") {
    TEST_CASE(...) {

    };
    TEST_CASE(...) {
    } 
}

(I realize this is outside the scope of this issue, which is the JUnit reporter, but having the TEST_SUITE defined will then provide a name for the <test-suite> elements in the JUnit report.) In my case, I'd probably organize at least each cpp module as a test group, and for large modules, break these down into subgroups - loosely mimicking the package structure that JUnit had.

@wichert
Copy link
Contributor

wichert commented Sep 5, 2014

The test grouping is also requested in #320.

@m-mcgowan
Copy link

Yes, by me - sorry for the cross post, but the two are related.

I'm finding tags a bit of a pain, and I often forget. For example, I just made unit tests for a PRNG, in random.cpp. Each test then has as a minimum one tag [random]. Would be great to simply wrap the whole lot in a big group.

@philsquared
Copy link
Collaborator Author

I've moved this part of the discussion back over to #320

@Trass3r
Copy link

Trass3r commented May 4, 2015

Is there anything left for this one?

@SebDyn
Copy link

SebDyn commented May 11, 2015

Is there anything left for this one?

I just tried the recent Catch 1.1 build 1 and frankly I need to reply with: no. It is - again - not really working.

Here is the reason:

  • Old CATCH (working):

    <testsuite errors="0" failures="0" tests="107" hostname="0" time="0.140043" timestamp="0">
    
  • Current Catch (not working):

    <testsuite name="all tests" errors="0" failures="0" tests="1074" hostname="0" time="19.5793" timestamp="0">
    

I use Jenkins 1.596.2 (stable) with JUnit reporter. The interesting thing is: Jenkins does not return with an error. It just does not display the test results when using the current CATCH output.

@SebDyn
Copy link

SebDyn commented May 11, 2015

Ok, it seems that Jenkins has a bug somewhere. The "name" keyword is a mandatory field in the JUnit 6 standard. We will continue investigating this issue.

@alt-
Copy link

alt- commented Oct 2, 2015

The fields with "tbd" really need to go or be fixed.

Jenkins' Junit plugin performs duplication checks on inputs by comparing the name, id and timestamp fields (https://github.com/jenkinsci/junit-plugin/blob/master/src/main/java/hudson/tasks/junit/TestResult.java#L240).

If the timestamp field were missing, the check would work correctly (strictEq), but the dummy value "tbd" matches the check and Jenkins throws away all the other test results.

@horenmar
Copy link
Member

horenmar commented Feb 8, 2017

The timestamp should now be filled with a proper UTC-based time at the point the results are written, and most other issues raised in this thread have been addressed as well.

If there are some issues remaining, please open a new issue for them.
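The gist of the fix can be illustrated with a small sketch (hypothetical helper; not the exact Catch code): format a real ISO-8601 UTC timestamp for the "timestamp" attribute instead of the old "tbd" placeholder, so Jenkins' duplicate check no longer collapses distinct runs.

```cpp
#include <ctime>
#include <string>

// Produce an ISO-8601 UTC timestamp (e.g. "2017-02-08T12:34:56Z") for
// the JUnit "timestamp" attribute at the time the results are written.
inline std::string junitTimestamp( std::time_t t ) {
    char buffer[sizeof("2017-02-08T12:34:56Z")];
    std::tm* utc = std::gmtime( &t );
    std::strftime( buffer, sizeof(buffer), "%Y-%m-%dT%H:%M:%SZ", utc );
    return buffer;
}
```

In the reporter this would be called with std::time(nullptr) when the report is emitted.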

@horenmar horenmar closed this as completed Feb 8, 2017
horenmar added a commit that referenced this issue Nov 8, 2022
Labels
None yet
Projects
None yet
Development

No branches or pull requests

9 participants