
JUnit Reporter for Doctest #376

Closed
phil-zxx opened this issue May 21, 2020 · 7 comments
phil-zxx commented May 21, 2020

I currently use Catch2 with --reporter=junit. Here is some sample JUnit output:

<testsuite failures="1" tests="3" time="1.5" timestamp="2020-05-21T22:46:42Z">
    <testcase classname="foo1" name="ASuccessfulTest" time="0.1"/>
    <testcase classname="foo2" name="AnotherSuccessfulTest" time="0.6"/>
    <testcase classname="foo3" name="AFailingTest" time="0.8">
        <failure message="A not equal to B" type="CHECK"> details about failure </failure>
    </testcase>
</testsuite>

As I wanted to move things over to doctest, I noticed there was no JUnit reporter available. So I wrote a first version, which works for my purposes, but I am happy to receive additional feedback and possibly submit a PR.

struct JUnitReporter : public IReporter
{
    XmlWriter  xml;
    std::mutex mutex;

    struct JUnitTestCaseData
    {
        static std::string getCurrentTimestamp() {
            // Beware, this is not reentrant because of backward compatibility issues
            // Also, UTC only, again because of backward compatibility (%z is C++11)
            time_t rawtime;
            std::time(&rawtime);
            auto const timeStampSize = sizeof("2017-01-16T17:06:45Z");

            #ifdef _MSC_VER
                std::tm timeInfo = {};
                gmtime_s(&timeInfo, &rawtime);
            #else
                std::tm* timeInfo;
                timeInfo = std::gmtime(&rawtime);
            #endif

            char timeStamp[timeStampSize];
            const char* const fmt = "%Y-%m-%dT%H:%M:%SZ";

            #ifdef _MSC_VER
                std::strftime(timeStamp, timeStampSize, fmt, &timeInfo);
            #else
                std::strftime(timeStamp, timeStampSize, fmt, timeInfo);
            #endif
            return std::string(timeStamp);
        }

        struct JUnitTestMessage
        {
            JUnitTestMessage(const std::string& message, const std::string& type, const std::string& details)
                : message(message), type(type), details(details) { }

            JUnitTestMessage(const std::string& message, const std::string& details)
                : message(message), type(), details(details) { }

            std::string message, type, details;
        };

        struct JUnitTestCase
        {
            JUnitTestCase(const std::string& classname, const std::string& name)
                : classname(classname), name(name), time(0), failures() { }

            std::string classname, name;
            double time;
            std::vector<JUnitTestMessage> failures, errors;
        };

        void add(const std::string& classname, const std::string& name)
        {
            testcases.emplace_back(classname, name);
        }
        
        void addTime(const double& time)
        {
            testcases.back().time = time;
            totalSeconds += time;
        }

        void addFailure(const std::string& message, const std::string& type, const std::string& details)
        {
            testcases.back().failures.emplace_back(message, type, details);
            ++totalFailures;
        }

        void addError(const std::string& message, const std::string& details)
        {
            testcases.back().errors.emplace_back(message, details);
            ++totalErrors;
        }

        std::vector<JUnitTestCase> testcases;
        double totalSeconds = 0;
        int totalErrors = 0, totalFailures = 0;
    };

    JUnitTestCaseData testCaseData;
    

    // caching pointers/references to objects of these types - safe to do
    const ContextOptions& opt;
    const TestCaseData*   tc = nullptr;

    JUnitReporter(const ContextOptions& co)
            : xml(*co.cout)
            , opt(co) {}

    unsigned line(unsigned l) const { return opt.no_line_numbers ? 0 : l; }

    void test_case_start_impl(const TestCaseData& in) {
        testCaseData.add(skipPathFromFilename(in.m_file.c_str()), in.m_name);

        bool open_ts_tag = false;
        if(tc != nullptr) { // we have already opened a test suite
            if(std::strcmp(tc->m_test_suite, in.m_test_suite) != 0) {
                xml.endElement();
                open_ts_tag = true;
            }
        }
        else {
            open_ts_tag = true; // first test case ==> first test suite
        }
    }

    // =========================================================================================
    // WHAT FOLLOWS ARE OVERRIDES OF THE VIRTUAL METHODS OF THE REPORTER INTERFACE
    // =========================================================================================

    void report_query(const QueryData& in) override {
        test_run_start();
        if(opt.list_reporters) {
            for(auto& curr : getListeners())
                xml.scopedElement("Listener")
                        .writeAttribute("priority", curr.first.first)
                        .writeAttribute("name", curr.first.second);
            for(auto& curr : getReporters())
                xml.scopedElement("Reporter")
                        .writeAttribute("priority", curr.first.first)
                        .writeAttribute("name", curr.first.second);
        } else if(opt.count || opt.list_test_cases) {
            for(unsigned i = 0; i < in.num_data; ++i) {
                xml.scopedElement("TestCase").writeAttribute("name", in.data[i]->m_name)
                    .writeAttribute("testsuite", in.data[i]->m_test_suite)
                    .writeAttribute("filename", skipPathFromFilename(in.data[i]->m_file.c_str()))
                    .writeAttribute("line", line(in.data[i]->m_line));
            }
            xml.scopedElement("OverallResultsTestCases")
                    .writeAttribute("unskipped", in.run_stats->numTestCasesPassingFilters);
        } else if(opt.list_test_suites) {
            for(unsigned i = 0; i < in.num_data; ++i)
                xml.scopedElement("TestSuite").writeAttribute("name", in.data[i]->m_test_suite);
            xml.scopedElement("OverallResultsTestCases")
                    .writeAttribute("unskipped", in.run_stats->numTestCasesPassingFilters);
            xml.scopedElement("OverallResultsTestSuites")
                    .writeAttribute("unskipped", in.run_stats->numTestSuitesPassingFilters);
        }
        xml.endElement();
    }

    void test_run_start() override { }

    void test_run_end(const TestRunStats& p) override {
        std::string binary_name = skipPathFromFilename(opt.binary_name.c_str());
        xml.startElement("testsuites");
        xml.startElement("testsuite").writeAttribute("name", binary_name)
                .writeAttribute("errors", testCaseData.totalErrors)
                .writeAttribute("failures", testCaseData.totalFailures)
                .writeAttribute("tests", p.numAsserts)
                .writeAttribute("time", testCaseData.totalSeconds)
                .writeAttribute("doctest_version", DOCTEST_VERSION_STR)
                .writeAttribute("timestamp", JUnitTestCaseData::getCurrentTimestamp());

        for (const auto& testCase : testCaseData.testcases)
        {
            xml.startElement("testcase")
                .writeAttribute("classname", testCase.classname)
                .writeAttribute("name", testCase.name)
                .writeAttribute("time", testCase.time);

            for (const auto& failure : testCase.failures)
            {
                xml.startElement("failure")
                    .writeAttribute("message", failure.message)
                    .writeAttribute("type", failure.type)
                    .writeText(failure.details);
                xml.endElement();
            }
            
            for (const auto& error : testCase.errors)
            {
                xml.startElement("error")
                    .writeAttribute("message", error.message)
                    .writeText(error.details);
                xml.endElement();
            }

            xml.endElement();
        }
    }

    void test_case_start(const TestCaseData& in) override {
        test_case_start_impl(in);
    }
    
    void test_case_reenter(const TestCaseData&) override { }

    void test_case_end(const CurrentTestCaseStats& st) override {
        testCaseData.addTime(st.seconds);
    }

    void test_case_exception(const TestCaseException& e) override {
        std::lock_guard<std::mutex> lock(mutex);
        testCaseData.addError("exception", e.error_string.c_str());
    }

    void subcase_start(const SubcaseSignature& in) override {
        std::lock_guard<std::mutex> lock(mutex);
        testCaseData.add(skipPathFromFilename(in.m_file), in.m_name.c_str());
    }

    void subcase_end() override { }

    void log_assert(const AssertData& rb) override {
        if(!rb.m_failed && !opt.success)
            return;

        std::lock_guard<std::mutex> lock(mutex);

        std::ostringstream os;
        os << skipPathFromFilename(rb.m_file) << "(" << rb.m_line << "): Expression " << rb.m_expr << " became " << rb.m_decomp.c_str();
        testCaseData.addFailure(rb.m_decomp.c_str(), assertString(rb.m_at), os.str());
    }

    void log_message(const MessageData&) override { }

    void test_case_skipped(const TestCaseData&) override { }
};

DOCTEST_REGISTER_REPORTER("junit", 0, JUnitReporter);

onqtam commented May 22, 2020

Thanks! This has been requested quite a bit (and elsewhere)! I'll look into it.

onqtam mentioned this issue May 22, 2020
@byzantic

Good work! I have given this a quick spin, and:

  1. It compiles ok and works as advertised
  2. The .xml file is correctly recognised and parsed by GitLab - shows up as a junit test in CI jobs

I guess that's just one data point, though. Has anyone tried Jenkins or other CI servers to parse the output?

@byzantic

I have found one slight problem.

If I run with the success option (i.e. testmain -s --reporters=junit), then the reporter reports all asserts, successes included, as failures. For example, with the factorial example:

<?xml version="1.0" encoding="UTF-8"?>
<testsuites>
  <testsuite name="prod/doctest/test" errors="0" failures="5" tests="5" time="0.000197" doctest_version="2.3.8" timestamp="2020-05-24T17:14:02Z">
    <testcase classname="prod/doctest/example.cpp" name="testing the factorial function" time="0.000197">
      <failure message="0 == 1" type="CHECK">
        prod/doctest/example.cpp(7): Expression factorial(0) == 1 became 0 == 1
      </failure>
      <failure message="1 == 1" type="CHECK">
        prod/doctest/example.cpp(8): Expression factorial(1) == 1 became 1 == 1
      </failure>
      <failure message="2 == 2" type="CHECK">
        prod/doctest/example.cpp(9): Expression factorial(2) == 2 became 2 == 2
      </failure>
      <failure message="6 == 6" type="CHECK">
        prod/doctest/example.cpp(10): Expression factorial(3) == 6 became 6 == 6
      </failure>
      <failure message="3628800 == 3628800" type="CHECK">
        prod/doctest/example.cpp(11): Expression factorial(10) == 3628800 became 3628800 == 3628800
      </failure>
    </testcase>
  </testsuite>
</testsuites>


ARCRL commented Jun 4, 2020

Great work. As I'm pressed for time and couldn't write my own implementation, this worked well as a proof of concept; my place of work is trying out Doctest with Azure Pipelines.

I have a few notes though. And sorry that I can't be more helpful atm with respect to providing solutions.

  • For me it didn't compile out of the box; I had to remove ".c_str()" in two places.
  • And it was missing the end tags for testsuite and testsuites.

I totally agree with onqtam: I think it would be best/easiest to get as close as possible to the Catch2/Ant JUnit format; it seems to be working great with Azure Pipelines. With respect to achieving this, I have the following pointers.

  1. Nested SUBCASEs could be handled differently. Right now each visit to a SUBCASE results in a testcase entry in the JUnit file. I.e. if you have a test structure like this:
TEST_CASE("1")
+-- SUBCASE("1.1")
   +-- SUBCASE("1.1.1")
   +-- SUBCASE("1.1.2")

It would result in an output like this (ignoring all attributes except for names):

<testsuites>
  <testsuite name="<name of the executable>">
    <testcase name="1"/>
    <testcase name="1.1"/>
    <testcase name="1.1.1"/>
    <testcase name="1.1"/>
    <testcase name="1.1.2"/>
  </testsuite>
</testsuites>

Meaning that even though there are only 2 tests (1.1.1 and 1.1.2), it will appear as if there are 5. Whereas the output from Catch2 would look like this (my choice of names makes it look more confusing than it is):

<testsuites>
  <testsuite name="<name of the executable>">
    <testcase name="1/1.1/1.1.1"/>
    <testcase name="1/1.1/1.1.2"/>
  </testsuite>
</testsuites>
  2. The times appear to be off. The time used is that of the TEST_CASE (in doctest terminology), but it is assigned to the last SUBCASE within that TEST_CASE, while all other entries are assigned a time of zero. Continuing with the example from above, it would look like:
<testsuites>
 <testsuite name="<name of the executable>">
<!-- This should have had a time of 42 -->
   <testcase name="1" time="0"/>
<!-- And the rest a value of zero or better yet their actual execution time -->
   <testcase name="1.1" time="0"/>
   <testcase name="1.1.1" time="0"/>
   <testcase name="1.1" time="0"/>
   <testcase name="1.1.2" time="42"/>
 </testsuite>
</testsuites>

In my opinion (and this would also be necessary if the Catch2 format is used), it would be nicer to report the execution time of each SUBCASE.

  3. Lastly, I would like to draw attention to Catch2's way of reporting failures. Below is an example of the text in the body of a failure.
FAILED:
CHECK( some_string != "PONG" )
with expansion:
"PING" != "PONG"
at path/to/file/TEST.cpp:20

What I like most about this is that it is not just a single (very long) line; they have used line breaks to make it more readable.

I hope these comments help!
I will see if I can get time to help with a solution in the near future, although I unfortunately doubt it...


onqtam commented Jun 4, 2020

@ARCRL those are some great tips - thanks! I'll try to incorporate all this into an initial version of the reporter in some weekend of June and will notify everyone here.

onqtam added a commit that referenced this issue Jun 4, 2020
…few TODOs right above the JUnit reporter class definition which would need to be addressed + the output hasn't been inspected if it's correct or stable across compilers/platforms - WIP! relates #376 and #318 - thanks to @phil-zxx for the implementation and to @byzantic , @ARCRL and @dhoer for their inputs!

phil-zxx commented Jun 5, 2020

Glad I could help. :) And sorry it wasn't fully finished.

onqtam added a commit that referenced this issue Jun 11, 2020
…few TODOs right above the JUnit reporter class definition which would need to be addressed + the output hasn't been inspected if it's correct or stable across compilers/platforms - WIP! relates #376 and #318 - thanks to @phil-zxx for the implementation and to @byzantic , @ARCRL and @dhoer for their inputs!
onqtam added a commit that referenced this issue Jun 26, 2020
…- thanks to @phil-zxx for the implementation and to @byzantic , @ARCRL and @dhoer for their inputs!

onqtam commented Jun 26, 2020

Just pushed a finished version of it in the dev branch - will release doctest 2.4.0 probably tomorrow - once the CI passes!
@ARCRL thanks for all the tips - I addressed all your comments!

onqtam closed this as completed in 1fb630b Jun 27, 2020