
Report no results as error #144

Merged
merged 12 commits into jest-community:master on Oct 4, 2020

Conversation

SamTheisens
Contributor

@SamTheisens SamTheisens commented Sep 27, 2020

Feature proposal

Test suites that fail to run altogether due to a syntax error, failed import, compilation error, etc. are currently not reported in the junit.xml results.
This is a problem in CI scenarios where the test process exit code cannot be relied on for whatever reason.

I realize this feature has been requested in #116, #46 and #26, so I'm sorry to bring this up again :-)
Jest's test reports do not contain any information about the suite and tests that may be present in the test file but failed to load. On the other hand, it isn't clear to me how Jest could reliably obtain such information from an invalid JavaScript file.

For our use case, the fact that all errored suites are reported is much more important than having stable test (suite) names.
But there may indeed be scenarios in which these priorities are reversed. In such cases, users can choose not to use this feature.

The report when `reportNoResultsAsError` is explicitly set to `true` looks as follows:

<?xml version="1.0" encoding="UTF-8"?>
<testsuites name="jest tests" tests="1" failures="0" errors="1" time="111548453.117">
    <testsuite name="../../../../../path/to/spec/test.spec.ts" errors="1" failures="0" skipped="0" timestamp="2017-07-13T00:03:35" time="5.523" tests="0">
        <properties>
            <property name="best-tester" value="Jason Palmer"/>
        </properties>
        <testcase classname="Test suite failed to run" name="../../../../../path/to/spec/test.spec.ts" time="0">
            <error>spec/test.spec.ts:10:35 - error TS2339: Property &apos;hello&apos; does not exist on type &apos;HelloScreamer

                10         const screamed = screamer.hello();
                ~~~~~</error>
        </testcase>
    </testsuite>
    <testsuite name="another thing" errors="0" failures="0" skipped="0" timestamp="2018-02-10T14:52:31" time="0.073" tests="1">
        <properties>
            <property name="best-tester" value="Jason Palmer"/>
        </properties>
        <testcase classname="another thing should foo" name="another thing should foo" time="0.001">
        </testcase>
    </testsuite>
</testsuites>

@SamTheisens SamTheisens marked this pull request as ready for review September 27, 2020 16:39
@palmerj3
Collaborator

Hello!

As you noted in your description this is a very long running problem and one that I would love to solve.

I see two things that come to mind when reading through this PR:

  1. I don't think jest-junit should be used to ensure your test suite does "exit 1" in CI. And it seems like the main use-case here is for essentially showing an error.

  2. I think it's absolutely critical that we maintain deterministic test suite / test case names. Otherwise if you have systems that track test runs "Error while trying to run test file test.spec.ts" will now become a new test suite.

I do really like the idea you present of this being an opt-in setting. My ideal solution to this problem would be for jest-junit to possibly maintain a local cache of filepath->test suite(s) lookup table. So then if we detect an entire test file wasn't able to run due to an error we could fill in test suites and test cases and properly mark them as errors instead of failures.

So while I REALLY appreciate the time you put into this PR I think I would prefer not to merge this into the main branch and I would suggest you fork jest-junit if you require this particular implementation.

@SamTheisens
Contributor Author

SamTheisens commented Sep 28, 2020

Hello!

Hi!

As you noted in your description this is a very long running problem and one that I would love to solve.

I see two things that come to mind when reading through this PR:

  1. I don't think jest-junit should be used to ensure your test suite does "exit 1" in CI. And it seems like the main use-case here is for essentially showing an error.

Jest-junit no longer filtering out errored suites is indeed the main use-case for me. At my company, we rely entirely on the junit test results to decide whether to fail or pass the test stage of a build.
Since suites that fail to load do not end up in the report, we end up marking the build as successful when it's not justified.
To prevent this from happening, we add an error suite (much like this PR does) when jest exits with 1. It's an ugly solution, especially because we can't include any information about which particular test file failed, let alone with what error.

  "scripts": {
    "test-junit:insert-test-suite": "node add-error-suite-to-junit.js ./target/test-reports/junit.xml",
    "test-ci": "jest --reporters=jest-junit || yarn test-junit:insert-test-suite"
  },
  2. I think it's absolutely critical that we maintain deterministic test suite / test case names. Otherwise if you have systems that track test runs "Error while trying to run test file test.spec.ts" will now become a new test suite.

I'm sorry, the example junit.xml output I added was outdated. The current implementation uses {filepath} as suite name https://github.com/jest-community/jest-junit/pull/144/files#diff-de6bb0568cec60450b473561b52bbf0dR87
which is the most stable suite name I could think of. A more flexible solution could be to introduce another template for errored suites, e.g. `errorSuiteNameTemplate`, in whose context only FILENAME_VAR and FILEPATH_VAR could be used.

In our use case, having stable suite and/or test names is not critical at all. After all, suites, files and tests can be renamed anyway.
However, having correct test output that includes suites that failed to load is critical.

My guess is that most jest-junit users would have their priorities that way, or at least would like to be given the opportunity to do the tradeoff for themselves.

I do really like the idea you present of this being an opt-in setting. My ideal solution to this problem would be for jest-junit to possibly maintain a local cache of filepath->test suite(s) lookup table. So then if we detect an entire test file wasn't able to run due to an error we could fill in test suites and test cases and properly mark them as errors instead of failures.

I guess that would allow you to deal with the suite error cases more independently?
I'd be happy to have a stab at such an implementation.

So while I REALLY appreciate the time you put into this PR I think I would prefer not to merge this into the main branch and I would suggest you fork jest-junit if you require this particular implementation.

I am not attached to my particular implementation, but I am rather attached to being able to continue using this great library!

@palmerj3
Collaborator

At my company, we rely entirely on the junit test results to decide whether to fail or pass the test stage of a build.
Since suites that fail to load do not end up in the report, we end up marking the build as successful when it's not justified.

I'm really curious why you would do this and not just rely on exit codes for pass/fail like most/all do?
This, in particular, is something the main branch of jest-junit cannot adequately support.

I guess that would allow you to deal with the suite error cases more independently?
I'd be happy to have a stab at such an implementation.

Yeah, something like I suggested would still allow for test suite and test case names that are stable between builds AND properly mark them as errors. The only annoying part is that it would require a file to be checked in with the repo (the cache), and I imagine some folks only invoke jest-junit on master builds.

@palmerj3
Collaborator

One additional thing to note is an important distinction.

Jest itself considers each individual test file a test suite. But jest-junit does the correct thing by treating each describe block as a test suite.

Given that you can have multiple adjacent describe blocks per file, using just the filepath as the test suite name in order to report errors would not work for those who decide to have multiple describe blocks per test file.
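The distinction can be illustrated with a toy grouping function (hypothetical names, not jest-junit internals): grouping by top-level describe title yields two suites from one file, whereas a filepath-based name would collapse them into one:

```javascript
// Toy illustration (not jest-junit code): group test results into junit
// suites by their top-level describe title, falling back to the file
// path only when no describe title is available.
function groupByDescribe(testResults) {
  const suites = {};
  for (const t of testResults) {
    // ancestorTitles[0] is the outermost describe block's title
    const suite = (t.ancestorTitles && t.ancestorTitles[0]) || t.filePath;
    (suites[suite] = suites[suite] || []).push(t.title);
  }
  return suites;
}

// One file with two adjacent describe blocks produces two suites:
const results = [
  { filePath: 'a.test.js', ancestorTitles: ['math'], title: 'adds' },
  { filePath: 'a.test.js', ancestorTitles: ['strings'], title: 'concats' },
];
```

Here `Object.keys(groupByDescribe(results))` gives two suite names for the single file `a.test.js`, which is why a pure filepath fallback loses information for multi-describe files.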

@SamTheisens
Contributor Author

SamTheisens commented Sep 28, 2020

I'm really curious why you would do this and not just rely on exit codes for pass/fail like most/all do?
This, in particular, is something the main branch of jest-junit cannot adequately support.

Probably most, but definitely not all rely on exit codes #26 (comment) :-)

Our situation is as follows:
We use Jenkins pipelines on a monorepo to build services and frontends on various stacks (Scala, TypeScript, Ruby) in parallel.
[screenshot: Jenkins pipeline stages running in parallel]

All of these projects produce the same type of artifact: a docker container. After the test stage, the docker container will contain the test results, which will then be extracted from it by the pipeline and filed by Jenkins.
If we let the `RUN yarn test:ci` command return a non-zero exit code, that Docker layer will not be committed and we won't be able to extract any files from the image.
We would have to dig into the logs to find the root cause of the failing suite instead of being able to rely on the neat test result visualization in Jenkins.
[screenshot: Jenkins test result visualization]

The fact that we can't rely on the exit code from jest is not the biggest issue though. We work around it by adding a dummy suite as described before. The main problem is that without error suites ending up in the central overview of all our (junit) test results, we need to dig into the logging to find the offending suite(s).

@palmerj3
Collaborator

Gotcha this is helpful.

I'm happy to chat about this somehow via hangout/zoom/etc. I would say if you want the current implementation I would suggest forking the repo.

But if you're willing to invest the time to develop a solution that will work for all/most that use jest-junit I can chat in more detail about this.

@k-yle

k-yle commented Oct 1, 2020

Hi, I was about to try to make a PR when I saw this one.

  1. I don't think jest-junit should be used to ensure your test suite does "exit 1" in CI. And it seems like the main use-case here is for essentially showing an error.

Similar to @SamTheisens's case, we use jest-junit in gitlab, which shows which tests failed from the PR view. However, it doesn't show test suites that failed because nothing is generated for those files.

  2. I think it's absolutely critical that we maintain deterministic test suite / test case names. Otherwise if you have systems that track test runs "Error while trying to run test file test.spec.ts" will now become a new test suite.

In our use case with GitLab, this doesn't matter since we just need something to show that a particular test suite failed. GitLab doesn't care if the test suite names change; it just checks whether anything in the output has failed and lists those.

So we would really appreciate it if this could get merged as an opt-in feature. Maintaining a separate fork seems counterproductive if this could simply be opt-in, and perhaps labelled as an "experimental" feature.

Happy to help in any way I can (:

@palmerj3
Collaborator

palmerj3 commented Oct 2, 2020

This is all great feedback, thank you.

Given that a solution hasn't been found (by me or others) for a few years now on this issue I think I'm ok with moving forward and implementing a change that would:

  1. Report any test suite(s) that failed due to error and mark them as "error" not "failure"
  2. Use the file path as the test suite name
  3. Is opt-in. So someone would have to set an option like "reportTestSuiteErrors".
  4. The opt-in flag documentation on the README makes folks aware that it will choose a default filepath-based test suite name

If anyone wants to take a crack at doing all of that I'm happy to review the PR.
Otherwise I'm also happy to implement it myself.
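For reference, the opt-in described in point 3 might be configured like this (a sketch only: the option name comes from this thread, and passing it as a string in reporter options follows jest-junit's usual convention, which should be checked against the README):

```javascript
// jest.config.js — opting in to reporting suite-level errors.
// The option name reportTestSuiteErrors is the one settled on in this
// thread; a matching JEST_JUNIT_* environment variable is also expected
// per the discussion about keeping the two consistent.
module.exports = {
  reporters: [
    'default',
    ['jest-junit', { reportTestSuiteErrors: 'true' }],
  ],
};
```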

@SamTheisens
Contributor Author

That is great news! We are using the fork with this PR in our infrastructure, but it would be nice if we (and others) don't need to rely on a fork.
Execution errors now show up in our Jenkins reports.
[screenshot: Jenkins report showing execution errors]

Run-time suite-level failures are reported like this:

<testsuite name="integration-tests/reporter/__tests__/simple.test.js" errors="1" failures="0" skipped="0" timestamp="1970-01-01T00:00:00" time="0" tests="0">
  <properties>
    <property name="best-tester" value="Jason Palmer"/>
  </properties>
  <testcase classname="Test suite failed to run" name="integration-tests/reporter/__tests__/simple.test.js" time="0">
    <error>  ● Test suite failed to run

  None shall pass!

    7 |   });
    8 | });
  &gt; 9 | throw new Error(&quot;None shall pass!&quot;)
      |       ^

    at Object.&lt;anonymous&gt; (__tests__/simple.test.js:9:7)
    </error>
  </testcase>
</testsuite>

For syntax errors, Jest adds quite an extensive message:

<testsuite name="integration-tests/reporter/__tests__/simple.test.js" errors="1" failures="0" skipped="0" timestamp="1970-01-01T00:00:00" time="0" tests="0">
  <properties>
    <property name="best-tester" value="Jason Palmer"/>
  </properties>
  <testcase classname="Test suite failed to run" name="integration-tests/reporter/__tests__/simple.test.js" time="0">
    <error>  ● Test suite failed to run

  Jest encountered an unexpected token

  This usually means that you are trying to import a file which Jest cannot parse, e.g. it&apos;s not plain JavaScript.

  By default, if Jest sees a Babel config, it will use that to transform your files, ignoring &quot;node_modules&quot;.

  Here&apos;s what you can do:
   • To have some of your &quot;node_modules&quot; files transformed, you can specify a custom &quot;transformIgnorePatterns&quot; in your config.
   • If you need a custom transformation specify a &quot;transform&quot; option in your config.
   • If you simply want to mock your non-JS modules (e.g. binary assets) you can stub them out with the &quot;moduleNameMapper&quot; config option.

  You&apos;ll find more details and examples of these config options in the docs:
  https://jestjs.io/docs/en/configuration.html

  Details:

  SyntaxError: /Users/samtheisens/IdeaProjects/tmp/jest-junit/integration-tests/reporter/__tests__/simple.test.js: Unexpected token (2:0)

    1 | import something
  &gt; 2 | describe(&apos;foo&apos;, () =&gt; {
      | ^
    3 |   it(&apos;should pass&apos;, () =&gt; {
    4 |     expect(true).toEqual(true);
    5 |   });

    at Parser._raise (../../node_modules/@babel/parser/src/parser/error.js:60:45)
    </error>
  </testcase>
</testsuite>

@palmerj3 I think this PR meets the 4 requirements you listed now.
I'd be happy to discuss further in person. Will try to contact you.

@palmerj3
Collaborator

palmerj3 commented Oct 3, 2020

Very cool! I'm going to take some time and review this. But just wanted to leave a note here so you don't think I forgot!

@palmerj3
Collaborator

palmerj3 commented Oct 3, 2020

It looks good to me for the most part. Just see if you can detect a suite with zero tests (that works fine otherwise) vs a test suite that has an uncaught error and cannot be parsed/executed. Depending on the outcome let's rename the opt-in option and make it consistent between the environment variable and reporter option.

@palmerj3
Collaborator

palmerj3 commented Oct 4, 2020

Also do me a favor and please rebase. I just pushed a change which will help me review PRs more effectively moving forward (sorry about that). It just validates that the outputted junit is valid according to the jenkins spec.

#146

Commit message excerpts:

 - because the template configuration applies to the entire report
 - to prevent potential setting overrides from spilling over to other suites
 - adds `reportNoResultsAsError` configuration parameter. Default is false
 - when top level test result (suite) has no children, a stub suite is added with a single test with `error` status. The exception is displayed at test level.
 - any potential suite or title templates are overridden in order to guarantee that the file path is displayed, since besides the `testExecError` this is the only information we have
 - and the file path is used as suite title
 - and explains in the README that the file path will be used as suite name in these cases
 - so that the validity of the resulting xml can be verified
 - to ensure that it is handled as suite error
 - because `suite.testExecError` is a varying complex type and the information is also reflected in `suite.failureMessage` anyway
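The stub-suite behavior those commit messages describe could be sketched like this (an illustration of the described behavior, not the PR's actual code; the field names follow Jest's test result shape as quoted in the commits):

```javascript
// Illustrative sketch: when a test file produced no test results and
// carries a failure message, emit a stub suite named by file path with
// a single errored testcase holding the message.
function stubSuiteFor(testResult) {
  if (testResult.testResults.length > 0 || !testResult.failureMessage) {
    return null; // suite ran (or has nothing to report)
  }
  return {
    name: testResult.testFilePath, // file path as suite title
    tests: 0,
    errors: 1,
    testcase: {
      classname: 'Test suite failed to run',
      name: testResult.testFilePath,
      error: testResult.failureMessage,
    },
  };
}
```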
@SamTheisens
Contributor Author

It looks good to me for the most part. Just see if you can detect a suite with zero tests (that works fine otherwise) vs a test suite that has an uncaught error and cannot be parsed/executed. Depending on the outcome let's rename the opt-in option and make it consistent between the environment variable and reporter option.

It looks like empty suites are not accepted by Jest and are reported as suite errors.
If I try to run the following:

describe('foo', () => {
});

I get this result:

<testsuite name="integration-tests/reporter/__tests__/simple.test.js" errors="1" failures="0" skipped="0" timestamp="1970-01-01T00:00:00" time="0" tests="0">
  <properties>
    <property name="best-tester" value="Jason Palmer"/>
  </properties>
  <testcase classname="Test suite failed to run" name="integration-tests/reporter/__tests__/simple.test.js" time="0">
    <error>  ● Test suite failed to run

  Your test suite must contain at least one test.

    at onResult (../../node_modules/@jest/core/build/TestScheduler.js:175:18)
        at Array.map (&lt;anonymous&gt;)
    </error>
  </testcase>
</testsuite>

Given this fact, I guess `reportTestSuiteErrors` is still a good name for the opt-in option?

I have removed the fallback from `suite.testExecError` to `suite.failureMessage`, because `suite.failureMessage` seems like a safer option and also contains all the information: 7af55a7

@palmerj3
Collaborator

palmerj3 commented Oct 4, 2020

This looks good to me! I did some testing locally and it's unfortunate that jest doesn't even pull a suite name from a functioning suite with no tests. I was thinking in that case we could still reuse at least some of the template variables.

But it seems best to do as is done in this PR. Treat an uncaught error and a suite with no tests the same.

Are there any other changes you'd like to make before merging this?

@SamTheisens
Contributor Author

This looks good to me! I did some testing locally and it's unfortunate that jest doesn't even pull a suite name from a functioning suite with no tests. I was thinking in that case we could still reuse at least some of the template variables.

Yeah, that's too bad. Although I guess it's a pretty rare case that you'd want to address immediately anyway.

Are there any other changes you'd like to make before merging this?

Thank you for asking! But no, I'm happy.

@palmerj3
Collaborator

palmerj3 commented Oct 4, 2020

Thanks for all the hard work on this! Merging now and will cut a new major release including this and the change I made earlier.

@palmerj3 palmerj3 merged commit 4e21b20 into jest-community:master Oct 4, 2020
@gazal-k

gazal-k commented Dec 26, 2022

Should reportTestSuiteErrors be set to true by default? Seems like a sensible default.

@palmerj3
Collaborator

If I were building jest-junit from scratch today I would have it on by default. But millions of teams rely on it now every day and turning that on can have some side effects. If teams read the junit.xml files and create a database/index out of them then the new blank test suites this feature creates in certain cases could be confusing.

@gazal-k

gazal-k commented Dec 26, 2022

Understood, thanks. Especially tricky considering jest-junit doesn't use semver. Otherwise, it could have been a "breaking change" and hence a new major version with appropriate changelog information for users to take appropriate action when upgrading.

@palmerj3
Collaborator

It does use Semver and we have introduced breaking changes in the past.

But this is one case where I've been extra cautious. It's an option folks can turn on and the burden is not high.
