Description
At the last JUG meetup I met @sormuras, and he motivated me to participate for the first time by submitting a feature request. So here it is.
Motivation:
When we write tests, we mostly do so for two reasons:
- to avoid technical errors (like NPEs)
- to ensure our code fulfills the defined requirements / to avoid functional errors
But as of today (as far as I know) the only way to "link" a test to a requirement is through its (displayed) name, for example by putting the id of the requirement (I'll call this "req-id" in the following) in front of the name, like req123_testSomethingInThisMethod. I think you'll all agree that this is not a good way to show that this method is a test which ensures the requirement with the id req123.
Suggestion:
Therefore I would like to suggest adding a new annotation @Requirement.
This annotation would be used to show that a particular test is written to ensure a specific requirement. The annotation takes the req-id as a string parameter, e.g. @Requirement("REQ-123"). In the test result this req-id is then published as an attribute of the test case. This allows tools which parse the result to check whether all tests annotated with a given req-id have passed and therefore whether the requirement is fulfilled.
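To make the idea concrete, here is a minimal sketch of what the proposed annotation and its lookup by a reporting tool could look like. The annotation type and the method name are hypothetical, not existing JUnit code:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

// Hypothetical annotation as described above: one req-id per test method.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
@interface Requirement {
    String value(); // the req-id, e.g. "REQ-123"
}

public class RequirementDemo {

    // A (hypothetical) test method tagged with the requirement it verifies.
    @Requirement("REQ-123")
    void shouldApplyDiscount() { /* test body */ }

    public static void main(String[] args) throws Exception {
        // A report generator could read the req-id reflectively and
        // publish it as an attribute of the test case in the result:
        Method testMethod = RequirementDemo.class.getDeclaredMethod("shouldApplyDiscount");
        Requirement req = testMethod.getAnnotation(Requirement.class);
        System.out.println("requirement = " + req.value());
    }
}
```

In JUnit 5 terms, an extension could pick up the annotation and publish the value via the existing report-entry mechanism, so no change to the result format itself would strictly be required.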
I don't see the need to make this annotation repeatable on a method; in my opinion each test method should verify only one aspect. I also don't see the need to use this annotation on class level, as a test class, especially one for small methods, may contain both technical and functional tests.
Maybe this can also be considered in the discussion about a standard test result format (ota4j-team/opentest4j#9).
Outlook:
To be honest, printing an additional attribute containing the id of a requirement is only one part of a useful functionality for checking whether all requirements are fulfilled: without a list of the defined requirements, the report only contains information about tests which are annotated with a requirement, and the report (tool) cannot know whether there were tests for all requirements. While I'm quite sure the suggested annotation should be a feature of JUnit, I'm not sure whether the JUnit team sees the "linking functionality" as one as well. A short description of what I mean, so maybe it's clearer:
To compare a list of requirements with the list of test results, a comparator needs that list as an input. So I see the need for a module which takes this list (e.g. from a file, a service call, etc.), compares it with the test results, and then creates a report which shows the reader which requirements have no tests at all, and which tests of a given requirement have passed or failed.
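The comparison step could be sketched roughly like this. All names, the input shape, and the output format are assumptions made purely for illustration, not a proposed API:

```java
import java.util.List;
import java.util.Map;

// Hypothetical sketch of the comparator module described above.
// Input: the full list of defined req-ids (e.g. read from a file or
// a service call) and the pass/fail results grouped by req-id
// (parsed from the test report).
public class RequirementCoverage {

    static String report(List<String> definedRequirements,
                         Map<String, List<Boolean>> resultsByReqId) {
        StringBuilder sb = new StringBuilder();
        for (String reqId : definedRequirements) {
            List<Boolean> results = resultsByReqId.get(reqId);
            if (results == null || results.isEmpty()) {
                // Requirement defined, but no annotated test found at all.
                sb.append(reqId).append(": NO TESTS\n");
            } else if (results.stream().allMatch(Boolean::booleanValue)) {
                sb.append(reqId).append(": FULFILLED\n");
            } else {
                sb.append(reqId).append(": FAILED\n");
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.print(report(
                List.of("REQ-123", "REQ-124", "REQ-125"),
                Map.of("REQ-123", List.of(true, true),
                       "REQ-124", List.of(true, false))));
    }
}
```

The key point is that the requirement list drives the loop, so requirements without any tests show up explicitly instead of being silently absent from the report.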
But maybe this is just a second step and, as mentioned, perhaps not even in the focus of JUnit.