
[JENKINS-28924] Added support for disabling java specifics. #15

Merged
merged 3 commits into jenkinsci:master on Feb 9, 2024

Conversation


@HMS-JJO HMS-JJO commented Aug 14, 2015

JENKINS-28924

Added option to disable java specifics and allow for parallelized tests of code in other languages.

@jenkinsadmin
Member

Thank you for a pull request! Please check this document for how the Jenkins project handles pull requests

@HMS-JJO HMS-JJO changed the title Added support for disabling java specifics. [JENKINS-28924] Added support for disabling java specifics. Aug 17, 2015
@@ -61,6 +64,10 @@ public String getTestJob() {
public String getPatternFile() {
return patternFile;
}

public boolean getNoJava() {
Member

Bean getters for boolean conventionally use is rather than get.

Also I would recommend inverting the sense so that the visible property is called java, though the field could keep the current sense (noJava) to make it easier to retain serialization compatibility.
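The convention being described can be sketched as follows; the class and field names here are hypothetical stand-ins, not the PR's actual code:

```java
import java.io.Serializable;

// Sketch: the serialized field keeps its original negative sense (noJava),
// while the visible bean property is the positive "java", read via an
// is-getter per the JavaBeans convention for booleans.
public class SplitConfig implements Serializable {
    private boolean noJava; // retained for serialization compatibility

    public SplitConfig(boolean java) {
        this.noJava = !java;
    }

    // Bean property "java": boolean getters use is..., not get...
    public boolean isJava() {
        return !noJava;
    }
}
```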

@jglick
Member

jglick commented Sep 14, 2015

Missing support for the Workflow step.

@jglick
Member

jglick commented Sep 14, 2015

A functional test would be in order too. Minimally creating a fake test result, running the build step, verifying that the includes/excludes match some expected values.

More work, but ideally: an addition to acceptance-test-harness (example) proving that this modification actually helps to split up tests with some concrete language & test framework. JENKINS-28924 discussed VHDL, which is probably ill suited for a public automated test, but maybe there is something in Ruby etc. which this would be useful for?
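The core behavior such a functional test would pin down — partitioning previously recorded tests into roughly equal splits — can be sketched independently of Jenkins. This is an illustrative greedy-bucketing sketch with hypothetical names, not the plugin's actual implementation:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Greedy bucketing sketch: assign each test (longest first) to the
// currently lightest of n splits, approximating equal total durations.
public class Splitter {
    public static List<List<String>> split(Map<String, Long> durations, int n) {
        List<List<String>> buckets = new ArrayList<>();
        long[] load = new long[n];
        for (int i = 0; i < n; i++) buckets.add(new ArrayList<>());
        durations.entrySet().stream()
            .sorted((a, b) -> Long.compare(b.getValue(), a.getValue()))
            .forEach(e -> {
                int min = 0;
                for (int i = 1; i < n; i++) if (load[i] < load[min]) min = i;
                buckets.get(min).add(e.getKey());
                load[min] += e.getValue();
            });
        return buckets;
    }
}
```

A test would then assert that the resulting includes/excludes cover every test exactly once and that the split loads are balanced.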

@jglick
Member

jglick commented Sep 14, 2015

Also needs merge with master.

@HMS-LIG
Contributor

HMS-LIG commented Nov 2, 2015

I had to take over HMS-JJO's work on this pull request. When I looked at it I realized that we needed some additions. HMS-JJO's original change output exclusions as the test case name only, which works well with the VUnit tool discussed for VHDL code, where all test case names are unique (it outputs all test cases under a single class name). But for test tools where multiple classes can contain test cases with the same name, that doesn't work very well. So I added the option to output either "testCaseName" only or "className.testCaseName". The latter is usable, for example, with the nose test framework for Python tests, where you can skip individual test cases based on that information.

I added unit tests in the form that Latombe introduced in his previous change.

Did a rebase against master and force-pushed over HMS-JJO's previous commits.

}

static public boolean isEnabled(TestCaseCentricFormat format) {
return (format != null) && (format != TestCaseCentricFormat.DISABLED);
Member

Probably better to have the format always be passed around as non-null, and for this to be an instance method.
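The instance-method shape being suggested here could look like this; an illustrative sketch, not the PR's code:

```java
// Sketch: with the format guaranteed non-null, the enabled check becomes
// an instance method on the enum and callers need no null checks.
public enum TestCaseCentricFormat {
    DISABLED,
    TEST_CASE_NAME,
    CLASS_AND_TEST_CASE_NAME;

    public boolean isEnabled() {
        return this != DISABLED;
    }
}
```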


I used this construct to try to work around (what I thought were) serialization compatibility issues. But I'm not really a Java developer, and this is my first attempt at contributing to a plugin, so I'm lacking some knowledge in this area.

I thought that adding a variable of type TestCaseCentricFormat to ParallelTestExecutor would make it take on a null value if an old version of ParallelTestExecutor was deserialized. Then it would be impossible to guarantee that the value is always non-null when used in the ParallelTestExecutor code. So by not using the TestCaseCentricFormat directly, but always calling isEnabled(TestCaseCentricFormat format), I could skip null checks everywhere else.

Any tips on how I can make sure format is always non-null when taking deserialization of an older version into account (or perhaps I've just gotten things wrong and it is not really an issue)?

// HMS-LIG (accidentally logged in with other user)
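One established Java answer to the deserialization question is a readResolve method that normalizes a null field to its default after the object is read back, so getters can assume non-null. This sketch uses hypothetical class and field names:

```java
import java.io.Serializable;

// Sketch: readResolve runs after deserialization; data written by an
// older class version (without the field) comes back with format == null,
// and we replace it with the default before anyone can observe it.
public class ParallelConfig implements Serializable {
    enum TestCaseCentricFormat { DISABLED, TEST_CASE_NAME }

    private TestCaseCentricFormat format;

    public ParallelConfig(TestCaseCentricFormat format) {
        this.format = format;
    }

    protected Object readResolve() {
        if (format == null) {
            format = TestCaseCentricFormat.DISABLED;
        }
        return this;
    }

    public TestCaseCentricFormat getFormat() {
        return format; // non-null once readResolve has run
    }
}
```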

@jglick
Member

jglick commented Jan 19, 2016

Overall seems right, insofar as I understand it. Made some comments about the style of the API (which is also visible to scripted consumers such as Pipeline formerly known as Workflow).

@HMS-LIG
Contributor

HMS-LIG commented Jan 21, 2016

Changed from the TestCaseCentric terminology to TestMode, amongst other things.
By letting getTestMode(), in both the ParallelTestExecutor and SplitStep classes, return TestMode.JAVA when the underlying variable is null, I tried to ensure that the default behaviour is the original behaviour expected by the plugin.
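The null-defaulting getter described here, in sketch form; the class shape is illustrative, but the pattern (never expose the possibly-null field directly) is the one being described:

```java
// Sketch: the field may be null after deserializing configuration saved
// by a pre-TestMode release; the getter falls back to the original
// behaviour (JAVA) so old jobs keep working unchanged.
public class SplitStep {
    public enum TestMode { JAVA, TESTCASENAME, CLASSANDTESTCASENAME }

    private TestMode testMode;

    public TestMode getTestMode() {
        return testMode == null ? TestMode.JAVA : testMode;
    }

    public void setTestMode(TestMode testMode) {
        this.testMode = testMode;
    }
}
```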

@jglick
Member

jglick commented Jan 28, 2016

By the way in #17 I have cleaned up the Docker-based demo (still trying to get a code review so I can merge it). Might have merge conflicts here but likely trivial if so. Do you have any easy examples at hand of freely downloadable tools which could take advantage of this change so that the demo can include a working non-Java job? I do not think this is a requirement but it would certainly be nice.

@HMS-LIG
Contributor

HMS-LIG commented Jan 29, 2016

Our main use case is the VHDL/VUnit testing mentioned earlier, which doesn't lend itself to an easy-to-set-up demo. We also have some Python tests using the nose test framework, which supports exclusion via file, so I'm investigating whether that will work as an easy example. But while experimenting I found some weird behavior that is probably a bug in my implementation, so I need to investigate that further as well before this PR is finished.

@jglick
Member

jglick commented Feb 3, 2016

OK finally got that merged, sorry for delay.

We also have some Python tests using the nose test framework which supports exclusion via file so I'm investigating if that will work as an easy example.

That would be great if it works out.

@HMS-LIG
Contributor

HMS-LIG commented Feb 8, 2016

I've investigated the weird behaviour I saw a bit further.

With a pipeline job that uses the parallel test executor with count-driven parallelism (count 3) and three classes with a single test each, the first run makes all three slaves execute all tests.

The second run constructs exclusions so that slave one runs 1 test, slave two runs 2 tests and slave three runs 0 tests. This makes the job fail, as the third job produces a JUnit result XML which is valid but has results for 0 tests.

This behavior is not consistent: if I make a copy of the job it may very well produce exclusions that make each slave execute one test each, which is what I would expect.

This problem is nothing I've introduced; it is present in older versions of the plugin as well (such as 1.7).

So to make a demo with nosetests for the Docker demo I may have to either decrease the count for count-driven parallelism to 2 or increase the number of tests, to work around the risk of the demo failing.

I'm going to be out of office for a couple of weeks, so I'll revisit this later and figure out what I have to do to get the Docker demo working.

@jglick
Member

jglick commented Jul 29, 2016

This makes the job fail as the third job produces a JUnit result xml which is valid but has result for 0 tests.

Possibly just need to set some flag on the junit step to accept a result with zero test cases.
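The junit step from the JUnit plugin does expose such a flag, allowEmptyResults; a minimal Pipeline sketch (the report path is illustrative):

```groovy
// Accept a JUnit report file that parses but contains zero test cases,
// so a split that happened to run nothing does not fail the build.
junit testResults: 'reports/*.xml', allowEmptyResults: true
```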

@chrono

chrono commented Oct 11, 2016

What's the status on this? Is @HMS-LIG still working on it? I've been trying to get this plugin to work with behave JUnit XML and ran into issues that this PR would address, I believe.

@HMS-LIG
Contributor

HMS-LIG commented Oct 12, 2016

I intend to finish this PR, but I have been swamped with other things (and still am). Things kind of fell apart for me when I didn't get Docker to work on Windows. We are still using the code internally; we built the plug-in ourselves (an old version of the plug-in, as we haven't rebased when things changed). Getting time to fix this up is still at least two months away for me.

Added a test case centric option to be able to work with test cases instead
of Java classes/files. A drop-down box has been added where the user can select
whether the output shall be "testCaseName" or "className.testCaseName" in the
exclusion/inclusion files. If the selected option is left at the default
"Disabled" option the regular behavior (.java/.class output) is used.
What was previously called test case centric behavior (with default
setting DISABLED) is now called test mode instead (with default setting
JAVA).
Added job that uses the newly added test mode option to output
exclusions/inclusions on test case basis instead of class basis.
The job pulls a docker image containing python and the pytest
framework that can handle exclusions/inclusion of test cases.
@HMS-LIG
Contributor

HMS-LIG commented Dec 23, 2016

I've rebased my previous changes on top of master. I have not made any functional changes in the actual plugin besides that.

I've also added a second pipeline job to the Docker demo showing how to parallelize on test case names using Python and the pytest framework. The Jenkinsfile in repo_pytest uses a Docker image from Docker Hub under some user's namespace; is this a valid way of implementing it? If that Docker image disappears the pipeline-pytest job will fail, but your original pipeline job will still build correctly.

Another commit will have to be made to add a version of the parallel test executor plugin that actually supports the test mode my new job uses. When trying the demo now, I build the plugin, launch the demo, and then manually update the parallel test executor plugin by uploading the newly built .hpi file from the target folder.

I've noticed a couple of problems:

The test result trend graph is not working very well; similar things are reported in https://issues.jenkins-ci.org/browse/JENKINS-18696 and https://issues.jenkins-ci.org/browse/JENKINS-39696.

The reference build used does not always contain the complete set of test cases it ran. I.e. a new build, 5, uses build 4 as a reference, reporting it to have 80 test cases and creating exclusions/inclusions based on these 80 test cases, when in fact 100 test cases were run (confirmed by looking at the junitResult.xml of build 4). This faulty collection of previous test results shows itself more often the shorter the time between new builds. If I restart Jenkins between builds, the test result from the reference build is always correct.

If you use exclusion patterns only, the above problem means that each of your test slaves will run the missing test cases; but if you work with inclusion patterns, only the single slave using the exclusion split will run the missing test cases.

@jhagege

jhagege commented Feb 2, 2017

+1, would this PR support running mocha tests in parallel across nodes?
Thanks!

@HMS-LIG
Contributor

HMS-LIG commented Feb 3, 2017

I don't know how mocha works, but if it produces JUnit result XML files and can handle excluding test cases based on testCaseName or className.testCaseName, it should work. It would be nice if you could download the PR code, test it, and report back here or in #21.

@mkinzie

mkinzie commented Feb 15, 2017

I've been working with the parallel-test-executor-plugin for a few days and have seen the problems that HMS-LIG mentions above (I was using release 1.19 of the plugin, not this branch). Since this is the only place I've seen this problem mentioned in relation to running tests in parallel, I'm posting this here.

The problem with the trend graph appeared to be that it only counted what was archived the first time the JUnit archiver was called, so there was a race condition when run in parallel. But that was just a minor part of it: this also led the parallel-test-executor-plugin to sometimes create incorrect splits (again depending on the race), giving me splits that included and excluded only the dummy, which ran everything in one split, resulting in one or more of the parallel splits failing because it didn't run anything (setting the JUnit plugin to accept empty results inexplicably didn't fix it).

I just got the fix in release 1.20 of the junit plugin, including the fix for https://issues.jenkins-ci.org/browse/JENKINS-39696, and that fixed everything, including my problem with it occasionally failing. It's kind of puzzling to me how this has been working for others up to now.

@chrono

chrono commented Feb 15, 2017

IMHO the JUnit archiver in its current state should not be called multiple times within a pipeline. We run a gathering stash/unstash step to collect results from parallel executions and make a single archive call.

@jglick
Member

jglick commented Sep 28, 2017

the junit archiver in its current state should not be called multiple times within a pipeline

Works fine AFAIK using current plugin versions.

public enum TestMode {
JAVA ("Parallelize on test classes (Java)"),
TESTCASENAME ("Parallelize on test case name (Generic)"),
CLASSANDTESTCASENAME ("Parallelize on class and test case name (Generic)");
Member

Seems like this is on the right track, but should be refactored into an AbstractDescribableImpl whose Descriptors with @Symbols are an ExtensionPoint, so that support for Java is not baked into anything except the Java implementation, and support for other languages could be added just by introducing new modes, potentially from other plugins.

IIUC the critical bit would be something like

public abstract Map<String, TestEntity> getData(ClassResult result);
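Stripped of Jenkins' Describable machinery (AbstractDescribableImpl, Descriptor, @Symbol), the refactoring sketched here amounts to an abstract mode type owning the per-language grouping logic, so Java support becomes just one implementation. All names and signatures below are illustrative, standing in for the TestEntity/ClassResult types mentioned above:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch: each mode maps raw results (class -> test case names)
// to the units the splitter parallelizes on; new languages add new subclasses.
abstract class TestMode {
    public abstract Map<String, List<String>> getData(Map<String, List<String>> classToCases);
}

class JavaTestMode extends TestMode {
    @Override
    public Map<String, List<String>> getData(Map<String, List<String>> classToCases) {
        return classToCases; // parallelize on whole classes
    }
}

class TestCaseNameMode extends TestMode {
    @Override
    public Map<String, List<String>> getData(Map<String, List<String>> classToCases) {
        Map<String, List<String>> byCase = new LinkedHashMap<>();
        classToCases.forEach((cls, cases) ->
            cases.forEach(c -> byCase.put(c, List.of(c)))); // one unit per test case
        return byCase;
    }
}
```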

@jglick
Member

jglick commented Sep 28, 2017

With the advent of docker-fixtures it would even be possible to write automated tests for scenarios like Java, Python, etc. In the shorter term, the presence of a Dockerized demo addition is great, since it lets others play around with the implementation easily. Not sure @abayer or I have time right now to work on it.

@tempora-mutantur

tempora-mutantur commented Sep 22, 2019

@HMS-LIG @jglick

Hello.
Is there anything that I or somebody could help with to get this merged/implemented?
We use nunit a lot in our projects and would really want to use this plugin.
Thank you.

@jglick
Member

jglick commented Sep 23, 2019

Hmm, I cannot say I am actively maintaining this plugin; not sure if anyone is.

@jglick jglick merged commit 95bd911 into jenkinsci:master Feb 9, 2024
1 check failed