
Updates to support Java 11 on 1.9 branch #1295

Merged: 9 commits merged into apache:1.9 from the 1.9-java11 branch on Jul 29, 2019

Conversation

ctubbsii
Member

No description provided.

@ctubbsii ctubbsii self-assigned this Jul 25, 2019
@ctubbsii
Member Author

All the unit tests are passing, but I haven't yet tested all the ITs with this change.

(backport of ec76015)

* Update maven-javadoc-plugin
* Drop `<javadocVersion>` from maven-javadoc-plugin configuration
* Replace `<tt>` tags with `<code>`, since the former is no longer supported
* Add `<caption>` for tables (weird Java 11 javadoc requirement)
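
As a rough, hedged sketch of the maven-javadoc-plugin side of this (the plugin version and surrounding configuration are illustrative, not copied from the PR diff), the pom.xml piece might look like:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-javadoc-plugin</artifactId>
  <!-- assumed: any release new enough to work with the JDK 11 javadoc tool -->
  <version>3.1.0</version>
  <configuration>
    <!-- note: the old <javadocVersion> element is simply dropped here -->
  </configuration>
</plugin>
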
ctubbsii added 5 commits July 26, 2019 15:30
Partial backport of 0e89ab2 to the 1.9 branch:

* Make ShellServerIT.classpath test handle output under Java 11
* Convert dubious DynamicThreadPoolsIT from an IT to a unit test
  (TabletServerResourceManagerDynamicCompactionPoolTest)
* Give SimpleGarbageCollector 16MB of memory instead of 10MB for its
  gcLotsOfCandidatesIT test, so it doesn't crash under Java 11 before the
  test runs (Java 11 needs slightly more base memory, it appears)
Also update compiler compliance settings in Eclipse, in the Eclipse
profile.
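
As a rough illustration of the memory bump mentioned above, here is a hedged sketch using Accumulo's public minicluster API; the real test likely wires this up through its IT base class, and only the 16MB value comes from the commit message.

import java.io.File;

import org.apache.accumulo.minicluster.MemoryUnit;
import org.apache.accumulo.minicluster.MiniAccumuloCluster;
import org.apache.accumulo.minicluster.MiniAccumuloConfig;
import org.apache.accumulo.minicluster.ServerType;

public class GcMemoryExample {
  public static void main(String[] args) throws Exception {
    MiniAccumuloConfig cfg = new MiniAccumuloConfig(new File("/tmp/mac-test"), "rootPassword");
    // Java 11 needs slightly more base memory, so 10MB can crash the GC process before the test runs
    cfg.setMemory(ServerType.GARBAGE_COLLECTOR, 16, MemoryUnit.MEGABYTE);
    MiniAccumuloCluster cluster = new MiniAccumuloCluster(cfg);
    cluster.start();
    // ... exercise the garbage collector here ...
    cluster.stop();
  }
}
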
@ctubbsii
Member Author

With these changes, all the ITs pass, but only when using the Hadoop 3 profile. The Hadoop 2 profile still results in some test failures, particularly those involving Kerberos. I haven't been able to determine why yet, but Hadoop likely fixed some JDK 9+ compatibility issue in its Kerberos code in 3.0.0 and later. I tried the latest versions of each of the 2.6-2.9 lines (2.6.5, 2.7.7, 2.8.5, and 2.9.2), and KerberosIT failed with all of them, but 3.0.0 does work. Perhaps @joshelser has some insight into the Kerberos changes in Hadoop 3 that could explain why the Kerberos tests fail with Hadoop 2 and Java 11?

To get the ITs to pass on JDK 11, I can add Assume rules to the failing tests that check for JDK8 || hadoop.version >= 3, so the tests will be skipped otherwise. However, doing so would make these tests permanently useless for the default Hadoop 2 profile if we were to apply #1236 to the 1.9 branch.
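
A minimal sketch of what such an Assume guard could look like (the class and the exact check are illustrative; only JUnit's Assume and Hadoop's VersionInfo are real APIs):

import org.apache.hadoop.util.VersionInfo;
import org.junit.Assume;
import org.junit.Before;

public class KerberosAssumeExample {

  @Before
  public void requireJdk8OrHadoop3() {
    // skip unless we're on JDK 8, or the Hadoop on the classpath is 3.x or newer
    boolean jdk8 = "1.8".equals(System.getProperty("java.specification.version"));
    boolean hadoop3Plus = !VersionInfo.getVersion().startsWith("2.");
    Assume.assumeTrue("requires JDK 8 or Hadoop 3+", jdk8 || hadoop3Plus);
  }
}
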

@EdColeman
Contributor

I tried to build with JDK 11 from the 1.9 branch (I thought this had been merged) and received javadoc errors.

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-javadoc-plugin:3.0.1:jar (default-cli) on project accumulo-fate: MavenReportException: Error while generating Javadoc: 
[ERROR] Exit code: 1 - javadoc: error - The code being documented uses modules but the packages defined in https://docs.oracle.com/javase/8/docs/api/ are in the unnamed module.

I could get past that by adding

<source>${maven.compiler.source}</source>

to the maven-javadoc-plugin.
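
For context, a hedged sketch of where that element lands in the plugin configuration (only the <source> line and the 3.0.1 version from the error above come from this thread; the rest is illustrative):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-javadoc-plugin</artifactId>
  <version>3.0.1</version>
  <configuration>
    <!-- document against the configured source level so the JDK 11 javadoc tool
         can link the JDK 8 API docs (unnamed module) without the module mismatch error -->
    <source>${maven.compiler.source}</source>
  </configuration>
</plugin>
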

After that I hit the tt/@code issues, started digging, and found this PR.

Not sure if you want to consider adding that to the pom or not.

@ctubbsii
Member Author

I'm pretty sure I've already fixed the javadoc issues in this PR.

@ctubbsii
Member Author

Forcing use of minikdc 3.0.0, even with the Hadoop 2 profile, seems to get KerberosIT passing. I'll run all the ITs with that to see if it fixes the issue generally.
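
A hedged sketch of what pinning minikdc could look like in a pom; the actual change in this PR may do it differently (e.g. inside a Hadoop profile):

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-minikdc</artifactId>
      <!-- pin to 3.0.0 even when hadoop.version resolves to a 2.x release -->
      <version>3.0.0</version>
    </dependency>
  </dependencies>
</dependencyManagement>
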

@ctubbsii
Member Author

I think the reason why 3.0.0 works is because minikdc in 3.0.0 is a wrapper around Kerby, instead of using the outdated ApacheDS Kerberos libs (see https://issues.apache.org/jira/browse/HADOOP-12911 and possibly apache/kafka#5586)

@ctubbsii ctubbsii marked this pull request as ready for review July 29, 2019 05:23
@ctubbsii
Member Author

All ITs pass now for every combination of Hadoop {2.6.5, 3.0.0} and OpenJDK {8, 11}.

@ctubbsii ctubbsii merged commit fd4d478 into apache:1.9 Jul 29, 2019
@ctubbsii ctubbsii deleted the 1.9-java11 branch July 29, 2019 18:50
@vlsi

vlsi commented Jul 29, 2019

This multiplies the build matrix by two, so effectively you use 4 jobs for 30 minutes each.

Note: Travis has only 15 jobs for all Apache projects, so it would be nice if you could either improve job execution time or avoid testing unrelated combinations.
For instance, you might test just java8+hadoop2 and java11+hadoop3.

Or perform the hadoop2/hadoop3 tests only for the modules that are related to Hadoop.
For instance:

Apache Accumulo Documentation ...................... SUCCESS [  9.201 s]
Apache Accumulo Maven Plugin ....................... SUCCESS [ 18.468 s]

Those modules do not seem to depend on the Hadoop version; however, they take 4 executors for 30 seconds each.

Here's an example: https://travis-ci.org/apache/accumulo/builds/565122883
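
To make the suggestion concrete, here is a hedged .travis.yml sketch (not the project's actual CI config; the HADOOP_PROFILE profile name is hypothetical) that runs only the two suggested combinations:

language: java
matrix:
  include:
    # only the two "interesting" combinations instead of the full 2x2 matrix
    - jdk: openjdk8
      env: HADOOP_PROFILE=hadoop2
    - jdk: openjdk11
      env: HADOOP_PROFILE=hadoop3
script: mvn -B verify -P "${HADOOP_PROFILE}"
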

@ctubbsii
Member Author

@vlsi We will be eliminating openjdk8 builds soon (see #1236).

@keith-turner
Contributor

@vlsi is this causing a problem for other Apache projects? If so, do you know where to find more info about the Travis CI resources available?

@vlsi

vlsi commented Jul 29, 2019

is this causing a problem for other Apache projects?

I'm afraid it does.

I'm developing Gradle build scripts for Apache JMeter / Apache Calcite, and I experience delays.
It is not that your project is the only one consuming Travis resources, but it was executing while I was waiting for my build to even start (I was watching https://travis-ci.org/apache ).

Build times are fast for my personal projects at the very same time that I experience delays for Apache-related Travis builds, so I attribute that to the "global limit" for "all Apache projects" at Travis.

  1. I can't remember where I got the "15 jobs for all Apache projects" figure. I might have confused it with something else.

  2. Committers are recommended to subscribe to builds@apache.org

Here Greg says that Travis resources can't be increased: https://lists.apache.org/thread.html/5f51f41d705c4a7bcdba35ca7ab07c56c08336ce8026d17f7ed230ca@%3Cbuilds.apache.org%3E

Note: accumulo is not in the "top consumers" list (yet :) )

We will be eliminating openjdk8 builds soon

Nice.

@ctubbsii
Member Author

I changed the per-repo setting in GitHub to limit concurrent jobs to 2 for apache/accumulo. Hopefully that helps. Our builds aren't typically that time-sensitive anyway.
