Test_openjdk8_hs_sanity.openjdk_x86-64_linux unstable everynight #1750
Comments
I kicked off a set of grinders to see if I could get a baseline for the failures we've been seeing above but it looks like most of them passed?!
Last night's x-linux run was on
Regarding the failures above: this is basically the same issue as adoptium/infrastructure#1145.
As for the intermittents, I'm not 100% sure what's going on here. For one, the tests I'm seeing fail in the nightlies seem pretty consistent. Example:
But this doesn't seem reproducible in Grinders, even when running on the same machine:
I will run another Grinder, matching the same JDK version from https://ci.adoptopenjdk.net/view/Test_openjdk/job/Test_openjdk8_hs_sanity.openjdk_x86-64_linux/331/consoleFull, to see if I can get a result from that.
Ok, so I managed to reproduce the other errors: https://ci.adoptopenjdk.net/view/Test_grinder/job/Grinder/2945/ The reason I wasn't able to reproduce this before was that I was running these tests against the normal nightly builds, where they pass. These failures actually come from the https://ci.adoptopenjdk.net/job/build-scripts/job/jobs/job/jdk8u/job/jdk8u-linux-x64-hotspot-jfr/ job.
re: #1750 (comment) leads me to ask a couple of questions:
The setup configuration for JFR on JDK 8 is here, which points to our jfr repo (makes sense).
It's safe to assume that we are not using tip to build this, especially since the last commit was back in September 😬.
That depends on our machine capacity, the manpower available to triage, and the number of JFR experts we have, I guess.
Nightly builds, yes. They are passed in as one of our x64 Linux builds, the same way as OpenJ9 and Corretto are. As for releases, I don't think so. Looking at a recent April release, I don't see any reference to JFR, nor can I see any reference to enabling it by default in the build-scripts. It's odd that we build it in the nightlies but don't appear to regularly release it.
Kicking off some more grinders with a JFR URL attached sheds a lot more light on the issues seen above. On both the Ubuntu and CentOS machines, I'm seeing 9 failures:
These failures should at the very least be triaged, seeing as they're
PR to exclude tests affected by UnknownHostExceptions
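For context, jtreg-based test exclusions are typically expressed as ProblemList entries: one line per test, with the test path, an associated issue link, and the platforms to skip. The entries below are purely illustrative of the format — they are not the actual contents of the PR referenced above:

```
# Format: <test path>  <associated issue>  <platforms>
java/net/InetAddress/CheckJNI.java  https://github.com/adoptium/infrastructure/issues/1145  linux-all
```

Tests listed this way are skipped on the named platforms until the underlying (here, infrastructure) issue is resolved.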
Summary of JFR failures
@smlambert Since it's possible that some of these failures are not JFR-related but machine-related, would you prefer that I open a separate issue for each testcase, or throw them all into one issue and work through them one by one?
Let's keep it as one issue and go through each; if it's discovered as part of that exercise that a unique/separate issue is causing a subset of the failures, we can spawn off separate issues as they are discovered.
Ok, starting with java/lang/Class/getDeclaredField/FieldSetAccessibleTest.java
The affected classes are all from the jdk/jfr/events directory. However, if you look at the output from a recent Grinder: openjdk_test_output.tar.gz
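As background, FieldSetAccessibleTest exercises the pattern of reflectively making declared fields accessible and failing if that throws. A minimal, self-contained sketch of that pattern is below — it reflects over its own fields rather than the jdk/jfr/events classes the real test trips over, and the class and field names are illustrative, not taken from the test:

```java
import java.lang.reflect.Field;

// Minimal sketch (not the real test): try to make every declared field of a
// class accessible. FieldSetAccessibleTest applies this same reflective
// pattern across loadable classes and fails on any that throw.
public class SetAccessibleSketch {
    private int a;          // sample fields to reflect over
    private static String b;

    public static void main(String[] args) {
        int made = 0;
        for (Field f : SetAccessibleSketch.class.getDeclaredFields()) {
            f.setAccessible(true); // throws if the field cannot be opened
            made++;
        }
        System.out.println("fields made accessible: " + made);
    }
}
```

If test material and build are mismatched (as suspected below), a test like this can fail simply because the set of classes it walks does not match what the build actually contains.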
To summarize, I believe many of the failures listed above are the result of a test material mismatch (the jfr build was not passing its repo info to the test pipeline, so we defaulted to pulling test material from jdk8u, which does not match the jfr mirror). Seeing as AdoptOpenJDK/TSC#166 has been approved, triaging tests for builds that will be removed is no longer necessary.
Describe the bug
https://ci.adoptopenjdk.net/view/Test_openjdk/job/Test_openjdk8_hs_sanity.openjdk_x86-64_linux has been failing on a near-nightly basis with various test failures that seem to differ every night and between machines. The purpose of this issue is to document these failures and identify a pattern across machines and nightly versions.
22/04/2020
https://ci.adoptopenjdk.net/view/Test_openjdk/job/Test_openjdk8_hs_sanity.openjdk_x86-64_linux/324/ on
test-godaddy-ubuntu1604-x64-4
23/04/2020
https://ci.adoptopenjdk.net/view/Test_openjdk/job/Test_openjdk8_hs_sanity.openjdk_x86-64_linux/325/ on
test-godaddy-ubuntu1604-x64-4
24/04/2020
https://ci.adoptopenjdk.net/view/Test_openjdk/job/Test_openjdk8_hs_sanity.openjdk_x86-64_linux/326/ on
test-godaddy-centos7-x64-2