[SPARK-13579][build][test-maven] Stop building the main Spark assembly. #11796
Conversation
This is probably not ready yet, although it works for me locally. I want to run the patch through jenkins (sbt and maven) to see how well it works out.
Test build #53471 has finished for PR 11796 at commit
@@ -238,36 +238,13 @@
   <configuration>
     <sources>
       <source>v${hive.version.short}/src/main/scala</source>
-      <source>${project.build.directory/generated-sources/antlr</source>
+      <source>${project.build.directory}/generated-sources/antlr</source>
Nice catch :)
It didn't seem to be affecting the build, though. I'm actually wondering if this line is needed - the parser seems to be part of the catalyst module, not this one.
Yeah you are right about that. So this could actually be removed. Lemme know, I can also submit a PR for this.
I'll remove it, but since this patch may take some more iterations before it's ready, feel free to do it as part of another patch too.
Test build #53473 has finished for PR 11796 at commit
Test build #53472 has finished for PR 11796 at commit
Test build #53549 has finished for PR 11796 at commit
Test build #53576 has finished for PR 11796 at commit
Hmm, log files are gone and tests pass locally... sigh. retest this please
Test build #53683 has finished for PR 11796 at commit
Test build #53735 has finished for PR 11796 at commit
These tests fail sporadically; I think it's some weird sbt dependency resolution issue that's causing different hadoop versions to get mixed up. e.g.
That's not the whole stack trace, but there's only hadoop stuff in there, so it's not Spark calling some method that was removed.
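As a quick sanity check for the kind of mix-up suspected above, one can scan a jar directory for hadoop artifacts carrying more than one version number. This is only an illustrative sketch: the directory and jar names below are fabricated, not taken from the failing build.

```shell
# Fabricated example: detect whether hadoop-* jars in a directory carry
# more than one version number (the suspected cause of the flaky tests).
jars_dir="$(mktemp -d)"
touch "${jars_dir}/hadoop-common-2.2.0.jar" "${jars_dir}/hadoop-client-2.6.0.jar"

# Extract the version suffix of every hadoop-* jar and deduplicate.
versions="$(ls "${jars_dir}" | sed -n 's/^hadoop-.*-\([0-9][0-9.]*\)\.jar$/\1/p' | sort -u)"
count="$(printf '%s\n' "${versions}" | wc -l)"
if [ "${count}" -gt 1 ]; then
  echo "mixed hadoop versions detected:"
  printf '%s\n' "${versions}"
fi
```

With a real build, `jars_dir` would point at the assembled jars directory instead of a temp dir seeded with fake jar names.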
The offending method (
Test build #53800 has finished for PR 11796 at commit
Weird jline error, trying to see if I can figure it out... but the tests I want to fail passed. retest this please.
jline error looks weirdly similar. Old version of jline (0.9) has class
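A conflict like the one suspected here usually means two jline jars are visible at once. A minimal sketch of checking for that on a classpath string; the classpath and jar versions below are made up for illustration:

```shell
# Fabricated classpath illustrating the suspected conflict: an old jline
# (0.9.x) and a newer jline both on the classpath at the same time.
cp_str="/tmp/deps/jline-0.9.94.jar:/tmp/deps/jline-2.12.jar:/tmp/deps/scala-library-2.11.8.jar"

# Split the classpath on ':' and keep only jline jars.
jline_jars="$(printf '%s\n' "${cp_str}" | tr ':' '\n' | grep '/jline-')"
n="$(printf '%s\n' "${jline_jars}" | wc -l)"
if [ "${n}" -gt 1 ]; then
  echo "conflicting jline jars on the classpath:"
  printf '%s\n' "${jline_jars}"
fi
```

Against a real failure, `cp_str` would come from the java invocation in the jenkins logs rather than a hard-coded string.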
Test build #53818 has finished for PR 11796 at commit
I'm a little at a loss as to what's wrong. Copying & pasting the commands from the logs and running them on the same jenkins machine works (I don't get that exception). Running
Test build #53874 has finished for PR 11796 at commit
Test build #53871 has finished for PR 11796 at commit
Test build #53875 has finished for PR 11796 at commit
Test build #53948 has finished for PR 11796 at commit
Test build #53949 has finished for PR 11796 at commit
yay, but I'm still worried about the previous failures. will test on maven now.
retest this please
the maven build seems stuck... :-/ I'll revert to sbt and add some patches since I want to try to hit the previous failures.
@@ -201,24 +201,29 @@ class FileAppenderSuite extends SparkFunSuite with BeforeAndAfter with Logging {
// Make sure only logging errors
What's the deal with these logging changes? Got a short summary?
For some reason these tests failed in one of my runs. And when they failed, they left the log level at "ERROR", which basically means all tests that ran after them didn't write any logs.
Took a quick pass and have a few clarifying questions. This looks like it's in good shape overall, though.
Test build #54652 has finished for PR 11796 at commit
@JoshRosen if you want to take a look at the above failure, it seems related to running the build with jdk8. retest this please |
Hmm, that's unfortunate. Let me flip the switch to move the PR builder back to Java 7...
Jenkins, retest this please.
Test build #54656 has finished for PR 11796 at commit
Test build #54657 has finished for PR 11796 at commit
Hi @JoshRosen, any more comments?
Test build #54865 has finished for PR 11796 at commit
if [ -f "${SPARK_HOME}/RELEASE" ]; then
-  SPARK_JARS_DIR="${SPARK_HOME}/lib"
+  SPARK_JARS_DIR="${SPARK_HOME}/jars"
else
   SPARK_JARS_DIR="${SPARK_HOME}/assembly/target/scala-$SPARK_SCALA_VERSION"
I think you also have to append /jars here?
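Applying the reviewer's suggestion, the dev-tree branch would also point at a jars subdirectory. A runnable sketch of that logic; the temp-directory layout and Scala version here are fabricated for illustration, not taken from the actual script:

```shell
# Sketch of the jar-directory selection with "/jars" appended to the dev
# branch, per the review comment above. SPARK_HOME here is a temp dir
# standing in for a real dev checkout (no RELEASE file present).
SPARK_HOME="$(mktemp -d)"
SPARK_SCALA_VERSION="2.11"
mkdir -p "${SPARK_HOME}/assembly/target/scala-${SPARK_SCALA_VERSION}/jars"

if [ -f "${SPARK_HOME}/RELEASE" ]; then
  # Release layout: jars live directly under $SPARK_HOME/jars.
  SPARK_JARS_DIR="${SPARK_HOME}/jars"
else
  # Dev build: jars are copied under assembly/target/scala-<version>/jars.
  SPARK_JARS_DIR="${SPARK_HOME}/assembly/target/scala-${SPARK_SCALA_VERSION}/jars"
fi
echo "${SPARK_JARS_DIR}"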
LGTM pending Jenkins. I'll merge this after tests pass and after I wrap up a final local test. I'd like to get these core changes in now in order to unblock other build patches and so that they get more widespread community testing. If necessary, we can address small issues in followup patches.
Test build #54892 has finished for PR 11796 at commit
I also manually ran
How to build Spark after this patch? Just run
@zsxwing, try
Yeah it should be "package" now. I'll send a message to the dev list.
yeah,
This change modifies the "assembly/" module to just copy needed dependencies to its build directory, and modifies the packaging script to pick those up (and to remove duplicate jars packaged in the examples module).

I also made some minor adjustments to dependencies to remove some test jars from the final packaging, and to remove jars that conflict with each other when packaged separately (e.g. the servlet API).

Also note that this change restores Guava in applications' classpaths, even though it's still shaded inside Spark. This is now needed for the Hadoop libraries that are packaged with Spark, which are no longer processed by the shade plugin.
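The "copy needed dependencies to its build directory" step described above can be sketched roughly as below. This is a fabricated stand-in, not the actual Spark packaging script: the directory names, jar names, and the skip-if-present dedup are all illustrative.

```shell
# Fabricated sketch: gather dependency jars into a single jars/ directory,
# skipping any jar already present there (a crude stand-in for removing
# the duplicate jars packaged in the examples module).
deps_dir="$(mktemp -d)"
dist_dir="$(mktemp -d)"
touch "${deps_dir}/guava-14.0.1.jar" "${deps_dir}/servlet-api-2.5.jar"
mkdir -p "${dist_dir}/jars"
touch "${dist_dir}/jars/servlet-api-2.5.jar"   # pretend this was packaged already

for jar in "${deps_dir}"/*.jar; do
  name="$(basename "${jar}")"
  if [ ! -e "${dist_dir}/jars/${name}" ]; then
    cp "${jar}" "${dist_dir}/jars/"
  fi
done
ls "${dist_dir}/jars"
```

The key property is that each jar name ends up in the distribution exactly once, regardless of how many modules pulled it in.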