
[ADAM-752] Build for many combos of Spark/Hadoop versions. #765

Merged
merged 3 commits into bigdatagenomics:master on Aug 12, 2015

@fnothaft
Member

fnothaft commented Aug 10, 2015

Resolves build comments on #752. I have set Jenkins up as a 3D!!!!!! matrix:

[screenshot: Jenkins build matrix, 2015-08-10]

@fnothaft

Member
fnothaft commented Aug 10, 2015

OK, so a few interesting things.

  • We can't build for Spark 1.1.1, which I had forgotten (repartitionAndSortWithinPartitions doesn't exist there).
  • It seems like the Spark-based unit tests will only run in parallel on Spark 1.2.0; I'm not sure what is broken in Spark 1.3+. I've told Jenkins to run the builds sequentially for now.
(outdated review comments on scripts/jenkins-test)
@ryan-williams

Member
ryan-williams commented Aug 10, 2015

That looks pretty neat. I'm guessing there are some config changes you're making in Jenkins, not depicted here, that e.g. populate $SPARK_VERSION?
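For anyone reading along, a minimal sketch of what this setup might look like. This is hypothetical (the actual Jenkins job config is not shown in this PR): a Jenkins matrix job defines axes such as SPARK_VERSION and HADOOP_VERSION, exports them as environment variables, and the build script composes the Maven invocation from them.

```shell
#!/usr/bin/env bash
# Hypothetical sketch (not from this PR): a Jenkins matrix job defines
# axes such as SPARK_VERSION and HADOOP_VERSION and exports them as
# environment variables before scripts/jenkins-test runs.

# Compose the Maven invocation from the matrix-provided variables.
build_cmd() {
  echo "mvn package -Dspark.version=${SPARK_VERSION} -Dhadoop.version=${HADOOP_VERSION}"
}

# Defaults so the sketch runs standalone.
SPARK_VERSION="${SPARK_VERSION:-1.4.1}"
HADOOP_VERSION="${HADOOP_VERSION:-2.6.0}"
build_cmd
```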

(outdated review comments on scripts/jenkins-test)
@massie

Member
massie commented Aug 10, 2015

This is good stuff, Frank.

@fnothaft

Member
fnothaft commented Aug 10, 2015

Jenkins, retest this please.

@fnothaft

Member
fnothaft commented Aug 11, 2015

Fun times! Guess what doesn't work?

If your guess was that running:

mvn package -Dspark.version=1.4.1

would lead to Destruction, Terror, and Mayhem, you win $5! At least, it does for me locally.

I will sort this out later, but if anyone has seen this before or has any clues, I would love your thoughts.

@heuermh

Member
heuermh commented Aug 11, 2015

Destruction, Terror, and Mayhem and a 45 minute wait to see if your pull request turns green. :)

Maybe a middle ground, or set up a quick build in Travis and leave Jenkins 3D Awesome for a nightly integration test?

Would a profile work where -Dspark.version is failing?
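For readers unfamiliar with the suggestion: a hedged sketch of the kind of Maven profile this could mean, activated when a particular spark.version property is passed on the command line. This is illustrative only, not from the ADAM pom:

```xml
<!-- Illustrative only: activate version-specific build settings when
     -Dspark.version=1.4.1 is passed on the command line. -->
<profile>
  <id>spark-1.4</id>
  <activation>
    <property>
      <name>spark.version</name>
      <value>1.4.1</value>
    </property>
  </activation>
  <!-- version-specific dependencies or plugin config would go here -->
</profile>
```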

@fnothaft

Member
fnothaft commented Aug 11, 2015

OK, so what seems to have been the problem: we had a completely unused (???) dependency on com.amazonaws:aws-java-sdk, which was bringing in a version of com.fasterxml.jackson.core:jackson-core that was incompatible with Spark 1.3.0 and higher. This produced a red-herring test failure message implying that the unit tests were crashing because we were running multiple SparkContexts in parallel. That unused dependency is gone now, so hopefully the build will pass (and we don't need to run all the builds sequentially, which is good for obvious reasons).
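For reference, if the aws-java-sdk dependency had actually been needed, the usual Maven alternative to deleting it outright is to exclude the conflicting transitive artifact. A sketch only (this is not what the PR does; the PR removes the dependency entirely, and the version element is left as a placeholder):

```xml
<dependency>
  <groupId>com.amazonaws</groupId>
  <artifactId>aws-java-sdk</artifactId>
  <version><!-- version here --></version>
  <exclusions>
    <exclusion>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-core</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```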

@fnothaft

Member
fnothaft commented Aug 11, 2015

And the Spark 1.4.1/Hadoop 2.6.0 touchstone builds have passed! Huzzah! Now, for the rest of the builds.

@fnothaft

Member
fnothaft commented Aug 11, 2015

Rebased and added a commit to clean up log junk when running 1.4.1 unit tests.

@fnothaft

Member
fnothaft commented Aug 11, 2015

Cleaned up RE: the comments above around version checking.

@fnothaft

Member
fnothaft commented Aug 11, 2015

Jenkins, retest this please.

(outdated review comments on pom.xml)
@fnothaft

Member
fnothaft commented Aug 11, 2015

Jenkins, retest this please.

@fnothaft

Member
fnothaft commented Aug 12, 2015

IT WORKS! IT WORKS! IT REALLY DOES!!!!!!!
;)

@heuermh

Member
heuermh commented Aug 12, 2015

Are all the red "Failed - skipped" entries in build 842 supposed to be "Not run"? Looks like they might be combinations that are not supported, such as Hadoop 1 with Spark 1.4.1. Or maybe I just need my 3D glasses?

@fnothaft

Member
fnothaft commented Aug 12, 2015

@heuermh correct; Spark 1.4.1/Hadoop 1.x is skipped, as it won't build with the current way that Spark is packaged as Maven artifacts (there was a long discussion of this on a previous ADAM PR).
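A minimal sketch of how a build script can skip an unsupported combination. This is hypothetical; the real filtering lives in the Jenkins matrix configuration and scripts/jenkins-test, and the version patterns below just mirror the Spark 1.4.x / Hadoop 1.x case discussed above:

```shell
#!/usr/bin/env bash
# Hypothetical sketch (the real filtering lives in the Jenkins matrix
# config / scripts/jenkins-test): skip combinations known not to build.

supported_combo() {
  local spark="$1" hadoop="$2"
  # Per the discussion above, Spark 1.4.x won't build against Hadoop 1.x
  # given how Spark is packaged as Maven artifacts.
  if [[ "$spark" == 1.4.* && "$hadoop" == 1.* ]]; then
    return 1
  fi
  return 0
}

if supported_combo "${SPARK_VERSION:-1.4.1}" "${HADOOP_VERSION:-2.6.0}"; then
  echo "building Spark ${SPARK_VERSION:-1.4.1} / Hadoop ${HADOOP_VERSION:-2.6.0}"
else
  echo "skipping unsupported Spark/Hadoop combination"
fi
```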

@fnothaft

Member
fnothaft commented Aug 12, 2015

They show up as greyed-out red because the last build of that combination failed.

@heuermh

Member
heuermh commented Aug 12, 2015

From what I can see, the build time is now about 15 minutes, up from 10; not too bad. +1

@massie massie merged commit 6cade81 into bigdatagenomics:master Aug 12, 2015

1 check passed (default: Merged build finished)
@massie

Member
massie commented Aug 12, 2015

Nice addition, Frank!

@heuermh heuermh referenced this pull request Aug 19, 2015

Closed

Bump Spark version to 1.4 #752
