
SPARK-6182 [BUILD] spark-parent pom needs to be published for both 2.10 and 2.11 #4913

Closed
wants to merge 1 commit

Conversation

srowen
Member

@srowen srowen commented Mar 5, 2015

Option 2 of 2: Express all module dependencies with hard-coded _2.10 or _2.11, not scala.binary.version
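For context, the change described above would look roughly like this in a child module's pom.xml (a hypothetical sketch of the approach, not the actual diff from this PR):

```xml
<!-- Before: the Scala suffix is resolved from a property at build time -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_${scala.binary.version}</artifactId>
  <version>${project.version}</version>
</dependency>

<!-- After (option 2): the suffix is hard-coded per Scala version -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>${project.version}</version>
</dependency>
```

The motivation is that a published POM is a static file: `${scala.binary.version}` is not re-resolved by consumers, so a hard-coded suffix is what actually lands in the artifact metadata.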

@SparkQA

SparkQA commented Mar 5, 2015

Test build #28294 has started for PR 4913 at commit c1483ff.

  • This patch merges cleanly.

@AmplabJenkins

Test FAILed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/28294/

@shaneknapp
Contributor

I will retrigger this test once we're done w/ our maintenance.

@shaneknapp
Contributor

jenkins, test this please

@SparkQA

SparkQA commented Mar 5, 2015

Test build #28296 has started for PR 4913 at commit c1483ff.

  • This patch merges cleanly.

@vanzin
Contributor

vanzin commented Mar 5, 2015

With this change, is scala.binary.version even needed anymore?

Also, how does it work in the end? It may work, but I see some weirdness here. For example, you'll still be publishing a single spark-parent pom, right? That pom still declares some dependencies with the now-hardcoded _2.10 version, such as com.twitter:chill. If no one uses those directly, things should work, but if someone ends up using one of them and wants Scala 2.11, they'll probably get the wrong version, no?

@srowen
Member Author

srowen commented Mar 5, 2015

scala.binary.version is then only used to form paths like target/scala-${scala.binary.version} in the build.

Actually, you're right that I overlooked that the parent POM still has to change in this version too for reasons like the chill dep. Yeah we'd still have to publish two parent POMs.

I can update this if people still prefer this approach, though now I'm not sure it actually avoids the parent POM divergence.
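A minimal sketch of the remaining use of the property described above (hypothetical, not taken from the PR): scala.binary.version would no longer appear in any artifactId, surviving only in build-path interpolation, e.g.

```xml
<properties>
  <!-- Only used to form build output paths, not dependency coordinates -->
  <scala.binary.version>2.10</scala.binary.version>
</properties>
<build>
  <outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
</build>
```

Unlike a dependency's artifactId in a published POM, these paths are resolved locally at build time, so keeping the property here is harmless.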

@SparkQA

SparkQA commented Mar 5, 2015

Test build #28296 has finished for PR 4913 at commit c1483ff.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/28296/

@pwendell
Copy link
Contributor

pwendell commented Mar 5, 2015

I don't think this works in its current form because the artifact names are laid out in the parent pom and those can still only have a single version. The only way I think we could preserve having a single parent pom is to simply not declare any scala dependencies in the parent pom and only declare them inline in each child pom, which is generally bad form. So given that #4912 is simpler, I'm inclined to just pull that one in.

@srowen srowen closed this Mar 5, 2015
@srowen srowen deleted the SPARK-6182.2 branch March 5, 2015 22:17