SPARK-6182 [BUILD] spark-parent pom needs to be published for both 2.10 and 2.11 #4913
Conversation
…cala.binary.version
Test build #28294 has started for PR 4913 at commit
Test FAILed.
I will retrigger this test once we're done with our maintenance.
jenkins, test this please
Test build #28296 has started for PR 4913 at commit
With this change, how does it work in the end? It may work, but I see some weirdness here. For example, you'll still be publishing a single spark-parent pom, right? That pom still declares some dependencies with the now-hardcoded Scala version suffix.
Actually, you're right; I overlooked that the parent POM still has to change in this version too, for reasons like the one you point out. I can update this if people still opt for this version, though now I'm not so sure it actually avoids the parent POM divergence.
Test build #28296 has finished for PR 4913 at commit
Test PASSed.
I don't think this works in its current form because the artifact names are laid out in the parent pom and those can still only have a single version. The only way I think we could preserve having a single parent pom is to simply not declare any scala dependencies in the parent pom and only declare them inline in each child pom, which is generally bad form. So given that #4912 is simpler, I'm inclined to just pull that one in.
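To illustrate the point above, here is a minimal sketch (hypothetical fragment, not the actual Spark parent POM) of why a parent POM with hard-coded artifact suffixes can only serve one Scala version:

```xml
<!-- Hypothetical sketch, not actual Spark POM content. -->
<!-- A parent POM's <dependencyManagement> pins artifact IDs for all
     child modules. Once the Scala suffix is hard-coded here, a 2.11
     build resolving against this published parent would pick up the
     wrong artifacts, so a second parent POM would have to be published. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <!-- hard-coded _2.10 suffix: unusable by a _2.11 build -->
      <artifactId>spark-core_2.10</artifactId>
      <version>${project.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Moving such declarations inline into every child POM would sidestep this, but as noted above, duplicating dependency declarations across modules is generally bad form.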
Option 2 of 2: Express all module dependencies with hard-coded _2.10 or _2.11, not scala.binary.version
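A hedged side-by-side sketch of the two styles under discussion (hypothetical fragments, not the real Spark POMs): the `scala.binary.version` property keeps one declaration for both builds, while option 2 hard-codes the suffix per module:

```xml
<!-- Hypothetical illustration only. -->

<!-- Style 1: property-based. One declaration covers both Scala builds,
     but the value substituted at publish time is fixed in the released pom. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_${scala.binary.version}</artifactId>
  <version>${project.version}</version>
</dependency>

<!-- Style 2 (this option): hard-code the suffix in each module's pom,
     avoiding property interpolation entirely. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>${project.version}</version>
</dependency>
```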