[SPARK-4429][BUILD] Build for Scala 2.11 using sbt fails.#3342
ueshin wants to merge 3 commits into apache:master from
Conversation
Set non-empty value to system property "scala-2.11" if the property exists instead of adding profile.
Test build #23545 has started for PR 3342 at commit
Test build #23545 has finished for PR 3342 at commit
Test PASSed.
Why do we no longer need to turn the scala-2.10 profile on in this case? Will Maven turn it on automatically?
Yes, the profile scala-2.10 is activated by the condition:
<property><name>!scala-2.11</name></property>
so Maven will turn it on if the property scala-2.11 is null or empty.
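For illustration, the behavior described above can be sketched as follows. This is a simplified sketch, not Maven's actual `PropertyProfileActivator` implementation; the object and method names here are made up. The key point, stated in the comment above, is that a negated property condition like `!scala-2.11` activates the profile when the property is null *or* empty.

```scala
// Simplified sketch (NOT Maven's real PropertyProfileActivator code) of how a
// profile guarded by <property><name>!scala-2.11</name></property> is activated:
// a negated property name means "active when the property is absent or empty".
object NegatedPropertySketch {
  def isProfileActive(condition: String, props: Map[String, String]): Boolean =
    if (condition.startsWith("!")) {
      val key = condition.stripPrefix("!")
      props.get(key).forall(_.isEmpty) // absent or empty => profile is active
    } else {
      props.get(condition).exists(_.nonEmpty) // present and non-empty => active
    }

  def main(args: Array[String]): Unit = {
    // The scala-2.10 profile ("!scala-2.11") is active when scala-2.11 is unset...
    println(isProfileActive("!scala-2.11", Map.empty))                   // true
    // ...and, crucially, also when -Dscala-2.11 left it as an empty string:
    println(isProfileActive("!scala-2.11", Map("scala-2.11" -> "")))     // true
    // Only a non-empty value deactivates it:
    println(isProfileActive("!scala-2.11", Map("scala-2.11" -> "true"))) // false
  }
}
```

This is why a bare `-Dscala-2.11` (which sets the property to the empty string) still leaves the `scala-2.10` profile active under this activator.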
Test build #23559 has started for PR 3342 at commit
Test build #23559 has finished for PR 3342 at commit
Test PASSed.
Hey @ueshin one question. If the PropertyProfileActivator is not triggered by blank values, then why does the maven build work when you run
I see - so the properties are re-written earlier on. Good find!
project/SparkBuild.scala
Could you add a comment here and link to the maven code this is replicating? It might be tricky for other developers to understand why this exists.
I didn't test this locally, but if this works, this is a good way of doing it.
Test build #23601 has started for PR 3342 at commit
Test build #23601 has finished for PR 3342 at commit
Test PASSed.
Thanks @ueshin - I will pull this in.
I tried to build for Scala 2.11 using sbt with the following command:

```
$ sbt/sbt -Dscala-2.11 assembly
```

but it ends with the following error messages:

```
[error] (streaming-kafka/*:update) sbt.ResolveException: unresolved dependency: org.apache.kafka#kafka_2.11;0.8.0: not found
[error] (catalyst/*:update) sbt.ResolveException: unresolved dependency: org.scalamacros#quasiquotes_2.11;2.0.1: not found
```

The reason is: if the system property `-Dscala-2.11` (without a value) was set, `SparkBuild.scala` adds the `scala-2.11` profile, but `sbt-pom-reader` activates the `scala-2.10` profile instead of the `scala-2.11` profile, because the activator `PropertyProfileActivator` used by `sbt-pom-reader` internally checks whether the property value is empty or not. If the value is set to a non-empty value, there is no need to add profiles in `SparkBuild.scala`, because `sbt-pom-reader` can handle it as expected.

Author: Takuya UESHIN <ueshin@happy-camper.st>

Closes #3342 from ueshin/issues/SPARK-4429 and squashes the following commits:

14d86e8 [Takuya UESHIN] Add a comment.
4eef52b [Takuya UESHIN] Remove unneeded condition.
ce98d0f [Takuya UESHIN] Set non-empty value to system property "scala-2.11" if the property exists instead of adding profile.

(cherry picked from commit f9adda9)
Signed-off-by: Patrick Wendell <pwendell@gmail.com>
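The fix described in the commit message (give the `scala-2.11` property a non-empty value when it exists but is empty, instead of adding profiles manually) can be sketched roughly like this. This is an illustrative sketch, not the exact `SparkBuild.scala` change; the object name `NormalizeScalaVersionFlag` is made up for this example.

```scala
// Illustrative sketch (NOT the exact SparkBuild.scala change): when sbt is
// started with a bare -Dscala-2.11, the system property exists but is empty,
// and Maven's PropertyProfileActivator treats an empty value as unset.
// Rewriting it to a non-empty value lets sbt-pom-reader activate the
// scala-2.11 profile instead of falling back to scala-2.10.
object NormalizeScalaVersionFlag {
  def normalize(props: Map[String, String]): Map[String, String] =
    props.get("scala-2.11") match {
      case Some("") => props.updated("scala-2.11", "true") // empty -> non-empty
      case _        => props                               // absent or already set
    }
}
```

For example, `NormalizeScalaVersionFlag.normalize(Map("scala-2.11" -> ""))` yields `Map("scala-2.11" -> "true")`, while a map without the key (a build not targeting 2.11) is returned unchanged.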