
Confusing compilation errors in a project matrix #75

Open
colindean opened this issue Sep 19, 2022 · 4 comments
I aim to build a Spark 2.x app for Scalas 2.11, 2.12, and 2.13 with a set of dependencies for each on top of the base dependencies.

My configuration allows the projects for 2.11 and 2.12 to build and test correctly (e.g. sbt root2_11/test), while the 2.13 one still has some errors (I'm working on those). However, there seems to be another project that's also building, and it can't find any dependencies, so it errors when I run sbt test as CI does.

I think what's happening is that the root project still tries to compile, when I only want the projects declared in the matrix jvmPlatform rows to be active. I need to disable this root project somehow, but I can't find a way to do that.

I'd welcome some pointers in the right direction. I'm so close to getting this cross-version build to work!
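One pattern that might address the stray root project (a sketch, not confirmed in this thread; the matrix name `core` and the Scala version strings are placeholders) is to keep the matrix in a subdirectory and make the root a plain aggregating project with publishing skipped:

```scala
// Hypothetical sketch: `core` stands in for the real projectMatrix.
lazy val core = (projectMatrix in file("core"))
  .jvmPlatform(scalaVersions = Seq("2.11.12", "2.12.16", "2.13.8"))

// The root is an ordinary project that only aggregates the generated
// matrix projects (core2_11, core2_12, core2_13); it compiles nothing
// of its own and is skipped during publishing.
lazy val root = (project in file("."))
  .aggregate(core.projectRefs: _*)
  .settings(publish / skip := true)
```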

colindean commented Sep 19, 2022

You can see a build begin to fail here. Note the inclusion of scopt in the commonSettings declared earlier in the build.sbt.

colindean (Author) commented
I'll also note that this setup has absolutely wrecked the IntelliJ project. IntelliJ appears to be importing the primary project that is erroring. This is also the first time I've used project matrix with IntelliJ so pointers there are welcome, too.

eed3si9n (Member) commented

Your build has 3 jvmPlatform(...) rows. I don't think you can do that, since each row needs a unique virtual axis (which can represent JVM, JS, a custom axis for libraries, etc.). The Scala versions are the columns, so to speak.
Please try consolidating your rows into one and doing the usual pattern matching on scalaBinaryVersion within it, and see if that fixes your issues.
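That suggestion might look roughly like this (a sketch using the spark-sql versions mentioned elsewhere in the thread; `scala211`, `scala212`, and `scala213` are assumed to be defined earlier in the build):

```scala
lazy val root = (projectMatrix in file("."))
  .jvmPlatform(
    scalaVersions = Seq(scala211, scala212, scala213),
    settings = Seq(
      // A single row; `.value` is legal here because it sits inside the
      // `++=` macro, which sbt expands per generated project.
      libraryDependencies ++= (scalaBinaryVersion.value match {
        case "2.11" => Seq("org.apache.spark" %% "spark-sql" % "2.3.4" % Provided)
        case "2.12" => Seq("org.apache.spark" %% "spark-sql" % "2.4.8" % Provided)
        case _      => Seq("org.apache.spark" %% "spark-sql" % "3.2.1" % Provided)
      })
    )
  )
```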

colindean (Author) commented

Playing with this in a very small window of time just now…

I wanted to do something like this, but scalaBinaryVersion.value isn't available where I've tried to use it:

```scala
  .jvmPlatform(
    scalaVersions = Seq(scala211, scala212, scala213),
    settings = scalaBinaryVersion.value match {
      case "2.11" =>
        Seq(
          circeVersion := "0.11.2",
          circeYamlVersion := "0.10.1",
          libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.4" % Provided,
          (Compile / runMain) := Defaults.runMainTask(Compile / fullClasspath, Compile / run / runner).evaluated,
          generateTestData := { (Compile / runMain).toTask(" com.target.data_validator.GenTestData").value }
        )
      case "2.12" =>
        Seq(
          circeVersion := "0.14.2",
          circeYamlVersion := "0.14.1",
          libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.8" % Provided
        )
      case "2.13" =>
        Seq(
          circeVersion := "0.14.2",
          circeYamlVersion := "0.14.1",
          libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.2.1" % Provided
        )
    }
  )
```

I got sbt to run with this, but it has the same problem as my previous solution:

```scala
lazy val root = (projectMatrix in file("."))
  .enablePlugins(BuildInfoPlugin)
  .settings(commonSettings)
  .jvmPlatform(
    scalaVersions = Seq(scala211, scala212, scala213),
    settings = Seq(
      circeVersion := (scalaBinaryVersion.value match {
        case "2.11" => "0.11.2"
        case "2.12" | "2.13" => "0.14.2"
      }),
      circeYamlVersion := (scalaBinaryVersion.value match {
        case "2.11" => "0.10.1"
        case "2.12" | "2.13" => "0.14.1"
      }),
      libraryDependencies ++= (scalaBinaryVersion.value match {
        case "2.11" => Seq("org.apache.spark" %% "spark-sql" % "2.3.4" % Provided)
        case "2.12" => Seq("org.apache.spark" %% "spark-sql" % "2.4.8" % Provided)
        case "2.13" => Seq("org.apache.spark" %% "spark-sql" % "3.2.1" % Provided)
      }),
      //(Compile / runMain) := Defaults.runMainTask(Compile / fullClasspath, Compile / run / runner).evaluated,
      //generateTestData := { (Compile / runMain).toTask(" com.target.data_validator.GenTestData").value }
    )
  )
```

I feel like I'm getting warmer…
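The likely reason the earlier `settings = scalaBinaryVersion.value match ...` attempt doesn't compile (an assumption based on how sbt's setting macros work, not stated explicitly in this thread): `.value` can only appear inside a macro such as `:=` or `++=`, and the `settings = ...` argument is evaluated as plain Scala outside any macro. The branching has to happen inside individual settings instead:

```scala
// Outside a macro, this does not compile:
//   settings = scalaBinaryVersion.value match { ... }

// Inside the `:=` macro, `.value` is fine:
circeVersion := (scalaBinaryVersion.value match {
  case "2.11" => "0.11.2"
  case _      => "0.14.2"
})
```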
