
"Modules were resolved with conflicting cross-version suffixes" #336

Open
froocpu opened this issue Jan 19, 2021 · 9 comments
Open

"Modules were resolved with conflicting cross-version suffixes" #336

froocpu opened this issue Jan 19, 2021 · 9 comments

Comments

@froocpu
Copy link

froocpu commented Jan 19, 2021

I get the following error when I try to import dependencies with this in my build.sbt:

scalaVersion := "2.12.12"
val sparkVersion = "3.0.1"
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
libraryDependencies += "com.amazon.deequ" % "deequ" % "1.1.0_spark-3.0-scala-2.12"

(update) Conflicting cross-version suffixes in: org.apache.spark:spark-launcher, com.chuusai:shapeless, org.apache.spark:spark-sketch, org.apache.spark:spark-kvstore, org.json4s:json4s-ast, org.apache.spark:spark-catalyst, org.apache.spark:spark-network-shuffle, com.twitter:chill, org.apache.spark:spark-sql, org.scala-lang.modules:scala-xml, org.json4s:json4s-jackson, org.typelevel:macro-compat, com.fasterxml.jackson.module:jackson-module-scala, org.scalanlp:breeze-macros, org.json4s:json4s-core, org.apache.spark:spark-unsafe, org.typelevel:machinist, org.json4s:json4s-scalap, org.scala-lang.modules:scala-parser-combinators, org.scalanlp:breeze, org.apache.spark:spark-tags, org.apache.spark:spark-core, org.apache.spark:spark-network-common

However, it compiles properly when I use the following configuration:

scalaVersion := "2.11.8"
val sparkVersion = "2.4.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
libraryDependencies += "com.amazon.deequ" % "deequ" % "1.1.0_spark-3.0-scala-2.12"

I'm building with sbt 1.2.8, and I've tried importing these dependencies with other versions of sbt, too. The behaviour persists whether I use the IDE's sbt plugin or run sbt directly from the terminal.

I've also tried invalidating caches and restarting the IDE.

IntelliJ Community 2019.2.4
macOS Mojave 10.14.6

@hedibejaoui

@froocpu I was able to solve this error by excluding some of Deequ's transitive dependencies that are cross-built for Scala 2.11:

libraryDependencies += ("com.amazon.deequ" % "deequ" % "1.1.0_spark-3.0-scala-2.12")
        .exclude("org.scalanlp", "breeze_2.11")
        .exclude("com.chuusai", "shapeless_2.11")
        .exclude("org.apache.spark", "spark-core_2.11")
        .exclude("org.apache.spark", "spark-sql_2.11")

@leopasta

Although that compiled fine, our tests that rely on RelativeRateOfChangeStrategy fail because the breeze classes are missing at runtime:

[info]   java.lang.NoClassDefFoundError: breeze/linalg/DenseVector$
[info]   at com.amazon.deequ.anomalydetection.BaseChangeStrategy.detect(BaseChangeStrategy.scala:90)
[info]   at com.amazon.deequ.anomalydetection.BaseChangeStrategy.detect$(BaseChangeStrategy.scala:80)
[info]   at com.amazon.deequ.anomalydetection.RelativeRateOfChangeStrategy.detect(RelativeRateOfChangeStrategy.scala:36)
[info]   at com.amazon.deequ.anomalydetection.AnomalyDetector.detectAnomaliesInHistory(AnomalyDetector.scala:98)
[info]   at com.amazon.deequ.anomalydetection.AnomalyDetector.isNewPointAnomalous(AnomalyDetector.scala:60)
[info]   at com.amazon.deequ.checks.Check$.isNewestPointNonAnomalous(Check.scala:1126)
[info]   at com.amazon.deequ.checks.Check.$anonfun$isNewestPointNonAnomalous$1(Check.scala:433)
[info]   at scala.runtime.java8.JFunction1$mcZD$sp.apply(JFunction1$mcZD$sp.java:23)
[info]   at com.amazon.deequ.constraints.AnalysisBasedConstraint.runAssertion(AnalysisBasedConstraint.scala:108)
[info]   at com.amazon.deequ.constraints.AnalysisBasedConstraint.pickValueAndAssert(AnalysisBasedConstraint.scala:74)
[

@hedibejaoui

@leopasta Try adding the breeze library explicitly as a test dependency.
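
For example, something along these lines — the breeze version here is an assumption (1.0 is what Spark 3.0.x ships with); check your dependency tree before pinning it:

libraryDependencies += "org.scalanlp" %% "breeze" % "1.0" % Test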


nathan-bennett commented Feb 3, 2021

I ran into a similar issue when using Gradle. I just excluded the Spark dependencies when pulling in Deequ:

    compile(group: 'com.amazon.deequ', name: 'deequ', version: '1.1.0_spark-3.0-scala-2.12'){
        exclude group: 'org.apache.spark', module: 'spark-core_2.11'
        exclude group: 'org.apache.spark', module: 'spark-sql_2.11'
    }


piotrm0 commented Feb 25, 2021

On a related note, does anyone know why the 1.1.0 release, which even has "scala-2.12" in its name, depends on Spark built for Scala 2.11 on Maven, as shown here: https://mvnrepository.com/artifact/com.amazon.deequ/deequ/1.1.0_spark-3.0-scala-2.12?
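
You can confirm which suffixes each module resolves with from sbt itself — assuming sbt 1.4+, where the dependencyTree task is built in (older versions need the sbt-dependency-graph plugin):

sbt dependencyTree    // inspect the output for artifacts ending in _2.11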

@lange-labs (Contributor)

Closing due to inactivity - please reopen if issues remain with the latest version.

@apython1998

This is still an issue for me as well. I'm unable to compile with Deequ 1.1 in my Spark 3.0 / Scala 2.12 app.

@fernanluyano

Same issue today. The exclusions suggested above are not working for me. Please reopen.

lange-labs reopened this Aug 5, 2021

Mehdi-Bendriss commented Aug 5, 2021

In my case (Scala 2.12.13 and Spark 3.1.1), I did not need to exclude anything; I simply upgraded the Deequ dependency to the latest version built for Scala 2.12:

scalaVersion := "2.12.13"

....

"com.amazon.deequ" % "deequ" % "1.2.2-spark-3.0",
