
java.lang.IllegalStateException: impossible to get artifacts when data has not been loaded. IvyNode = io.netty#netty;3.6.2.Final #139

Closed
borisclemencon opened this issue Sep 15, 2016 · 8 comments

@borisclemencon (Contributor)

Hi,

I get an exception when running `sbt compile` with the following build definition:

import com.typesafe.sbt.SbtGit.git

// git versioning
git.baseVersion := "0.0.0"

git.useGitDescribe := true

lazy val root = (project in file("."))
  .enablePlugins(GitVersioning)
  .settings(
    name := "myproject",
    scalaVersion := "2.11.8",
    dependencyOverrides ++= Set(
      "org.scala-lang" % "scala-compiler" % "2.11.8",
      "jline" % "jline" % "2.12.1"
    ),
    libraryDependencies ++= Seq(
      "joda-time" % "joda-time" % "2.9.1" % "compile",
      "com.amazonaws" % "aws-java-sdk-core" % "1.11.33",
      "org.scalaj" %% "scalaj-http" % "2.3.0" % Compile,
      "net.liftweb" %% "lift-json" % "2.6.2" % Compile,
      "com.amazon.redshift" % "jdbc42.Driver" % "1.1.17.1017" from "https://s3.amazonaws.com/redshift-downloads/drivers/RedshiftJDBC42-1.1.17.1017.jar",
      "org.apache.spark" %% "spark-core" % "2.0.0" % Compile,
      "org.apache.spark" %% "spark-sql" % "2.0.0" % Compile,
      "org.apache.spark" %% "spark-hive" % "2.0.0" % Compile,
      "org.scalatest" %% "scalatest" % "3.0.0" % Test,
      "com.holdenkarau" %% "spark-testing-base" % "2.0.0_0.4.5" % Test
    ),
    javaOptions ++= Seq("-Xms512M", "-Xmx2048M", "-XX:MaxPermSize=2048M", "-XX:+CMSClassUnloadingEnabled"),
    parallelExecution in Test := false
  )

The exception raised is

impossible to get artifacts when data has not been loaded. IvyNode = io.netty#netty;3.6.2.Final
java.lang.IllegalStateException: impossible to get artifacts when data has not been loaded. IvyNode = io.netty#netty;3.6.2.Final
        at org.apache.ivy.core.resolve.IvyNode.getArtifacts(IvyNode.java:809)
        at org.apache.ivy.core.resolve.IvyNode.getSelectedArtifacts(IvyNode.java:786)
        at org.apache.ivy.core.report.ResolveReport.setDependencies(ResolveReport.java:235)
        at org.apache.ivy.core.resolve.ResolveEngine.resolve(ResolveEngine.java:235)
        at org.apache.ivy.Ivy.resolve(Ivy.java:517)
        at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:279)
        at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:188)
        at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:165)
        at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:155)
        at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:155)
        at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:132)
        at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:57)
        at sbt.IvySbt$$anon$4.call(Ivy.scala:65)
        at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93)
        at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:78)
        at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:97)
        at xsbt.boot.Using$.withResource(Using.scala:10)
        at xsbt.boot.Using$.apply(Using.scala:9)
        at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58)
        at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48)
        at xsbt.boot.Locks$.apply0(Locks.scala:31)
        at xsbt.boot.Locks$.apply(Locks.scala:28)
        at sbt.IvySbt.withDefaultLogger(Ivy.scala:65)
        at sbt.IvySbt.withIvy(Ivy.scala:127)
        at sbt.IvySbt.withIvy(Ivy.scala:124)
        at sbt.IvySbt$Module.withModule(Ivy.scala:155)
        at sbt.IvyActions$.updateEither(IvyActions.scala:165)
        at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1369)
        at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1365)
        at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$87.apply(Defaults.scala:1399)
        at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$87.apply(Defaults.scala:1397)
        at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:37)
        at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1402)
        at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1396)
        at sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:60)
        at sbt.Classpaths$.cachedUpdate(Defaults.scala:1419)
        at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1348)
        at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1310)
        at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
        at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
        at sbt.std.Transform$$anon$4.work(System.scala:63)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
        at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
        at sbt.Execute.work(Execute.scala:235)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
        at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
        at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
        at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
@jaksmid commented Oct 4, 2016

Hello, I had a similar issue in one of my projects. I resolved it by adding the dependency manually to the build.sbt file:

libraryDependencies += "io.netty" % "netty" % "3.6.2.Final"

Hope it helps!
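An alternative sketch (not from this thread, and assuming sbt 0.13.x syntax like the build above): instead of adding netty to the classpath explicitly, you can pin its revision via `dependencyOverrides` so Ivy resolves a single consistent version rather than tripping over an evicted node:

```scala
// build.sbt (sbt 0.13.x) — hypothetical alternative: force a single
// netty revision across the whole dependency graph.
dependencyOverrides ++= Set(
  "io.netty" % "netty" % "3.6.2.Final"
)
```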

@cdalzell

I began receiving this error when I started using sbt-spark-package. Removing Spark libraries from libraryDependencies and letting sbt-spark-package manage them via sparkComponents solved the issue.
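For reference, a minimal sketch of that setup with sbt-spark-package (plugin coordinates and versions here are illustrative, not taken from this thread):

```scala
// project/plugins.sbt — add the plugin (check spark-packages.org for
// the current coordinates and version):
// addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")

// build.sbt — let the plugin manage the Spark artifacts instead of
// listing spark-core/spark-sql/spark-hive in libraryDependencies.
sparkVersion := "2.0.0"
sparkComponents ++= Seq("sql", "hive")
```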

malcolmgreaves added a commit to malcolmgreaves/data-tc that referenced this issue Dec 13, 2016
@holdenk (Owner) commented Jan 4, 2017

I believe this issue is resolved by the comments above; if it's not, feel free to re-open.

@holdenk holdenk closed this as completed Jan 4, 2017
@nkumardemlai

I am using the dependencies below, but I am still getting this issue:

lazy val versions = new {
  val logging = "3.7.2"
  val spark = "2.2.0"
  val config = "1.3.1"
  val logback = "1.1.7"
  val mockito = "1.9.5"
  val scalatest = "3.0.4"
  val scalacheck = "1.13.4"
  val specs2 = "2.4.17"
  val aws = "1.11.170"
  val nscala = "2.16.0"
  val xml = "0.4.1"
}

libraryDependencies ++= Seq(
  // provided is best for sbt assembly but fails in IDE
  // "com.amazonaws" % "aws-java-sdk-s3" % versions.aws % "provided",
  // "org.apache.spark" %% "spark-core" % versions.spark % "provided",
  // "org.apache.spark" %% "spark-sql" % versions.spark % "provided",
  // "org.apache.spark" %% "spark-streaming" % versions.spark % "provided",

  "com.amazonaws" % "aws-java-sdk-s3" % versions.aws,
  "org.apache.spark" %% "spark-core" % versions.spark,
  "org.apache.spark" %% "spark-sql" % versions.spark,
  "org.apache.spark" %% "spark-streaming" % versions.spark,

  // will trigger SLF4J over LogBack
  "ch.qos.logback" % "logback-classic" % versions.logback,
  "com.databricks" %% "spark-xml" % versions.xml,
  "com.github.nscala-time" %% "nscala-time" % versions.nscala,
  "com.typesafe" % "config" % versions.config,
  "com.typesafe.scala-logging" %% "scala-logging" % versions.logging,

  "org.mockito" % "mockito-core" % versions.mockito % "test",
  "org.scalacheck" %% "scalacheck" % versions.scalacheck % "test",
  "org.scalatest" %% "scalatest" % versions.scalatest % "test",
  "org.specs2" %% "specs2-mock" % versions.specs2 % "test",
  "com.holdenkarau" % "spark-testing-base_2.11" % "2.2.0_0.7.4" % "test",
  "org.apache.spark" % "spark-hive_2.10" % "2.2.0" % "test",
  "io.netty" % "netty" % "3.6.2.Final"
)

I suspect this issue started coming up after adding spark-hive_2.10.
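One thing that stands out in the build above (a guess, not confirmed in the thread): `spark-hive_2.10` and `spark-testing-base_2.11` hard-code different Scala binary versions, so two incompatible artifact lines can end up in the resolution graph. A sketch of the fix is to use `%%` everywhere and let sbt append the project's Scala binary version:

```scala
// Sketch: prefer %% so sbt appends the project's Scala binary version
// (_2.11 here), avoiding a mix of _2.10 and _2.11 artifacts.
libraryDependencies ++= Seq(
  "com.holdenkarau" %% "spark-testing-base" % "2.2.0_0.7.4" % "test",
  "org.apache.spark" %% "spark-hive" % versions.spark % "test"
)
```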

@holdenk (Owner) commented Oct 25, 2017

Can you paste the stack trace you're seeing?

@nkumardemlai

[info] Resolving jline#jline;2.14.3 ...
[error] impossible to get artifacts when data has not been loaded. IvyNode = commons-logging#commons-logging;1.2
java.lang.IllegalStateException: impossible to get artifacts when data has not been loaded. IvyNode = commons-logging#commons-logging;1.2
at org.apache.ivy.core.resolve.IvyNode.getArtifacts(IvyNode.java:809)
at org.apache.ivy.core.resolve.IvyNode.getSelectedArtifacts(IvyNode.java:786)
at org.apache.ivy.core.report.ResolveReport.setDependencies(ResolveReport.java:235)
at org.apache.ivy.core.resolve.ResolveEngine.resolve(ResolveEngine.java:235)
at org.apache.ivy.Ivy.resolve(Ivy.java:517)
at sbt.IvyActions$.sbt$IvyActions$$resolve(IvyActions.scala:282)
at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:191)
at sbt.IvyActions$$anonfun$updateEither$1.apply(IvyActions.scala:168)
at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:155)
at sbt.IvySbt$Module$$anonfun$withModule$1.apply(Ivy.scala:155)
at sbt.IvySbt$$anonfun$withIvy$1.apply(Ivy.scala:132)
at sbt.IvySbt.sbt$IvySbt$$action$1(Ivy.scala:57)
at sbt.IvySbt$$anon$4.call(Ivy.scala:65)
at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93)
at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:78)
at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:97)
at xsbt.boot.Using$.withResource(Using.scala:10)
at xsbt.boot.Using$.apply(Using.scala:9)
at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58)
at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48)
at xsbt.boot.Locks$.apply0(Locks.scala:31)
at xsbt.boot.Locks$.apply(Locks.scala:28)
at sbt.IvySbt.withDefaultLogger(Ivy.scala:65)
at sbt.IvySbt.withIvy(Ivy.scala:127)
at sbt.IvySbt.withIvy(Ivy.scala:124)
at sbt.IvySbt$Module.withModule(Ivy.scala:155)
at sbt.IvyActions$.updateEither(IvyActions.scala:168)
at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1392)
at sbt.Classpaths$$anonfun$sbt$Classpaths$$work$1$1.apply(Defaults.scala:1388)
at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$90.apply(Defaults.scala:1422)
at sbt.Classpaths$$anonfun$doWork$1$1$$anonfun$90.apply(Defaults.scala:1420)
at sbt.Tracked$$anonfun$lastOutput$1.apply(Tracked.scala:37)
at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1425)
at sbt.Classpaths$$anonfun$doWork$1$1.apply(Defaults.scala:1419)
at sbt.Tracked$$anonfun$inputChanged$1.apply(Tracked.scala:60)
at sbt.Classpaths$.cachedUpdate(Defaults.scala:1442)
at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1371)
at sbt.Classpaths$$anonfun$updateTask$1.apply(Defaults.scala:1325)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
at sbt.std.Transform$$anon$4.work(System.scala:63)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
at sbt.Execute.work(Execute.scala:235)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
[error] (*:update) java.lang.IllegalStateException: impossible to get artifacts when data has not been loaded. IvyNode = commons-logging#commons-logging;1.2
[error] Total time: 191 s, completed Oct 25, 2017 11:52:20 AM

@nkumardemlai

If I remove/disable spark-hive_2.10 from the dependencies, I see the stack trace below:

[error] Could not run test test: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf$ConfVars
[error] Could not run test SampleDataFrameTest: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf$ConfVars
[info] TestContext:
[info] TestContext *** ABORTED ***
[info] java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf$ConfVars
[info] at com.holdenkarau.spark.testing.DataFrameSuiteBaseLike$class.newBuilder$1(DataFrameSuiteBase.scala:80)
[info] at com.holdenkarau.spark.testing.DataFrameSuiteBaseLike$class.sqlBeforeAllTestCases(DataFrameSuiteBase.scala:110)
[info] at TestContext.com$holdenkarau$spark$testing$DataFrameSuiteBase$$super$sqlBeforeAllTestCases(TestContext.scala:4)
[info] at com.holdenkarau.spark.testing.DataFrameSuiteBase$class.beforeAll(DataFrameSuiteBase.scala:42)
[info] at TestContext.beforeAll(TestContext.scala:4)
[info] at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
[info] at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
[info] at TestContext.run(TestContext.scala:4)
[info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
[info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:480)
[info] ...
[trace] Stack trace suppressed: run last test:test for the full output.
[error] Could not run test TestContext: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf$ConfVars
[info] TestJsonDF:
[info] TestJsonDF *** ABORTED ***
[info] java.lang.NullPointerException:
[info] at TestJsonDF.sparkSession$lzycompute(TestJsonDF.scala:13)
[info] at TestJsonDF.sparkSession(TestJsonDF.scala:11)
[info] at TestJsonDF.(TestJsonDF.scala:19)
[info] at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
[info] at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
[info] at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
[info] at java.lang.reflect.Constructor.newInstance(Unknown Source)
[info] at java.lang.Class.newInstance(Unknown Source)
[info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:435)
[info] at sbt.TestRunner.runTest$1(TestFramework.scala:76)

@holdenk (Owner) commented Dec 20, 2017

What happens if you bump your sbt version? (See sbt/sbt#2015 (comment).)
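For anyone following along, bumping sbt within the 0.13.x line is a one-line change in the project (the version number here is illustrative, not a specific recommendation from this thread):

```
# project/build.properties — pick a newer sbt release
sbt.version=0.13.16
```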
