:: problems summary ::
:::: WARNINGS
	module not found: org.apache.hadoop#hadoop-core;0.20.2-cdh3u4

	==== local-m2-cache: tried
	  file:/root/.m2/repository/org/apache/hadoop/hadoop-core/0.20.2-cdh3u4/hadoop-core-0.20.2-cdh3u4.pom
	  -- artifact org.apache.hadoop#hadoop-core;0.20.2-cdh3u4!hadoop-core.jar:
	  file:/root/.m2/repository/org/apache/hadoop/hadoop-core/0.20.2-cdh3u4/hadoop-core-0.20.2-cdh3u4.jar

	==== local-ivy-cache: tried
	  /root/.ivy2/local/org.apache.hadoop/hadoop-core/0.20.2-cdh3u4/ivys/ivy.xml
	  -- artifact org.apache.hadoop#hadoop-core;0.20.2-cdh3u4!hadoop-core.jar:
	  /root/.ivy2/local/org.apache.hadoop/hadoop-core/0.20.2-cdh3u4/jars/hadoop-core.jar

	==== central: tried
	  https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-core/0.20.2-cdh3u4/hadoop-core-0.20.2-cdh3u4.pom
	  -- artifact org.apache.hadoop#hadoop-core;0.20.2-cdh3u4!hadoop-core.jar:
	  https://repo1.maven.org/maven2/org/apache/hadoop/hadoop-core/0.20.2-cdh3u4/hadoop-core-0.20.2-cdh3u4.jar

	==== spark-packages: tried
	  http://dl.bintray.com/spark-packages/maven/org/apache/hadoop/hadoop-core/0.20.2-cdh3u4/hadoop-core-0.20.2-cdh3u4.pom
	  -- artifact org.apache.hadoop#hadoop-core;0.20.2-cdh3u4!hadoop-core.jar:
	  http://dl.bintray.com/spark-packages/maven/org/apache/hadoop/hadoop-core/0.20.2-cdh3u4/hadoop-core-0.20.2-cdh3u4.jar

	::::::::::::::::::::::::::::::::::::::::::::::
	::          UNRESOLVED DEPENDENCIES         ::
	::::::::::::::::::::::::::::::::::::::::::::::
	:: org.apache.hadoop#hadoop-core;0.20.2-cdh3u4: not found
	::::::::::::::::::::::::::::::::::::::::::::::

:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.hadoop#hadoop-core;0.20.2-cdh3u4: not found]
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1083)
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:296)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:160)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I'm working on a 0.11.0 Docker build, but ran into this. @ianmilligan1 @lintool are you fine with me cutting a 0.11.1 release that resolves the issue?
N.B. At this point I'd prefer to build the Docker image with --packages as opposed to --jars, because it is surfacing a lot of dependency issues that I've feared have remained hidden for a long time.
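For context, the resolution log above shows Ivy checking the local Maven cache, the local Ivy cache, Maven Central, and spark-packages, but none of them host CDH-versioned artifacts like 0.20.2-cdh3u4; those were published to Cloudera's own repository. A possible workaround sketch (the repository URL and the package coordinate below are assumptions for illustration, not taken from this issue) is to add that repository to the resolver chain with --repositories:

```shell
# Sketch only: add Cloudera's Maven repository to spark-submit's resolver
# chain so Ivy can find CDH-suffixed artifacts such as 0.20.2-cdh3u4.
# The repository URL and the groupId:artifactId:version coordinate are
# illustrative assumptions, not confirmed values from this issue.
spark-submit \
  --repositories https://repository.cloudera.com/artifactory/cloudera-repos/ \
  --packages com.example:my-package:0.11.0 \
  --class com.example.Main \
  app.jar
```

Alternatively, a release that no longer depends (transitively) on the CDH-only hadoop-core artifact would avoid the extra repository entirely, which seems to be the direction of the proposed 0.11.1.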