This repository has been archived by the owner on Dec 20, 2018. It is now read-only.

can't add avro_2.11:4.0.0 in spark-shell #264

Open
seekmayank opened this issue Jan 14, 2018 · 3 comments

Comments

@seekmayank

LM-SJC-11001988:~ mayangupta$ spark-shell --packages com.databricks:spark-avro_2.11:4.0.0
Ivy Default Cache set to: /Users/mayangupta/.ivy2/cache
The jars for the packages stored in: /Users/mayangupta/.ivy2/jars
:: loading settings :: url = jar:file:/Library/spark-2.2.1-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.databricks#spark-avro_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
found com.databricks#spark-avro_2.11;4.0.0 in central
found org.slf4j#slf4j-api;1.7.5 in local-m2-cache
found org.apache.avro#avro;1.7.6 in central
found org.codehaus.jackson#jackson-core-asl;1.9.13 in spark-list
found org.codehaus.jackson#jackson-mapper-asl;1.9.13 in spark-list
found com.thoughtworks.paranamer#paranamer;2.3 in spark-list
found org.xerial.snappy#snappy-java;1.0.5 in spark-list
found org.apache.commons#commons-compress;1.4.1 in spark-list
found org.tukaani#xz;1.0 in spark-list
:: resolution report :: resolve 2016ms :: artifacts dl 14ms
:: modules in use:
com.databricks#spark-avro_2.11;4.0.0 from central in [default]
com.thoughtworks.paranamer#paranamer;2.3 from spark-list in [default]
org.apache.avro#avro;1.7.6 from central in [default]
org.apache.commons#commons-compress;1.4.1 from spark-list in [default]
org.codehaus.jackson#jackson-core-asl;1.9.13 from spark-list in [default]
org.codehaus.jackson#jackson-mapper-asl;1.9.13 from spark-list in [default]
org.slf4j#slf4j-api;1.7.5 from local-m2-cache in [default]
org.tukaani#xz;1.0 from spark-list in [default]
org.xerial.snappy#snappy-java;1.0.5 from spark-list in [default]
:: evicted modules:
org.slf4j#slf4j-api;1.6.4 by [org.slf4j#slf4j-api;1.7.5] in [default]
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 10 | 1 | 1 | 1 || 9 | 0 |
---------------------------------------------------------------------

:: problems summary ::
:::: WARNINGS
[NOT FOUND ] org.slf4j#slf4j-api;1.7.5!slf4j-api.jar (0ms)

==== local-m2-cache: tried

  file:/Users/mayangupta/.m2/repository/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar

	::::::::::::::::::::::::::::::::::::::::::::::

	::              FAILED DOWNLOADS            ::

	:: ^ see resolution messages for details  ^ ::

	::::::::::::::::::::::::::::::::::::::::::::::

	:: org.slf4j#slf4j-api;1.7.5!slf4j-api.jar

	::::::::::::::::::::::::::::::::::::::::::::::

:::: ERRORS
unknown resolver null

:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [download failed: org.slf4j#slf4j-api;1.7.5!slf4j-api.jar]
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1197)
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:304)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
LM-SJC-11001988:~ mayangupta$

@ghost

ghost commented Jan 15, 2018

This looks like a potential caching issue. Try clearing out the local cache and running again; the command works fine for me with Spark 2.2.1.
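If wiping the whole cache feels too blunt, a narrower sketch (assuming the default `~/.ivy2` and `~/.m2` cache locations shown in the log above) is to delete just the slf4j entries that Ivy flagged as `NOT FOUND`, then re-run the resolve. The warning suggests a stale `local-m2-cache` entry where the pom is present but the jar is missing, so removing it should force Ivy to fall back to Central:

```shell
# Remove only the slf4j entries Ivy reported as broken; these are the
# default Ivy and Maven cache paths seen in the log above.
rm -rf "$HOME/.ivy2/cache/org.slf4j"
rm -rf "$HOME/.m2/repository/org/slf4j"
# Then retry the resolve:
#   spark-shell --packages com.databricks:spark-avro_2.11:4.0.0
```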

@oliveratutexas

oliveratutexas commented Mar 29, 2018

I have the same issue after clearing out the local cache (removing .ivy2 and .m2).
Also using Spark 2.2.1.

@SamAchten

SamAchten commented Oct 11, 2018

I'm having the same issue with version 4.0.0; I've also deleted the .ivy2 and .m2 folders.
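Since deleting .ivy2 and .m2 doesn't help here, another sketch is to point Spark at a fresh, throwaway Ivy directory so the stale cache metadata is never consulted. `spark.jars.ivy` is a standard Spark configuration property; the temp directory is just an example path:

```shell
# Build the command against a fresh, empty Ivy directory so resolution
# starts from scratch instead of reusing the corrupted local caches.
FRESH_IVY="$(mktemp -d)"
CMD="spark-shell --conf spark.jars.ivy=${FRESH_IVY} --packages com.databricks:spark-avro_2.11:4.0.0"
echo "$CMD"   # run this in a terminal where spark-shell is on the PATH
```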
