
Building JAR procedure gives "neo4j-spark-connector not found" #31

Open
gianmariabiancofiore opened this issue Jun 3, 2021 · 2 comments

Comments

gianmariabiancofiore commented Jun 3, 2021

When running the build from "build.sbt" to produce the JAR file, the module "neo4j-spark-connector 2.1.0-M4" cannot be found in the configured repository. The "Spark Packages Repo" at http://dl.bintray.com/spark-packages/maven no longer hosts that library, which results in an unresolved-dependency error. Are there any steps that can be taken to resolve this issue?
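For context, the failure presumably comes from a build.sbt fragment along the following lines. This is a sketch, not the exact file: the resolver URL and dependency coordinates are taken from the error messages in this thread, and the surrounding layout is an assumption.

```scala
// Sketch of the relevant build.sbt fragment (assumed; the real file may differ).
// The Bintray-hosted "Spark Packages Repo" below has been shut down, so sbt
// fails to resolve neo4j-spark-connector from it.
resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"

// Coordinates as they appear in the unresolved-dependency error.
libraryDependencies += "neo4j-contrib" % "neo4j-spark-connector" % "2.1.0-M4"
```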

@jeffblackadar

I replaced the .jar with protobuf-java-3.0.0.jar, but I still get the same error.

```
[warn] Note: Unresolved dependencies path:
[warn]   neo4j-contrib:neo4j-spark-connector:2.1.0-M4 (/project/dstlr/build.sbt#L29-30)
[warn]     +- default:dstlr_2.11:0.1
[error] sbt.librarymanagement.ResolveException: unresolved dependency: neo4j-contrib#neo4j-spark-connector;2.1.0-M4: nod[
```


odjhey commented Dec 29, 2022

I'm not really familiar with the Scala/sbt ecosystem, but it looks like one of the repository URLs specified in the build file is broken. I fixed mine with the patch to build.sbt below:

```diff
 resolvers += "Restlet Repository" at "http://maven.restlet.org"
-resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"
+resolvers += "Spark Packages Repo" at "https://repos.spark-packages.org"
```
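For anyone applying this fix by hand, the resolver section of build.sbt would end up roughly like the sketch below. This assumes no other changes to the file; the Restlet line is kept unchanged from the original, and the dependency coordinates are the ones shown in the error log above.

```scala
// Post-fix resolvers: Bintray was shut down, so "Spark Packages Repo" now
// points at repos.spark-packages.org, which serves the same artifacts.
resolvers += "Restlet Repository" at "http://maven.restlet.org"
resolvers += "Spark Packages Repo" at "https://repos.spark-packages.org"

// The dependency declaration itself stays as-is; only the resolver URL changes.
libraryDependencies += "neo4j-contrib" % "neo4j-spark-connector" % "2.1.0-M4"
```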
