
Spark Enrich: downgrade geoip2 to 2.5.0 #3702

Closed
BenFradet opened this issue Mar 28, 2018 · 7 comments
Assignees

Comments

@BenFradet (Contributor)
Sister ticket to #3701; this time it's Spark, which uses 2.6.7.

@BenFradet (Contributor, Author)

We can't use jackson 2.9.3 because there was no jackson-module-scala release for it.

@BenFradet BenFradet changed the title Spark Enrich: force jackson to 2.9.3 Spark Enrich: force jackson to 2.9.4 Mar 28, 2018
@BenFradet BenFradet changed the title Spark Enrich: force jackson to 2.9.4 Spark Enrich: downgrade geoip2 to 2.5.0 Mar 30, 2018
@BenFradet (Contributor, Author)

  • Shading resulted in conflicts with json4s (java.lang.NoSuchMethodError: org.json4s.jackson.JsonMethods$.fromJsonNode(Lshadedjackson/databind/JsonNode;)Lorg/json4s/JsonAST$JValue;)
  • No shading resulted in conflicts with MaxMind (java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.node.ArrayNode.&lt;init&gt;(Lcom/fasterxml/jackson/databind/node/JsonNodeFactory;Ljava/util/List;)V)
  • Downgrading geoip2 to 2.5.0, which depends on jackson 2.6.7, worked
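For reference, the two approaches above can be sketched in sbt. The shade rule reconstructs the shading attempt (the `shadedjackson` prefix matches the error message); the plugin version and exact rule are assumptions, not necessarily the build used here:

```scala
// project/plugins.sbt (assumed sbt-assembly version):
// addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")

// build.sbt — the shading attempt that led to the json4s conflict:
// rename jackson classes under the shadedjackson prefix seen in the error.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.fasterxml.jackson.**" -> "shadedjackson.@1").inAll
)

// build.sbt — the downgrade that worked instead: geoip2 2.5.0 stays on
// jackson 2.6.x, which lines up with Spark's jackson-databind 2.6.7.
libraryDependencies += "com.maxmind.geoip2" % "geoip2" % "2.5.0"
```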

@weipe007 commented May 3, 2018

Hi @BenFradet @asgergb, as this issue is not fixed yet, is there any workaround?
We were using Spark 2.1.0 and the snowplow maxmind Scala lib 0.2.0 before, but after we updated to 0.4.0 it always complains with the error below, whether I run the unit tests or run in Spark.

Downgrading geoip2 to 2.5.0 or shading the jar does not help here. Could you advise?

Thanks
Martin

18/05/03 03:55:27 WARN KafkaUtils: overriding receive.buffer.bytes to 65536 see KAFKA-3135
18/05/03 03:55:32 WARN TaskSetManager: Lost task 0.0 in stage 6.0 (TID 114, 10.128.96.40, executor 0): java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.node.ArrayNode.&lt;init&gt;(Lcom/fasterxml/jackson/databind/node/JsonNodeFactory;Ljava/util/List;)V
at com.maxmind.db.Decoder.decodeArray(Decoder.java:272)
at com.maxmind.db.Decoder.decodeByType(Decoder.java:156)
at com.maxmind.db.Decoder.decode(Decoder.java:147)
at com.maxmind.db.Decoder.decodeMap(Decoder.java:281)
at com.maxmind.db.Decoder.decodeByType(Decoder.java:154)
at com.maxmind.db.Decoder.decode(Decoder.java:147)
at com.maxmind.db.Decoder.decode(Decoder.java:87)
at com.maxmind.db.Reader.&lt;init&gt;(Reader.java:132)
at com.maxmind.db.Reader.&lt;init&gt;(Reader.java:116)
at com.maxmind.geoip2.DatabaseReader.&lt;init&gt;(DatabaseReader.java:66)

@BenFradet (Contributor, Author)

Hey @weipe007, how are you building Spark Enrich? It depends on geoip2 2.5.0, which in turn depends on jackson-databind 2.6.4. Spark depends on jackson-databind 2.6.7, which is binary compatible with 2.6.4.

@weipe007 commented May 3, 2018

Thanks @BenFradet! Do you have any hard version requirements in snowplow scala-maxmind-iplookups 0.4?

I kept seeing that it needs jackson 2.9.3 when running the unit tests.

My build.sbt (sbt 0.13) looks like this:

==========================================
scalaVersion := "2.11.6"

resolvers += "SnowPlow Repo" at "http://maven.snplow.com/releases/"
libraryDependencies <++= libraryVersions { version => Seq (
"com.typesafe" % "config" % version('typesafe),
"org.scalatest" % "scalatest_2.11" % "2.2.2" % "test",
"com.fasterxml.jackson.core" % "jackson-core" % "2.6.7",
"com.fasterxml.jackson.core" % "jackson-databind" % "2.6.7",
"com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.6.7",
"com.maxmind.geoip2" % "geoip2" % "2.5.0",
"com.snowplowanalytics" %% "scala-maxmind-iplookups" % "0.4.0",
"commons-io" % "commons-io" % "2.5"
)}

@BenFradet (Contributor, Author)

It transitively pulls in 2.9.3, but that can be overridden; that's what we do in spark-enrich and it works fine.

I would advise comparing what gets into your assembly, and where it comes from, by using sbt-dependency-graph.
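A minimal setup for that, assuming the net.virtual-void sbt-dependency-graph plugin (the plugin version is an assumption):

```scala
// project/plugins.sbt
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")
```

With the plugin enabled, `sbt dependencyTree` prints the full resolved tree, and `sbt "whatDependsOn com.fasterxml.jackson.core jackson-databind"` shows which of your dependencies drag jackson in.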

@weipe007 commented May 8, 2018

Thanks @BenFradet! After checking with sbt-dependency-graph, I was able to get it to work. Just adding the lines below to build.sbt was enough for me:

dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % "2.6.7"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.7"
dependencyOverrides += "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.6.7"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-annotations" % "2.6.7"
dependencyOverrides += "com.maxmind.geoip2" % "geoip2" % "2.5.0"
