
Sparkling Water on CDH 5.9's Spark 1.6 #127

Closed
instanceofme opened this issue Nov 7, 2016 · 6 comments
@instanceofme

Hi,

On Sparkling Water 1.6.8 with CDH 5.9's Spark 1.6, when creating an H2OContext, I get the following, which I could not reproduce with other versions of CDH or Spark:

java.lang.AbstractMethodError
     at org.apache.spark.Logging$class.log(Logging.scala:50)
     at org.apache.spark.h2o.backends.internal.InternalH2OBackend.log(InternalH2OBackend.scala:31)
     at org.apache.spark.Logging$class.logWarning(Logging.scala:70)
     at org.apache.spark.h2o.backends.internal.InternalH2OBackend.logWarning(InternalH2OBackend.scala:31)
     at org.apache.spark.h2o.backends.SharedBackendUtils$class.checkAndUpdateConf(SharedBackendUtils.scala:54)
     at org.apache.spark.h2o.backends.internal.InternalH2OBackend.checkAndUpdateConf(InternalH2OBackend.scala:40)
     at org.apache.spark.h2o.H2OContext.<init>(H2OContext.scala:83)
     at org.apache.spark.h2o.H2OContext$.getOrCreate(H2OContext.scala:262)
     at org.apache.spark.h2o.H2OContext$.getOrCreate(H2OContext.scala:277)

This reminds me of SW-176, where CDH 5.7's Spark 1.6.0 had an additional method on some class (compared to Spark's official source code).

@jiaqizho

Same problem here.

@jakubhava
Contributor

Sorry for the very delayed response. Thanks for the bug report. We'll have a look at it and let you know 👍

@edgararuiz-zz

Hi @jakubhava, any updates on this bug?

@mdymczyk
Contributor

mdymczyk commented Jan 23, 2017

Hey guys, sorry for the delay - I had a look at this issue and it seems the Spark code shipped in CDH 5.9 changes some internals of org.apache.spark.Logging. Because of this, apps compiled against vanilla Spark 1.6 won't work... I'm working on a fix but don't have an ETA yet, sorry.

@mdymczyk
Contributor

Fix pushed, waiting to be merged into rel-1.6 and released. If anyone wants to test, please clone the MD_cdh5.9.1_fix branch and build (or dist) it.
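For anyone unsure what "clone the branch and build (or dist) it" involves, a minimal sketch, assuming the standard h2oai/sparkling-water GitHub repository and its Gradle wrapper (the branch and dist task names come from the comment above; the exact tasks in your checkout may differ):

```shell
# Sketch: fetch the fix branch and build it locally.
# Assumes the h2oai/sparkling-water repo and its Gradle wrapper.
git clone https://github.com/h2oai/sparkling-water.git
cd sparkling-water
git checkout MD_cdh5.9.1_fix
# Build the project, skipping tests for speed;
# or run `./gradlew dist` to produce the distributable archive.
./gradlew build -x test
```

The resulting assembly jar can then be put on the classpath of your CDH 5.9 Spark 1.6 job in place of the released Sparkling Water 1.6.8 artifacts.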

@jakubhava
Contributor

The fix is in the latest release for Spark 1.6.
