Spark 1.2 #294
Conversation
Don't worry, they are just slow. |
(update: I got it...) from the spark source https://github.com/apache/spark/commits/branch-1.2? It doesn't seem like SPARK-4923 is ported to Spark branch-1.2 yet apache/spark@d05c9ee |
I couldn't get maven to pick up the artifacts published by sbt. Is there a way to do that? |
What i did is
I think spark 1.2.0 is supposed to skip publishing the spark-repl artifact to the public maven repository until SPARK-4923 is resolved, but it does still seem to deploy spark-repl to the local m2 repository. |
Thanks, that works after a clean. Now I'm getting this build error; any idea? [INFO] --- frontend-maven-plugin:0.0.16:install-node-and-npm (install node and npm) @ zeppelin-web --- |
@felixcheung , About the build error on Zeppelin-web, I am working on a fix ( PR #296 ). I guess it will be ready really soon. |
I merged with your branch in that PR and now I'm seeing this: [INFO] bower angular#1.3.8 cached git://github.com/angular/bower-angular.git#1.3.8 [ERROR] Failed to execute goal com.github.eirslett:frontend-maven-plugin:0.0.20:bower (bower install) on project zeppelin-web: Failed to run task: 'bower --allow-root install' failed. (error code 1) -> [Help 1] |
I'm building with Really trying to get that to work :) |
@felixcheung Did you delete bower_components and node_modules folders before compiling? |
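The clean step suggested above might look like this (paths are assumptions based on a typical zeppelin-web module layout):

```shell
# Remove cached frontend dependencies, a common cause of bower/grunt
# build failures (directory names assume the standard zeppelin-web layout):
rm -rf zeppelin-web/bower_components zeppelin-web/node_modules

# Then rebuild from the repository root:
mvn clean package -DskipTests
```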
|
Well just to be sure
|
It works! thanks for jumping in to help [INFO] Zeppelin ........................................... SUCCESS [ 4.857 s] |
Glad to see that finally worked! |
When will this PR be merged to |
I think after spark-repl 1.3.0 (or at least 1.2.1) is released? On 01/29/2015 07:11 PM, Jongyoul Lee wrote:
|
I agree that it's a little difficult to build. How about making spark 1.2 an option? The default setting would be 1.1, with 1.2 just an option. The master branch changes fast, but because I'm using spark 1.2, I cannot use the new features. |
This might be a separate topic, but I couldn't get the Spark 1.2 profile to work with MASTER=yarn-client. From the exception below it seems the class cannot be loaded, but this class should be in spark-core. The spark jars are in zeppelin/interpreter/spark and I couldn't get it to load from anywhere else (I've set SPARK_HOME to /usr/lib/spark). Any idea? ------ Create new SparkContext yarn-client ------- |
@felixcheung You'd better put /path/to/hadoop/conf/dir on the classpath: CLASSPATH=/path/to/hadoop/conf ./zeppelin-daemon.sh start |
@jongyoul thanks, but shouldn't the classpath point to the lib path instead of conf? |
If you set CLASSPATH before you start Zeppelin, it's included via java.lib.path. I don't think this is the best way to include the hadoop classpath, but we don't have any other way to do it. I hope we can set HADOOP_CONF_DIR as well. |
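A sketch of the workaround described above (the conf path is an assumption; substitute your cluster's Hadoop configuration directory):

```shell
# Put the Hadoop configuration directory (core-site.xml, yarn-site.xml, ...)
# on the classpath before launching Zeppelin. The launcher script passes
# CLASSPATH through to the JVM. The path below is an example:
CLASSPATH=/etc/hadoop/conf ./bin/zeppelin-daemon.sh start
```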
Thanks, but unfortunately it didn't work: $ CLASSPATH=/etc/hadoop/conf ./zeppelin-0.5.0-SNAPSHOT/bin/zeppelin-daemon.sh start Exception in thread "Thread-18" java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.deploy.SparkHadoopUtil$ |
@felixcheung Yes, you're right. I've tested with those arguments, and the same error occurs. I've added the spark-yarn package, but then I hit another error. I'll try something else. |
I think it's good to merge this PR, while keeping spark 1.1 as the default and spark 1.2 as an option. |
An easier solution is to follow these instructions (and / or use the precompiled binary) https://gist.github.com/kmader/3394153de1154cb18475 |
Still not working. I am doing [ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.2.1:exec (grunt-build-to-src-main-webapp) on project zeppelin-web: Command execution failed. Process exited with an error: 6 (Exit value: 6) -> [Help 1] |
Spark repl 1.2.1 is now available in maven central, so I guess everyone should be able to build Zeppelin against spark 1.2.X |
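With spark-repl 1.2.1 published, a build could depend on it directly; a sketch of the pom dependency (the `_2.10` suffix matches the Scala version Spark 1.2.x artifacts were published for):

```xml
<!-- Sketch: the spark-repl artifact published to Maven Central for
     Spark 1.2.x; verify the Scala suffix against your build. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-repl_2.10</artifactId>
  <version>1.2.1</version>
</dependency>
```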
@ayoub-benali OMG, thanks for sharing. |
Good! late but still good :) On 02/09/2015 11:01 AM, Kevin (Sangwoo) Kim wrote:
|
They don't have it for 1.2.0. The plan was to back-port it; I guess that hasn't been completed yet. I'm on 1.2.0 and can't move to 1.2.1 easily. --- Original Message --- From: "Ayoub Benali" notifications@github.com Spark repl 1.2.1 is now available in maven central so I guess everyone should be able to build Zeppelin against spark 1.2.X Reply to this email directly or view it on GitHub: |
Yeah, I was able to build zeppelin-master with spark 1.2, but I'm getting an exception while creating the sql context. I updated the pom to get the appropriate netty jar (4.0.17.Final). Exception in thread "Thread-41" java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.<init>(ZIIIIIII)V Can someone please help me here? Regards, |
@felixcheung spark 1.2.1 is just bug fixes, like other minor releases, so there should be no breaking changes. Plus I am not sure that they will publish the repl for spark 1.2.0 |
I think you have a mismatched version of Netty. I removed the version element for io.netty in the pom file under spark (i.e. the Spark interpreter) and that worked for me. Also, I had to add spark-catalyst to the dependencies. Hope this helps. --- Original Message --- From: "gtinside" notifications@github.com Yeah, I was able to build zeppelin-master with spark 1.2, but I'm getting an exception while creating the sql context. I updated the pom to get the appropriate netty jar (4.0.17.Final) Exception in thread "Thread-41" java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.<init>(ZIIIIIII)V Can someone please help me here? Regards, Reply to this email directly or view it on GitHub: |
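The spark-catalyst addition mentioned above might look roughly like this in the interpreter pom (a sketch; the Scala suffix and use of the `spark.version` property are assumptions matching Spark 1.2.x conventions):

```xml
<!-- Sketch: add the catalyst module that Spark's SQL context needs.
     The _2.10 suffix and ${spark.version} property are assumptions. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-catalyst_2.10</artifactId>
  <version>${spark.version}</version>
</dependency>
```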
Thanks. I'm aware. I have clusters on 1.2.0 and they are out of my control. I was running 1.2 before it was released on my Dev box ;) --- Original Message --- From: "Ayoub Benali" notifications@github.com @felixcheung spark 1.2.1 is just for bug fixes just like the other minor version. So there should be no breaking changes. Plus I am not sure that they will publish the repl for spark 1.2.0 Reply to this email directly or view it on GitHub: |
Guys, as you may know, Spark 1.2.1 (bug fixes, stability) has just been released, and all 1.2.0 users are recommended to upgrade. Go for it! |
I'm trying with 1.2.1 and it works. But if I run it in the eclipse IDE (my development environment), I get an exception with rdd.registerTempTable()
related issue It's fine with |
@@ -100,9 +102,12 @@
     <codahale.metrics.version>3.0.0</codahale.metrics.version>
     <avro.version>1.7.6</avro.version>
     <jets3t.version>0.7.1</jets3t.version>
+    <commons.math3.version>3.2</commons.math3.version>
+    <commons.httpclient.version>4.3.6</commons.httpclient.version>
+    <io.netty.version>4.0.17.Final</io.netty.version>
Please update the netty version to the latest, 4.0.25.Final,
because spark 1.2.0+ uses the PooledByteBufAllocator constructor with 8 parameters, which was introduced in 4.0.18.Final.
Without updating netty we get
java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.<init>(ZIIIIIII)V
with spark.version=1.2.0-cdh5.3.1 when creating a SparkContext
@xhumanoid
Thanks for reporting the problem. Zeppelin enables spark-1.2.0+ support via the maven profile 'spark-1.2'. The 'spark-1.2' profile overrides io.netty.version to '4.0.23.Final' (line 1223). So I think it's okay with the 'spark-1.2' profile.
Since spark-repl-1.2.1 artifact is published, it's ready to be merged. |
Let's merge it. On Wed Feb 11 2015 at 4:48:04 PM Lee moon soo notifications@github.com
|
+1 @Leemoonsoo |
Guys, it seems there is a critical issue with spark 1.2 + Zeppelin. https://github.com/NFLabs/zeppelin/issues/339 |
This PR fixes ZEPPELIN-298 by searching for the correct "addListener" method which has JobProgressListener as a parameter. Author: Lee moon soo <moon@apache.org> Closes ZEPL#294 from Leemoonsoo/ZEPPELIN-298 and squashes the following commits: 6fb2397 [Lee moon soo] Fix job progress listener registration
Trying to make Zeppelin work with Spark 1.2
How to test this branch
Because the spark-repl artifact is not yet released for version 1.2, you need to build and deploy it locally before building Zeppelin.
To do that, clone spark 1.2 source and run
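The local-publish step might be sketched like this (branch name, flags, and the Zeppelin checkout path are assumptions; the exact sbt/mvn invocation differs across Spark versions):

```shell
# Clone the Spark 1.2 maintenance branch (URL from the thread above):
git clone -b branch-1.2 https://github.com/apache/spark.git
cd spark

# Install Spark's artifacts, including spark-repl, into the local
# ~/.m2 repository so Zeppelin's build can resolve them:
mvn -DskipTests clean install

# Then build Zeppelin with the spark-1.2 profile (path is an assumption):
cd ../zeppelin
mvn clean package -Pspark-1.2 -DskipTests
```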
Here's a simple test I did in local mode.