[ZEPPELIN-4132]. Spark Interpreter has issue of SPARK-22393 #3353
Conversation
@@ -43,7 +44,7 @@ class SparkScala211Interpreter(override val conf: SparkConf,

   lazy override val LOGGER: Logger = LoggerFactory.getLogger(getClass)

-  private var sparkILoop: ILoop = _
+  private var sparkILoop: SparkILoop = _
Shouldn't we do this in all cases, not just Scala211Interpreter?
Scala210Interpreter already did that
Will merge if there are no more comments.
.travis.yml
Outdated
@@ -109,7 +109,7 @@ matrix:
   - sudo: required
     jdk: "oraclejdk8"
     dist: trusty
-    env: BUILD_PLUGINS="true" PYTHON="3" SCALA_VER="2.11" PROFILE="-Pspark-2.2 -Pscala-2.11 -Phadoop2 -Pintegration" SPARKR="true" BUILD_FLAG="install -DskipTests -DskipRat -am" TEST_FLAG="test -DskipRat -am" MODULES="-pl zeppelin-interpreter-integration,zeppelin-web,spark/spark-dependencies" TEST_PROJECTS="-Dtest=ZeppelinSparkClusterTest22,SparkIntegrationTest22,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
+    env: BUILD_PLUGINS="true" PYTHON="3" SCALA_VER="2.11" PROFILE="-Pspark-2.2 -Pscala-2.10 -Phadoop2 -Pintegration" SPARKR="true" BUILD_FLAG="install -DskipTests -DskipRat -am" TEST_FLAG="test -DskipRat -am" MODULES="-pl zeppelin-interpreter-integration,zeppelin-web,spark/spark-dependencies" TEST_PROJECTS="-Dtest=ZeppelinSparkClusterTest22,SparkIntegrationTest22,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
Maybe we need to change SCALA_VER as well?
What is this PR for?
This PR fixes the issue of SPARK-22393 in Zeppelin. We can fix this by using `SparkIMain` instead of `IMain`.

What type of PR is it?
[Bug Fix]

Todos
* [ ] - Task

What is the Jira issue?
* https://jira.apache.org/jira/browse/ZEPPELIN-4132

How should this be tested?
* Unit test is added

Questions:
* Does the licenses files need update? No
* Is there breaking changes for older versions? No
* Does this needs documentation? No

Author: Jeff Zhang <zjffdu@apache.org>

Closes #3353 from zjffdu/ZEPPELIN-4132 and squashes the following commits:

c94b34a [Jeff Zhang] [ZEPPELIN-4132]. Spark Interpreter has issue of SPARK-22393

(cherry picked from commit 1ca7039)
Signed-off-by: Jeff Zhang <zjffdu@apache.org>
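For context, SPARK-22393 is a Scala REPL wrapping bug where imported types are not visible inside class definitions. A REPL transcript of the symptom looks roughly like the following (illustrative sketch, not taken verbatim from the JIRA; the exact error text and line numbers vary by Spark version):

```scala
// In spark-shell (or a Zeppelin Spark paragraph backed by the plain IMain):
scala> import org.apache.spark.Partition
import org.apache.spark.Partition

scala> class P(p: Partition)
<console>: error: not found: type Partition
```

Spark's own `SparkIMain`/`SparkILoop` carry the fix for this wrapping behavior, which is why this PR switches the Zeppelin interpreter to them instead of the stock Scala `IMain`/`ILoop`.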
I am trying to upgrade Zeppelin to 0.8.2 and having problems with depInterpreter. I am loading jars with depInterpreter and using them as dependencies in a spark paragraph. The paragraph fails because it does not have the jars in its classpath. By setting the "zeppelin.spark.useNew" property to false, the spark paragraph succeeds. Could this be related to this issue resolved in 0.8.2?
@alonshoham The new SparkInterpreter doesn't support the dep interpreter. If you want to add third-party libraries, you need to set the spark properties.
Thanks @zjffdu |
@alonshoham You can use generic configuration instead. http://zeppelin.apache.org/docs/0.8.2/usage/interpreter/overview.html#generic-confinterpreter
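Under the generic configuration interpreter approach linked above, dependencies are declared as Spark properties in a `%spark.conf` paragraph that runs before the Spark interpreter starts. A minimal sketch, assuming a hypothetical local jar path and Maven coordinate:

```
%spark.conf

spark.jars          /path/to/your-library.jar
spark.jars.packages com.example:your-artifact:1.0.0
```

`spark.jars` and `spark.jars.packages` are standard Spark configuration properties; the paths and coordinates here are placeholders. Note that a `%spark.conf` paragraph only takes effect if it is executed before the Spark interpreter process is launched (restart the interpreter first if it is already running).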