
[ZEPPELIN-4132]. Spark Interpreter has issue of SPARK-22393 #3353

Closed

Conversation

@zjffdu (Contributor) commented Apr 25, 2019

What is this PR for?

This PR fixes the issue of SPARK-22393 in Zeppelin by using `SparkIMain` instead of `IMain`.
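
For context, SPARK-22393 is a Scala REPL bug where a type brought in by an import cannot be referenced from a class constructor or extends clause; Spark's own interpreter (`SparkIMain`) carries the fix, which is why switching to it helps. A minimal illustration of the symptom, assuming it is run in a %spark paragraph (the import and class are arbitrary examples, not code from this PR):

%spark
// Under the plain Scala REPL's IMain this fails to compile with
// "not found: type Duration" (the SPARK-22393 symptom); with Spark's
// SparkIMain the imported type resolves correctly.
import java.time.Duration

class Holder(d: Duration) {                 // imported type used in a constructor
  def seconds: Long = d.getSeconds
}

println(new Holder(Duration.ofMinutes(2)).seconds)   // prints 120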

What type of PR is it?

[Bug Fix]

Todos

  • - Task

What is the Jira issue?

  • https://jira.apache.org/jira/browse/ZEPPELIN-4132

How should this be tested?

  • Unit test is added

Screenshots (if appropriate)

Questions:

  • Do the license files need an update? No
  • Are there breaking changes for older versions? No
  • Does this need documentation? No

@@ -43,7 +44,7 @@ class SparkScala211Interpreter(override val conf: SparkConf,

lazy override val LOGGER: Logger = LoggerFactory.getLogger(getClass)

- private var sparkILoop: ILoop = _
+ private var sparkILoop: SparkILoop = _
Member commented:

Shouldn't we do this in all cases, not just Scala211Interpreter?

zjffdu (Contributor Author) replied:

Scala210Interpreter already did that

@zjffdu (Contributor Author) commented May 2, 2019

Will merge if there are no more comments.

.travis.yml Outdated
@@ -109,7 +109,7 @@ matrix:
- sudo: required
jdk: "oraclejdk8"
dist: trusty
- env: BUILD_PLUGINS="true" PYTHON="3" SCALA_VER="2.11" PROFILE="-Pspark-2.2 -Pscala-2.11 -Phadoop2 -Pintegration" SPARKR="true" BUILD_FLAG="install -DskipTests -DskipRat -am" TEST_FLAG="test -DskipRat -am" MODULES="-pl zeppelin-interpreter-integration,zeppelin-web,spark/spark-dependencies" TEST_PROJECTS="-Dtest=ZeppelinSparkClusterTest22,SparkIntegrationTest22,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
+ env: BUILD_PLUGINS="true" PYTHON="3" SCALA_VER="2.11" PROFILE="-Pspark-2.2 -Pscala-2.10 -Phadoop2 -Pintegration" SPARKR="true" BUILD_FLAG="install -DskipTests -DskipRat -am" TEST_FLAG="test -DskipRat -am" MODULES="-pl zeppelin-interpreter-integration,zeppelin-web,spark/spark-dependencies" TEST_PROJECTS="-Dtest=ZeppelinSparkClusterTest22,SparkIntegrationTest22,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
Member commented:

Maybe we need to change SCALA_VER as well?

@asfgit closed this in 1ca7039 on May 17, 2019
asfgit pushed a commit that referenced this pull request on May 17, 2019
This PR fixes the issue of SPARK-22393 in Zeppelin by using `SparkIMain` instead of `IMain`.

[Bug Fix]

* [ ] - Task

* https://jira.apache.org/jira/browse/ZEPPELIN-4132

* Unit test is added

* Do the license files need an update? No
* Are there breaking changes for older versions? No
* Does this need documentation? No

Author: Jeff Zhang <zjffdu@apache.org>

Closes #3353 from zjffdu/ZEPPELIN-4132 and squashes the following commits:

c94b34a [Jeff Zhang] [ZEPPELIN-4132]. Spark Interpreter has issue of SPARK-22393

(cherry picked from commit 1ca7039)
Signed-off-by: Jeff Zhang <zjffdu@apache.org>
@alonshoham commented:

I am trying to upgrade Zeppelin to 0.8.2 and am having problems with depInterpreter. I load jars with depInterpreter and use them as dependencies in a Spark paragraph. The paragraph fails because it does not have the jars on its classpath. Setting the "zeppelin.spark.useNew" property to false makes the Spark paragraph succeed. Could this also be related to this issue, resolved in 0.8.2?
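
(For readers unfamiliar with it, the depInterpreter pattern described above looks roughly like this; the coordinate is illustrative, and, per the reply below, it only works with the old interpreter path:)

%dep
// Must run before the Spark context starts; works only with the old
// SparkInterpreter (zeppelin.spark.useNew = false).
z.reset()
z.load("com.example:my-lib:1.0.0")   // illustrative Maven coordinate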

@zjffdu (Contributor Author) commented Oct 30, 2019

@alonshoham The new SparkInterpreter doesn't support the dep interpreter. If you want to add third-party libraries, you need to set the Spark property `spark.jars` or `spark.jars.packages`.

@alonshoham commented Oct 30, 2019

Thanks @zjffdu
Will the new Spark interpreter support depInterpreter in the future? I saw that there is an ignored test with a comment by you about this behavior. My application relies on the functionality of depInterpreter: I create notebooks with code that I compile and load with depInterpreter. What do you suggest I do?

@zjffdu (Contributor Author) commented Oct 30, 2019

@alonshoham You can use the generic configuration interpreter (ConfInterpreter) instead.

http://zeppelin.apache.org/docs/0.8.2/usage/interpreter/overview.html#generic-confinterpreter
e.g.

%spark.conf

spark.jars             your_local_jar
spark.jars.packages    your_needed_packages
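
A hedged end-to-end sketch (the Maven coordinate and imported class are illustrative, not from this thread): the %spark.conf paragraph must run before the first %spark paragraph of the session, since it sets properties used when the interpreter process launches, and the dependency is then on the classpath of later paragraphs.

%spark.conf
spark.jars.packages    org.apache.commons:commons-csv:1.8

%spark
// commons-csv, resolved at interpreter launch via spark.jars.packages,
// is now on the classpath and importable.
import org.apache.commons.csv.CSVFormat
println(CSVFormat.DEFAULT.format("a", "b"))   // prints a,b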
