
Use SparkSubmit to initialize JVM (maybe?) #94

Open
exyi opened this issue Jul 18, 2021 · 4 comments

Comments

@exyi (Contributor) commented Jul 18, 2021

PySpark seems to start the JVM using the spark-submit script: https://github.com/apache/spark/blob/master/python/pyspark/java_gateway.py#L63. That has some benefits; I'm specifically looking for an easy way to add dependencies via the spark.jars.packages config.

However, I don't know how they call Java methods... I think Spark.jl could call the SparkSubmit.main method using jcall, which should lead to basically the same behavior while keeping the JVM under Julia's control.
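A minimal sketch of what that might look like with JavaCall.jl. SparkSubmit does expose a static main(String[]) entry point, but the classpath, package coordinates, and argument list below are purely illustrative assumptions:

```julia
using JavaCall

# Start the JVM under Julia's control, with Spark's jars on the classpath
# (path is illustrative).
JavaCall.init(["-Djava.class.path=/path/to/spark/jars/*"])

# Import SparkSubmit and invoke its static main(String[]) entry point,
# mimicking what the spark-submit shell script does.
SparkSubmit = @jimport org.apache.spark.deploy.SparkSubmit
args = ["--packages", "org.example:some-lib:1.0", "script.jl"]
jcall(SparkSubmit, "main", Nothing, (Vector{JString},), args)
```

Whether SparkSubmit.main behaves sensibly when the JVM was started by JavaCall rather than by the shell script is exactly the open question of this issue.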

Honestly, I'm quite confused about how spark-submit works; maybe I'm just missing something obvious. I thought it could be possible to execute a Julia script using spark-submit after the dependencies are handled, but that also does not work :/

@dfdx (Owner) commented Jul 18, 2021

So are you looking for a way to add custom JARs? If so, we have add_jar function for SparkContext, and there should be a similar way to add jars to SparkSession (you can call any Java methods using JavaCall.jcall()).
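For reference, a sketch of the add_jar route mentioned above. add_jar on SparkContext is stated in this thread to exist; the constructor keywords and the JAR path are assumptions:

```julia
using Spark

# Create a local context and register a single extra JAR with it
# (path is illustrative).
sc = SparkContext(master="local")
add_jar(sc, "/path/to/extra.jar")
```

Note that this only ships one JAR; it does not resolve transitive dependencies the way --packages does, which is the gap discussed below.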

@exyi (Contributor, Author) commented Jul 18, 2021

add_jar does not really cut it: the package has many dependencies, and I'd really like Spark/Maven to resolve them for me. I could not find a method similar to addJar that adds packages :/

@dfdx (Owner) commented Jul 18, 2021

You can try something like:

```julia
config = Dict("spark.jars.packages" => "...")
spark = SparkSession(..., config=config)
```

This should be equivalent to setting this config via spark-submit.
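Filled out into a runnable-looking sketch (the Maven coordinates and master URL are placeholder assumptions; the config keyword is taken from the suggestion above):

```julia
using Spark

# Ask Spark/Maven to resolve the package and its transitive dependencies,
# the same way `spark-submit --packages` would.
config = Dict("spark.jars.packages" => "org.example:some-lib:1.0")
spark = SparkSession(master="local", config=config)
```

If this works, it sidesteps spark-submit entirely: the dependency resolution is driven by the Spark config rather than by command-line flags.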

@aviks (Collaborator) commented Aug 6, 2021

I looked at spark-submit a few years ago when I worked on this package, and it seemed too complicated -- I did not really understand how it worked. The way we load the JVM here seemed easier and more appropriate to me.
