@andrewor14
Contributor

Problem. For bin/pyspark, there is currently no way to specify Spark configuration properties other than through SPARK_JAVA_OPTS in conf/spark-env.sh. However, this mechanism is supposedly deprecated. Instead, bin/pyspark should pick up configurations explicitly specified in conf/spark-defaults.conf.

Solution. Have bin/pyspark invoke bin/spark-submit, like all of its counterparts in Scala land (i.e. bin/spark-shell, bin/run-example). This has the additional benefit of making the invocation of all the user-facing Spark scripts consistent.


Details. bin/pyspark inherently handles two cases: (1) running python applications and (2) running the python shell. For (1), Spark submit already offers a code path to run python applications. When bin/pyspark is given a python file, we can simply pass the file directly to Spark submit. This is the simple case:

  • bin/pyspark passes the python file to Spark submit
  • Spark submit passes the python file to PythonAppRunner
  • PythonAppRunner sets up the Py4j GatewayServer on the Java side
  • PythonAppRunner runs the python file as a sub-process
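The hand-off in case (1) can be sketched as follows. `build_submit_args` is a hypothetical helper (not actual Spark code) showing how bin/pyspark would forward a python file and its arguments to Spark submit:

```python
def build_submit_args(app_file, app_args):
    # Hypothetical helper: forward the python file and its arguments
    # straight to Spark submit, which then hands the file to
    # PythonAppRunner on the Java side.
    return ["bin/spark-submit", app_file] + list(app_args)

# bin/pyspark would then exec this command line.
print(build_submit_args("my_app.py", ["--input", "data.txt"]))
```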

Case (2) is more involved. We cannot simply run the shell as another application, and use the existing code path in Spark submit as in (1). This is because the keyboard signals will not be propagated to the python interpreter properly, and dealing with each signal individually is cumbersome and likely not comprehensive. Thus, this PR takes the approach of making Python the parent process instead. This allows all keyboard signals to be propagated to the python REPL first, and then to the JVM:

  • bin/pyspark calls python/pyspark/repl.py
  • repl.py calls Spark submit as a sub-process
  • Spark submit calls PythonShellRunner
  • PythonShellRunner sets up the Py4j GatewayServer on the Java side
  • repl.py learns the Py4j gateway server port from PythonShellRunner through sockets
  • repl.py creates a SparkContext using this gateway server
  • repl.py starts a REPL with this SparkContext
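A minimal sketch of the port handshake in the steps above. The 4-byte big-endian framing and the helper name are assumptions for illustration, not the actual wire format, and a thread stands in for the PythonShellRunner side:

```python
import socket
import struct
import threading

def read_gateway_port(server_sock):
    # repl.py side: accept a connection from the JVM and read back the
    # port on which PythonShellRunner started the Py4j GatewayServer.
    conn, _ = server_sock.accept()
    data = b""
    while len(data) < 4:
        chunk = conn.recv(4 - len(data))
        if not chunk:
            raise EOFError("JVM closed the socket early")
        data += chunk
    conn.close()
    return struct.unpack(">i", data)[0]

server = socket.socket()
server.bind(("127.0.0.1", 0))  # ephemeral port for the handshake
server.listen(1)
addr = server.getsockname()

def fake_jvm_side():
    # Stand-in for PythonShellRunner reporting its gateway port.
    c = socket.socket()
    c.connect(addr)
    c.sendall(struct.pack(">i", 25333))  # 25333 is Py4j's default port
    c.close()

threading.Thread(target=fake_jvm_side).start()
gateway_port = read_gateway_port(server)
print(gateway_port)  # repl.py would now connect a gateway to this port
```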

TODO. Currently, the IPython case works only for the embedded shell, not for notebooks. We should make it work for all cases. We also need to update bin/pyspark.cmd so Windows doesn't get left behind.

Comments and feedback are most welcome.

The bin/pyspark script follows one of two paths, depending on the application.

If the application is a python file, the script passes the python file
directly to Spark submit, which launches the python application as a
sub-process of the JVM.

If the application is the pyspark shell, the script invokes a special
python script that invokes Spark submit as a sub-process. The main
benefit here is that Python is now the parent process (rather than
Scala), such that all keyboard signals are propagated to the python
interpreter properly.
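A toy illustration (not Spark code) of why the parent-process ordering matters: with Python as the parent, a SIGINT handler installed in the Python process runs first and can decide what to do about the JVM child, instead of the interrupt killing the whole pipeline.

```python
import signal

interrupts = []

def on_sigint(signum, frame):
    # With Python as the parent, Ctrl-C lands here first; the real shell
    # could then, e.g., cancel running jobs instead of dying.
    interrupts.append(signum)

signal.signal(signal.SIGINT, on_sigint)
signal.raise_signal(signal.SIGINT)  # simulate Ctrl-C (Python 3.8+)
print(len(interrupts))  # the handler ran instead of KeyboardInterrupt
```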

This divergence of code path here means Spark submit needs to launch
two different kinds of python runners (in Scala). Currently, Spark
submit invokes the PythonRunner, which creates python sub-processes
to run python applications. However, this is not applicable to the
shell, because the parent process is already the python process that
runs the REPL. This is why PythonRunner is split into PythonAppRunner
(for launching applications) and PythonShellRunner (for launching
the pyspark shell).
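The dispatch described above can be sketched as follows. This is a Python sketch of logic that actually lives in Scala, and the special resource name marking the shell is an assumed convention for this illustration:

```python
def choose_runner(primary_resource):
    # Sketch of Spark submit's dispatch between the two runners; the
    # "pyspark-shell" marker for the shell case is an assumption here.
    if primary_resource == "pyspark-shell":
        return "PythonShellRunner"  # JVM is a child of the python REPL
    return "PythonAppRunner"        # JVM spawns the python application

print(choose_runner("pyspark-shell"))
print(choose_runner("my_app.py"))
```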

The new bin/pyspark has been tested locally to run both the REPL and
python applications successfully through Spark submit. A big TODO at
this point is to make sure the IPython case is not affected.
@AmplabJenkins

Merged build triggered.

@AmplabJenkins

Merged build started.

@AmplabJenkins

Merged build finished.

@AmplabJenkins

Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/15019/

@andrewor14
Contributor Author

Git exception. Jenkins, test this please.

@AmplabJenkins

Merged build triggered.

@AmplabJenkins

Merged build started.

@AmplabJenkins

Merged build finished.

@AmplabJenkins

Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/15023/

Previously, if bin/pyspark received an argument, it unconditionally
interpreted it as a python file. This is not correct.

As of this commit, all uses of bin/pyspark go through Spark submit
and pass the arguments on correctly.
@AmplabJenkins

Merged build triggered.

@AmplabJenkins

Merged build started.

@AmplabJenkins

Merged build finished.

@AmplabJenkins

Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/15028/

@AmplabJenkins

Merged build triggered.

@AmplabJenkins

Merged build started.

@andrewor14 andrewor14 changed the title [WIP] [SPARK-1808] Route bin/pyspark through Spark submit [SPARK-1808] Route bin/pyspark through Spark submit May 15, 2014
This does not apply to running a python application with bin/pyspark,
for instance.
@AmplabJenkins

Merged build triggered.

@AmplabJenkins

Merged build started.

@AmplabJenkins

Merged build finished.

@AmplabJenkins

Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/15029/

@andrewor14 andrewor14 changed the title [SPARK-1808] Route bin/pyspark through Spark submit [WIP] [SPARK-1808] Route bin/pyspark through Spark submit May 15, 2014
@AmplabJenkins

Merged build finished.

@AmplabJenkins

Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/15030/

@andrewor14
Contributor Author

Making big changes; re-opening in a bit.

@andrewor14 andrewor14 closed this May 15, 2014
@andrewor14 andrewor14 deleted the pyspark-submit branch May 15, 2014 23:25