
[SPARK-21125][PYTHON] Extend setJobDescription to PySpark and JavaSpark APIs

## What changes were proposed in this pull request?

Extend setJobDescription to PySpark and JavaSpark APIs

SPARK-21125

## How was this patch tested?

Tested by running a local Spark shell and verifying the job description in the UI. I originally added a unit test, but the PySpark context cannot easily access the Scala SparkContext's private job description key, so given the simplicity of this addition I omitted the test.

Also ran the existing tests.

## Misc

This contribution is my original work and I license the work to the project under the project's open source license.

Author: sjarvie <sjarvie@uber.com>

Closes #18332 from sjarvie/add_python_set_job_description.
sjarvie authored and ueshin committed Jun 21, 2017
1 parent 7a00c65 commit ba78514
Showing 2 changed files with 12 additions and 0 deletions.
@@ -757,6 +757,12 @@ class JavaSparkContext(val sc: SparkContext)
*/
def getLocalProperty(key: String): String = sc.getLocalProperty(key)

/**
* Set a human readable description of the current job.
* @since 2.3.0
*/
def setJobDescription(value: String): Unit = sc.setJobDescription(value)

/** Control our logLevel. This overrides any user-defined log settings.
* @param logLevel The desired log level as a string.
* Valid log levels include: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN
6 changes: 6 additions & 0 deletions python/pyspark/context.py
@@ -942,6 +942,12 @@ def getLocalProperty(self, key):
"""
return self._jsc.getLocalProperty(key)

def setJobDescription(self, value):
"""
Set a human readable description of the current job.
"""
self._jsc.setJobDescription(value)

def sparkUser(self):
"""
Get SPARK_USER for user who is running SparkContext.
