
SPARK-1187, Added missing Python APIs

The following Python APIs are added:
RDD.id()
SparkContext.setJobGroup()
SparkContext.setLocalProperty()
SparkContext.getLocalProperty()
SparkContext.sparkUser()

This was raised earlier as part of apache/incubator-spark#486.

Author: Prabin Banka <prabin.banka@imaginea.com>

Closes #75 from prabinb/python-api-backup and squashes the following commits:

cc3c6cd [Prabin Banka] Added missing Python APIs
1 parent 3eb009f · commit 3d3acef0474b6dc21f1b470ea96079a491e58b75 · prabinb committed with pwendell on Mar 6, 2014
Showing with 38 additions and 0 deletions.
  1. +31 −0 python/pyspark/context.py
  2. +7 −0 python/pyspark/rdd.py
31 python/pyspark/context.py
@@ -372,6 +372,37 @@ def _getJavaStorageLevel(self, storageLevel):
         return newStorageLevel(storageLevel.useDisk, storageLevel.useMemory,
                                storageLevel.deserialized, storageLevel.replication)
 
+    def setJobGroup(self, groupId, description):
+        """
+        Assigns a group ID to all the jobs started by this thread until the group ID is set to a
+        different value or cleared.
+
+        Often, a unit of execution in an application consists of multiple Spark actions or jobs.
+        Application programmers can use this method to group all those jobs together and give a
+        group description. Once set, the Spark web UI will associate such jobs with this group.
+        """
+        self._jsc.setJobGroup(groupId, description)
+
+    def setLocalProperty(self, key, value):
+        """
+        Set a local property that affects jobs submitted from this thread, such as the
+        Spark fair scheduler pool.
+        """
+        self._jsc.setLocalProperty(key, value)
+
+    def getLocalProperty(self, key):
+        """
+        Get a local property set in this thread, or null if it is missing. See
+        L{setLocalProperty}
+        """
+        return self._jsc.getLocalProperty(key)
+
+    def sparkUser(self):
+        """
+        Get SPARK_USER for user who is running SparkContext.
+        """
+        return self._jsc.sc().sparkUser()
+
 def _test():
     import atexit
     import doctest
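A minimal usage sketch of the new SparkContext methods, assuming a local Spark installation; the group ID, description, and scheduler-pool name below are made-up illustrations, not values from this commit:

from pyspark import SparkContext

sc = SparkContext("local", "job-group-example")

# Tag all jobs started by this thread; the Spark web UI groups them under this ID.
sc.setJobGroup("nightly-etl", "Nightly ETL aggregation")  # hypothetical group ID and description

# Local properties travel with jobs submitted from this thread,
# e.g. selecting a fair-scheduler pool.
sc.setLocalProperty("spark.scheduler.pool", "production")  # hypothetical pool name
print(sc.getLocalProperty("spark.scheduler.pool"))  # -> 'production'

# SPARK_USER (or the OS user) running this SparkContext.
print(sc.sparkUser())

sc.stop()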
7 python/pyspark/rdd.py
@@ -95,6 +95,13 @@ def __init__(self, jrdd, ctx, jrdd_deserializer):
         self.is_checkpointed = False
         self.ctx = ctx
         self._jrdd_deserializer = jrdd_deserializer
+        self._id = jrdd.id()
+
+    def id(self):
+        """
+        A unique ID for this RDD (within its SparkContext).
+        """
+        return self._id
 
     def __repr__(self):
         return self._jrdd.toString()

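A short sketch of the new RDD.id() accessor, assuming a local SparkContext; the printed ID values will vary from run to run:

from pyspark import SparkContext

sc = SparkContext("local", "rdd-id-example")

rdd = sc.parallelize(range(10))
doubled = rdd.map(lambda x: x * 2)

# Each RDD carries a unique ID within its SparkContext;
# a transformation yields a new RDD with a new ID.
print(rdd.id())
print(doubled.id())

sc.stop()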