Scope all RDD methods
This commit provides a mechanism to set and unset the call scope
around each RDD operation defined in RDD.scala. This is useful
for tagging an RDD with the scope in which it is created. This
will be extended to similar methods in SparkContext.scala and
other relevant files in a future commit.
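The set/unset mechanism the message describes can be sketched as a helper that records the current scope, tags a local property for the duration of an operation, and restores the previous value afterward. This is an illustrative sketch only, not Spark's actual implementation: a plain `Properties` object stands in for SparkContext's local properties, and `withScope` is a hypothetical helper, though `RDD_SCOPE_KEY` mirrors the constant added in this commit.

```scala
import java.util.Properties

object ScopeSketch {
  // Mirrors the key added in this commit's diff below.
  val RDD_SCOPE_KEY = "spark.rdd.scope"

  // Stand-in for SparkContext's per-thread local properties.
  private val localProperties = new Properties()

  // Set the scope around `body`, then unset (restore) it in a finally block,
  // so nested calls see the innermost scope and cleanup happens on exceptions.
  def withScope[T](scopeName: String)(body: => T): T = {
    val oldScope = localProperties.getProperty(RDD_SCOPE_KEY)
    localProperties.setProperty(RDD_SCOPE_KEY, scopeName)
    try {
      body  // an RDD created here could read RDD_SCOPE_KEY to tag itself
    } finally {
      if (oldScope == null) localProperties.remove(RDD_SCOPE_KEY)
      else localProperties.setProperty(RDD_SCOPE_KEY, oldScope)
    }
  }
}
```

Under this pattern, an RDD method such as `map` would wrap its body in something like `withScope("map")`, so any RDD constructed inside is tagged with the scope in which it was created.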
Andrew Or committed Apr 17, 2015
1 parent 55f553a commit 6b3403b
Showing 2 changed files with 208 additions and 107 deletions.
core/src/main/scala/org/apache/spark/SparkContext.scala (4 changes: 2 additions & 2 deletions)
@@ -1983,10 +1983,10 @@ object SparkContext extends Logging {
}

  private[spark] val SPARK_JOB_DESCRIPTION = "spark.job.description"
-
  private[spark] val SPARK_JOB_GROUP_ID = "spark.jobGroup.id"
-
  private[spark] val SPARK_JOB_INTERRUPT_ON_CANCEL = "spark.job.interruptOnCancel"
+ private[spark] val RDD_SCOPE_KEY = "spark.rdd.scope"
+ private[spark] val RDD_SCOPE_NO_OVERRIDE_KEY = "spark.rdd.scope.noOverride"

/**
* Executor id for the driver. In earlier versions of Spark, this was `<driver>`, but this was
