[SPARK-31322][SQL] rename QueryPlan.collectInPlanAndSubqueries to collectWithSubqueries

### What changes were proposed in this pull request?

Rename `QueryPlan.collectInPlanAndSubqueries` to `collectWithSubqueries`.

### Why are the changes needed?

The old name is too verbose. `QueryPlan` is an internal API, but it is the core of Catalyst, so we should settle on a clearer name before it is released.

### Does this PR introduce any user-facing change?

no

### How was this patch tested?

N/A

Closes apache#28092 from cloud-fan/rename.

Authored-by: Wenchen Fan <wenchen@databricks.com>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
cloud-fan authored and sjincho committed Apr 14, 2020
1 parent d60fc70 commit 39c1f69
Showing 4 changed files with 5 additions and 5 deletions.
@@ -232,10 +232,10 @@ abstract class QueryPlan[PlanType <: QueryPlan[PlanType]] extends TreeNode[PlanType]
   }
 
   /**
-   * Returns a sequence containing the result of applying a partial function to all elements in this
+   * A variant of `collect`. This method not only apply the given function to all elements in this
    * plan, also considering all the plans in its (nested) subqueries
    */
-  def collectInPlanAndSubqueries[B](f: PartialFunction[PlanType, B]): Seq[B] =
+  def collectWithSubqueries[B](f: PartialFunction[PlanType, B]): Seq[B] =
     (this +: subqueriesAll).flatMap(_.collect(f))
 
   override def innerChildren: Seq[QueryPlan[_]] = subqueries
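As a hedged usage sketch (not part of this commit): the local SparkSession, temp view, and query below are illustrative assumptions; only `collectWithSubqueries` itself comes from the change above. It contrasts `collect`, which walks only the main plan, with the renamed `collectWithSubqueries`, which also walks every (nested) subquery plan, mirroring the test updates further down.

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical driver program; any query whose plan contains a subquery works here.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("collectWithSubqueries-sketch")
  .getOrCreate()
import spark.implicits._

Seq(1, 2, 3).toDF("v").createOrReplaceTempView("t")
val df = spark.sql("SELECT v FROM t WHERE v > (SELECT avg(v) FROM t)")

val plan = df.queryExecution.executedPlan
// `collect` visits only the nodes of the main physical plan.
val nodesInMainPlan = plan.collect { case _ => 1 }.sum
// `collectWithSubqueries` also visits the plans of all (nested) subqueries.
val nodesWithSubqueries = plan.collectWithSubqueries { case _ => 1 }.sum
println(s"$nodesInMainPlan nodes in the main plan, $nodesWithSubqueries including subqueries")
```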
@@ -78,7 +78,7 @@ class QueryPlanSuite extends SparkFunSuite {
 
     val countRelationsInPlan = plan.collect({ case _: UnresolvedRelation => 1 }).sum
     val countRelationsInPlanAndSubqueries =
-      plan.collectInPlanAndSubqueries({ case _: UnresolvedRelation => 1 }).sum
+      plan.collectWithSubqueries({ case _: UnresolvedRelation => 1 }).sum
 
     assert(countRelationsInPlan == 2)
     assert(countRelationsInPlanAndSubqueries == 5)
@@ -87,7 +87,7 @@ object CollectMetricsExec {
    * Recursively collect all collected metrics from a query tree.
    */
   def collect(plan: SparkPlan): Map[String, Row] = {
-    val metrics = plan.collectInPlanAndSubqueries {
+    val metrics = plan.collectWithSubqueries {
       case collector: CollectMetricsExec => collector.name -> collector.collectedMetrics
     }
     metrics.toMap
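For the hunk above, a minimal, hedged sketch of how these collected metrics surface to users: the session, observation name, and aggregates are assumptions for illustration, while `CollectMetricsExec.collect` (shown in the diff) and `Dataset.observe` are existing Spark APIs.

```scala
import org.apache.spark.sql.{Row, SparkSession}
import org.apache.spark.sql.execution.CollectMetricsExec
import org.apache.spark.sql.functions._

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("observe-metrics-sketch")
  .getOrCreate()

// `observe` inserts a CollectMetricsExec node into the physical plan.
val observed = spark.range(10).observe("stats", count(lit(1)).as("rows"), sum(col("id")).as("total"))
observed.collect()  // run the query so the metrics accumulators are populated

// CollectMetricsExec.collect uses collectWithSubqueries, so named metrics are
// gathered from the main plan and from any subquery plans.
val metrics: Map[String, Row] = CollectMetricsExec.collect(observed.queryExecution.executedPlan)
println(metrics.get("stats"))  // e.g. Some([10,45])
```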
@@ -1234,7 +1234,7 @@ abstract class DynamicPartitionPruningSuiteBase
 
     val plan = df.queryExecution.executedPlan
     val countSubqueryBroadcasts =
-      plan.collectInPlanAndSubqueries({ case _: SubqueryBroadcastExec => 1 }).sum
+      plan.collectWithSubqueries({ case _: SubqueryBroadcastExec => 1 }).sum
 
     assert(countSubqueryBroadcasts == 2)
   }
