
[SPARK-23564][SQL] Add isNotNull check for left anti and outer joins #20717

Closed
wants to merge 4 commits

Conversation

mgaido91
Contributor

@mgaido91 mgaido91 commented Mar 2, 2018

What changes were proposed in this pull request?

In order to optimize queries, some conditions can be added to the join condition for LEFT ANTI and OUTER joins. Unfortunately, so far this has not been done, since we use only constraints that can be enforced on the output of the operator (in this case, the JOIN).

However, we can enforce some isNotNull conditions on one side of the join, even though they are not valid on the output of the Join. This PR adds these conditions in the Optimizer phase, in order to improve performance in some cases.
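To see why this is safe, here is a minimal sketch in plain Python (not Catalyst code; all names and data are hypothetical) of the left anti join case: a null join key on the right side can never match any left row under SQL's three-valued logic, so pre-filtering it out does not change the result.

```python
# Illustrative sketch: for a left anti join, filtering out null join keys
# on the right side leaves the result unchanged, so the optimizer may
# push an IsNotNull condition to the right child.

def left_anti_join(left, right, key):
    # Keep left rows with no match on the right; a None key never
    # equals anything, mirroring SQL NULL comparison semantics.
    right_keys = {r[key] for r in right if r[key] is not None}
    return [l for l in left if l[key] is None or l[key] not in right_keys]

left = [{"a": 1}, {"a": 2}, {"a": None}]
right = [{"a": 1}, {"a": None}]

filtered_right = [r for r in right if r["a"] is not None]
assert left_anti_join(left, right, "a") == left_anti_join(left, filtered_right, "a")
```

The same reasoning applies to the discarded side of outer joins: rows whose join keys are null can never produce a match.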

How was this patch tested?

Added UTs

Please review http://spark.apache.org/contributing.html before opening a pull request.

@SparkQA

SparkQA commented Mar 2, 2018

Test build #87890 has finished for PR 20717 at commit 45fbb85.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • trait NotNullConstraintHelper
  • trait QueryPlanConstraints extends NotNullConstraintHelper

@mgaido91
Contributor Author

mgaido91 commented Mar 8, 2018


/**
 * Returns additional constraints which are not enforced on the result of join operations, but
 * which can be enforced either on the left or the right side
 */
Contributor


why not put this in Join.validConstraints? LogicalPlan.constraints should only contain constraints for the plan output, but LogicalPlan.allConstraints can contain more.

Contributor Author


I haven't put it there, because constraints is created from allConstraints, so adding them to validConstraints could have caused them to be part of constraints too.

Contributor


ah, I see the problem. For left-anti join, although Join.output reuses the attributes from the left child's output, they are actually different attributes; e.g., Join may output null values, so we can't generate these constraints in Join.validConstraints.

I think we can override both allConstraints and constraints, to make sure these extra constraints appear in allConstraints, but not constraints.
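The distinction the reviewer draws can be sketched in plain Python (hypothetical names, not Catalyst code): even when a constraint like IsNotNull(right.a) holds on the join's input, an outer join can still emit nulls for that attribute, so the constraint is not valid on the output.

```python
# Sketch: right.a is never null on input, yet the left outer join output
# contains a null for the right side, so the input constraint is lost.

def left_outer_join(left, right, key):
    out = []
    for l in left:
        matches = [r for r in right if r[key] is not None and r[key] == l[key]]
        if matches:
            out.extend({"l": l[key], "r": r[key]} for r in matches)
        else:
            # Unmatched left row: the right side is padded with null.
            out.append({"l": l[key], "r": None})
    return out

left = [{"a": 1}, {"a": 2}]
right = [{"a": 1}]          # right.a is never null on the input

result = left_outer_join(left, right, "a")
# The output row for a=2 carries r=None: IsNotNull no longer holds.
```

This is why the extra constraints belong in allConstraints (usable for inference) but must be excluded from constraints (guarantees about the output).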

@SparkQA

SparkQA commented Mar 10, 2018

Test build #88148 has finished for PR 20717 at commit d8a1190.

  • This patch fails Scala style tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • trait QueryPlanConstraints

@SparkQA

SparkQA commented Mar 10, 2018

Test build #88149 has finished for PR 20717 at commit 9e2d993.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@mgaido91
Contributor Author

any more comments @cloud-fan ?

@@ -91,7 +97,7 @@ trait QueryPlanConstraints { self: LogicalPlan =>
* Recursively explores the expressions which are null intolerant and returns all attributes
* in these expressions.
*/
-  private def scanNullIntolerantAttribute(expr: Expression): Seq[Attribute] = expr match {
+  protected def scanNullIntolerantAttribute(expr: Expression): Seq[Attribute] = expr match {
Member

@dongjoon-hyun dongjoon-hyun Mar 22, 2018


Let's keep this private, because it is used only in this class.

  override lazy val constraints: ExpressionSet = ExpressionSet(
    super.constructAllConstraints.filter { c =>
      c.references.nonEmpty && c.references.subsetOf(outputSet) && c.deterministic
    })
Member

@dongjoon-hyun dongjoon-hyun Mar 22, 2018


Could you add more test cases (or statements) for this code path?

Contributor Author


thanks, I added some statements to the ConstraintPropagationSuite.

val correctAnswer = left.join(right, RightOuter, condition).analyze
val optimized = Optimize.execute(originalQuery)
comparePlans(optimized, correctAnswer)
}
Member

@dongjoon-hyun dongjoon-hyun Mar 22, 2018


Since this is a simple repetition of the previous test("SPARK-23405: left-semi equal-join should filter out null join keys on both sides"), what about making a helper test function and simplifying these together at this time? Something like the following:

  private def testConstraints(
      x: LogicalPlan, y: LogicalPlan, left: LogicalPlan, right: LogicalPlan, joinType: JoinType) = {
    val condition = Some("x.a".attr === "y.a".attr)
    val originalQuery = x.join(y, joinType, condition).analyze
    val correctAnswer = left.join(right, joinType, condition).analyze
    val optimized = Optimize.execute(originalQuery)
    comparePlans(optimized, correctAnswer)
  }

  test("SPARK-23405: left-semi equal-join should filter out null join keys on both sides") {
    val x = testRelation.subquery('x)
    val y = testRelation.subquery('y)
    testConstraints(x, y, x.where(IsNotNull('a)), y.where(IsNotNull('a)), LeftSemi)
  }

  test("SPARK-23564: left anti join should filter out null join keys on right side") {
    val x = testRelation.subquery('x)
    val y = testRelation.subquery('y)
    testConstraints(x, y, x, y.where(IsNotNull('a)), LeftAnti)
  }

  test("SPARK-23564: left outer join should filter out null join keys on right side") {
    val x = testRelation.subquery('x)
    val y = testRelation.subquery('y)
    testConstraints(x, y, x, y.where(IsNotNull('a)), LeftOuter)
  }

  test("SPARK-23564: right outer join should filter out null join keys on left side") {
    val x = testRelation.subquery('x)
    val y = testRelation.subquery('y)
    testConstraints(x, y, x.where(IsNotNull('a)), y, RightOuter)
  }

@SparkQA

SparkQA commented Mar 23, 2018

Test build #88542 has finished for PR 20717 at commit 5cadd86.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@mgaido91
Contributor Author

mgaido91 commented Apr 2, 2018

any more comments @cloud-fan @dongjoon-hyun ?

@mgaido91
Contributor Author

kindly ping @cloud-fan @dongjoon-hyun

ghost pushed a commit to dbtsai/spark that referenced this pull request Apr 23, 2018
…'s children

## What changes were proposed in this pull request?

The existing query constraints framework has 2 steps:
1. propagate constraints bottom up.
2. use constraints to infer additional filters for better data pruning.

For step 2, it mostly helps with Join, because we can connect the constraints from children to the join condition and infer powerful filters to prune the data of the join sides. e.g., the left side has constraints `a = 1`, the join condition is `left.a = right.a`, then we can infer `right.a = 1` to the right side and prune the right side a lot.

However, the current logic of inferring filters from constraints for Join is pretty weak: it infers the filters from the Join's constraints, but some joins, like left semi/anti, exclude the right side's output, so the right-side constraints are lost there.

This PR proposes to check the left and right constraints individually, expand the constraints with the join condition, and add filters directly to the children of the join, instead of adding them to the join condition.
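The inference step described above can be sketched in plain Python (hypothetical names, not the actual Catalyst implementation): a constant constraint on one side is carried across an equi-join condition to produce a new filter for the other side.

```python
# Sketch of constraint-based filter inference: given a child constraint
# like left.a = 1 and the equi-join condition left.a = right.a, derive
# the filter right.a = 1 for the other child.

def infer_filters(constraints, equi_conditions):
    # constraints: {attribute: constant}
    # equi_conditions: list of (left_attr, right_attr) equality pairs
    inferred = {}
    for lhs, rhs in equi_conditions:
        if lhs in constraints:
            inferred[rhs] = constraints[lhs]
        if rhs in constraints:
            inferred[lhs] = constraints[rhs]
    return inferred

left_constraints = {"left.a": 1}
assert infer_filters(left_constraints, [("left.a", "right.a")]) == {"right.a": 1}
```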

This reverts apache#20670 , covers apache#20717 and apache#20816

This is inspired by the original PRs, and the tests are all from these PRs. Thanks to the authors mgaido91, maryannxue, KaiXinXiaoLei!

## How was this patch tested?

new tests

Author: Wenchen Fan <wenchen@databricks.com>

Closes apache#21083 from cloud-fan/join.
@mgaido91 mgaido91 closed this Apr 23, 2018
@mgaido91 mgaido91 deleted the SPARK-23564 branch April 23, 2018 12:24