
[SPARK-40152][SQL][TESTS] Add tests for SplitPart #37626

Closed
wants to merge 1 commit

Conversation

@wangyum (Member) commented Aug 23, 2022

What changes were proposed in this pull request?

Add tests for SplitPart.

Why are the changes needed?

Improve test coverage.

Does this PR introduce any user-facing change?

No.

How was this patch tested?

N/A.

@github-actions bot added the SQL label Aug 23, 2022
@@ -2522,4 +2522,24 @@ class CollectionExpressionsSuite extends SparkFunSuite with ExpressionEvalHelper
Date.valueOf("2017-02-12")))
}
}

test("SplitPart") {
@wangyum (Member Author) Aug 23, 2022

It seems SplitPart can't be evaluated directly, so we just test its replacement:

    override lazy val replacement: Expression =
      ElementAt(StringSplitSQL(str, delimiter), partNum, Some(Literal.create("", StringType)),
        false)

Evaluating SplitPart directly fails:

    checkEvaluation(SplitPart(Literal.create("11.12.13", StringType),
      Literal.create(".", StringType), Literal(3)), "13")

Caused by: org.apache.spark.SparkUnsupportedOperationException: [INTERNAL_ERROR] Cannot evaluate expression: split_part(11.12.13, ., 3)
	at org.apache.spark.sql.errors.QueryExecutionErrors$.cannotEvaluateExpressionError(QueryExecutionErrors.scala:71)
	at org.apache.spark.sql.catalyst.expressions.RuntimeReplaceable.eval(Expression.scala:369)
	at org.apache.spark.sql.catalyst.expressions.RuntimeReplaceable.eval$(Expression.scala:368)
	at org.apache.spark.sql.catalyst.expressions.SplitPart.eval(stringExpressions.scala:2859)
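
For reference, a minimal sketch of what testing the replacement directly might look like (assuming the suite mixes in ExpressionEvalHelper and that ElementAt accepts the four arguments shown in the replacement above; str, delimiter and outOfBoundValue are local names introduced here for illustration):

    // Sketch only: exercise the expression that SplitPart is replaced with.
    val str = Literal.create("11.12.13", StringType)
    val delimiter = Literal.create(".", StringType)
    val outOfBoundValue = Some(Literal.create("", StringType))

    // split_part("11.12.13", ".", 3) should return "13".
    checkEvaluation(
      ElementAt(StringSplitSQL(str, delimiter), Literal(3), outOfBoundValue, false), "13")
    // A part number past the last field should fall back to the empty-string default.
    checkEvaluation(
      ElementAt(StringSplitSQL(str, delimiter), Literal(4), outOfBoundValue, false), "")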

Contributor:

Can we simply add more tests in the existing test case test("elementAt")? We just need to test the outOfBoundValue case.

checkEvaluation(ElementAt(StringSplitSQL(Literal.create("11.12.13", StringType),
  Literal.create(null, StringType)), Literal(1), outOfBoundValue), null)

intercept[Exception] {
Member:

Can we catch a narrower exception?

Member Author:

Do you mean TestFailedException?

Member:

Yeah, let's catch a narrower exception.

Member Author:

org.scalatest.exceptions.TestFailedException is a ScalaTest exception, not a Spark exception, so catching it seems meaningless. Catching the Spark exception doesn't work either, because checkEvaluation reports the failure as a TestFailedException rather than rethrowing it:

    intercept[SparkRuntimeException] {
      checkEvaluation(ElementAt(str, Literal(0), outOfBoundValue), null)
    }.getMessage.contains("The index 0 is invalid")
Expected exception org.apache.spark.SparkRuntimeException to be thrown, but org.scalatest.exceptions.TestFailedException was thrown
ScalaTestFailureLocation: org.apache.spark.sql.catalyst.expressions.CollectionExpressionsSuite at (CollectionExpressionsSuite.scala:2543)
org.scalatest.exceptions.TestFailedException: Expected exception org.apache.spark.SparkRuntimeException to be thrown, but org.scalatest.exceptions.TestFailedException was thrown
	at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)
	at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:471)
	at org.scalatest.funsuite.AnyFunSuite.newAssertionFailedException(AnyFunSuite.scala:1563)

srowen pushed a commit that referenced this pull request Aug 23, 2022

Closes #37626 from wangyum/SPARK-40152-2.

Authored-by: Yuming Wang <yumwang@ebay.com>
Signed-off-by: Sean Owen <srowen@gmail.com>
(cherry picked from commit 4f525ee)
Signed-off-by: Sean Owen <srowen@gmail.com>
@srowen (Member) commented Aug 23, 2022

Merged to master/3.3

@srowen closed this in 4f525ee Aug 23, 2022
@wangyum deleted the SPARK-40152-2 branch August 23, 2022 14:29
Literal.create(null, StringType)), Literal(1), outOfBoundValue), null)

intercept[Exception] {
  checkEvaluation(ElementAt(str, Literal(0), outOfBoundValue), null)
Contributor:

We have a dedicated function checkExceptionInExpression for testing the expression error behavior.
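
As an illustration, a rough sketch of how that helper could replace the intercept block (assuming the (expression, expectedErrMsg) overload of checkExceptionInExpression from ExpressionEvalHelper; the exception type and message are taken from the discussion above, and str / outOfBoundValue are the values already defined in the test):

    // Sketch only: assert both the exception type and the message in one call,
    // instead of wrapping checkEvaluation in intercept[Exception].
    checkExceptionInExpression[SparkRuntimeException](
      ElementAt(str, Literal(0), outOfBoundValue),
      "The index 0 is invalid")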

Contributor:

Actually this is already tested in test("elementAt")

Contributor:

See https://github.com/apache/spark/pull/37626/files#r952639395

I think we should just expand the existing test("elementAt").
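
For example, the additions to the existing test("elementAt") might look roughly like this (a sketch pieced together from the snippets quoted in this thread, not the exact code that was merged):

    // Sketch only: cover the StringSplitSQL + default-value path inside test("elementAt").
    val outOfBoundValue = Some(Literal.create("", StringType))

    checkEvaluation(ElementAt(StringSplitSQL(Literal.create("11.12.13", StringType),
      Literal.create(".", StringType)), Literal(3), outOfBoundValue), "13")
    // A null delimiter makes the whole expression null.
    checkEvaluation(ElementAt(StringSplitSQL(Literal.create("11.12.13", StringType),
      Literal.create(null, StringType)), Literal(1), outOfBoundValue), null)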

@srowen (Member) commented Aug 23, 2022

Ah sorry, did it again and merged this already. If it's important, make another PR and I'll merge it.
