
[SPARK-40615][SQL][TESTS][FOLLOW-UP] Make the test pass with ANSI enabled #38325

Closed
wants to merge 1 commit into apache:master from HyukjinKwon:SPARK-40615-followup

Conversation

HyukjinKwon
Member

What changes were proposed in this pull request?

This PR proposes to make the tests added in #38050 pass with ANSI mode enabled by avoiding binary arithmetic operations on string columns.
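
For illustration, a hedged sketch of the kind of change involved: the query shape is reconstructed from the analyzed plan in the failure log below, and the replacement expression (`concat`) is an assumption about how string arithmetic can be avoided, not the exact diff.

```scala
// `v1` is a temp view exposing a map<string,int> column `x` (see the plan below).
// Before: the inner expression `a + a` trips ANSI's binary-operator type check on
// STRING before reaching the decorrelation check the test actually asserts on.
val failingQuery =
  "SELECT (SELECT a + a FROM (SELECT upper(x['a']) AS a)) FROM v1"

// After (assumed shape of the fix): use an expression that is valid for STRING
// under ANSI, so the only remaining error is the expected map-type one.
val ansiSafeQuery =
  "SELECT (SELECT concat(a, a) FROM (SELECT upper(x['a']) AS a)) FROM v1"
```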

Why are the changes needed?

To make the tests pass with ANSI mode enabled. Currently, they fail as below (https://github.com/apache/spark/actions/runs/3286184541/jobs/5414029918):

```
[info] - SPARK-40615: Check unsupported data type when decorrelating subqueries *** FAILED *** (118 milliseconds)
[info]   "[DATATYPE_MISMATCH.BINARY_OP_WRONG_TYPE] Cannot resolve "(a + a)" due to data type mismatch: the binary operator requires the input type ("NUMERIC" or "INTERVAL DAY TO SECOND" or "INTERVAL YEAR TO MONTH" or "INTERVAL"), not "STRING".; line 1 pos 15;
[info]   'Project [unresolvedalias(scalar-subquery#426412 [], None)]
[info]   :  +- 'Project [unresolvedalias((a#426411 + a#426411), None)]
[info]   :     +- SubqueryAlias __auto_generated_subquery_name
[info]   :        +- Project [upper(cast(outer(x#426413)[a] as string)) AS a#426411]
[info]   :           +- OneRowRelation
[info]   +- SubqueryAlias v1
[info]      +- View (`v1`, [x#426413])
[info]         +- Project [cast(x#426414 as map<string,int>) AS x#426413]
[info]            +- SubqueryAlias t
[info]               +- LocalRelation [x#426414]
[info]   " did not contain "Correlated column reference 'v1.x' cannot be map type" (SubquerySuite.scala:2480)
[info]   org.scalatest.exceptions.TestFailedException:
[info]   at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)
[info]   at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:471)
[info]   at org.scalatest.Assertions$.newAssertionFailedException(Assertions.scala:1231)
[info]   at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:1295)
[info]   at org.apache.spark.sql.SubquerySuite.$anonfun$new$320(SubquerySuite.scala:2480)
[info]   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[info]   at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1491)
[info]   at org.apache.spark.sql.test.SQLTestUtilsBase.withTempView(SQLTestUtils.scala:276)
[info]   at org.apache.spark.sql.test.SQLTestUtilsBase.withTempView$(SQLTestUtils.scala:274)
[info]   at org.apache.spark.sql.SubquerySuite.withTempView(SubquerySuite.scala:32)
[info]   at org.apache.spark.sql.SubquerySuite.$anonfun$new$319(SubquerySuite.scala:2459)
[info]   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[info]   at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
[info]   at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
[info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
```
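
The failure itself is not specific to subqueries: with `spark.sql.ansi.enabled` set to `true`, the analyzer rejects `+` on STRING operands instead of implicitly casting them to a numeric type, so the `DATATYPE_MISMATCH.BINARY_OP_WRONG_TYPE` error surfaces before the expected map-type error. A minimal standalone sketch of that behavior (hypothetical example, not taken from the PR):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("ansi-string-arithmetic-demo")
  .master("local[1]")
  .getOrCreate()

// With ANSI mode on, arithmetic on STRING operands fails at analysis time rather
// than being implicitly cast, producing the same error class as in the log above.
spark.conf.set("spark.sql.ansi.enabled", "true")
spark.sql("SELECT upper('1') + upper('1')")
// org.apache.spark.sql.AnalysisException:
// [DATATYPE_MISMATCH.BINARY_OP_WRONG_TYPE] Cannot resolve "(upper(1) + upper(1))" ...
```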

Does this PR introduce any user-facing change?

No, test-only.

How was this patch tested?

Manually ran the tests and verified that they pass.

@HyukjinKwon
Member Author

cc @cloud-fan and @allisonwang-db

@github-actions bot added the SQL label Oct 21, 2022
@HyukjinKwon
Member Author

Merged to master.

@allisonwang-db
Contributor

Thanks for fixing this!

SandishKumarHN pushed a commit to SandishKumarHN/spark that referenced this pull request Dec 12, 2022
Closes apache#38325 from HyukjinKwon/SPARK-40615-followup.

Authored-by: Hyukjin Kwon <gurwls223@apache.org>
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
@HyukjinKwon deleted the SPARK-40615-followup branch January 15, 2024 00:53