
[SPARK-36920][SQL] Support ANSI intervals by ABS() #34169

Closed

Conversation

MaxGekk
Member

@MaxGekk MaxGekk commented Oct 4, 2021

What changes were proposed in this pull request?

In this PR, I propose to handle ANSI interval types in the Abs expression and, as a consequence, in the abs() function:

  • for positive and zero intervals, ABS() returns the input value unchanged;
  • for the minimal supported values (Int.MinValue months for year-month intervals and Long.MinValue microseconds for day-time intervals), ABS() throws an arithmetic overflow exception;
  • for all other supported negative intervals, ABS() negates its input and returns a positive interval (see the sketch after the example below).

For example:

spark-sql> SELECT ABS(INTERVAL -'10-8' YEAR TO MONTH);
10-8
spark-sql> SELECT ABS(INTERVAL '-10 01:02:03.123456' DAY TO SECOND);
10 01:02:03.123456000
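
A minimal Scala sketch of the semantics above, assuming only that year-month intervals are backed by Int months and day-time intervals by Long microseconds (illustrative; not the actual Abs code path in Catalyst):

```scala
// Illustrative only: ABS() on ANSI intervals behaves like an exact negation of the
// backing value, which overflows solely for the minimal representable interval.
def absYearMonth(months: Int): Int =
  if (months >= 0) months else Math.negateExact(months)  // ArithmeticException for Int.MinValue

def absDayTime(micros: Long): Long =
  if (micros >= 0) micros else Math.negateExact(micros)  // ArithmeticException for Long.MinValue
```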

Why are the changes needed?

To improve user experience with Spark SQL.

Does this PR introduce any user-facing change?

No, this PR just extends ABS() to support new types.

How was this patch tested?

By running new tests:

$ build/sbt "test:testOnly *ArithmeticExpressionSuite"
$ build/sbt "sql/testOnly org.apache.spark.sql.SQLQueryTestSuite -- -z interval.sql"
$ build/sbt "sql/test:testOnly org.apache.spark.sql.expressions.ExpressionInfoSuite"

@github-actions github-actions bot added the SQL label Oct 4, 2021
@SparkQA

SparkQA commented Oct 4, 2021

Kubernetes integration test starting
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/48327/

@SparkQA

SparkQA commented Oct 4, 2021

Kubernetes integration test status failure
URL: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder-K8s/48327/

@MaxGekk MaxGekk changed the title [WIP][SPARK-36920][SQL] Support ANSI intervals by ABS() [SPARK-36920][SQL] Support ANSI intervals by ABS() Oct 4, 2021
@MaxGekk MaxGekk marked this pull request as ready for review October 4, 2021 15:08
@MaxGekk
Member Author

MaxGekk commented Oct 4, 2021

@beliefer @AngersZhuuuu @Peng-Lei @cloud-fan Could you review this PR, please?

@SparkQA

SparkQA commented Oct 4, 2021

Test build #143814 has finished for PR 34169 at commit d89ec8d.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@HyukjinKwon
Member

Merged to master.

@@ -160,11 +162,15 @@ case class Abs(child: Expression, failOnError: Boolean = SQLConf.get.ansiEnabled

def this(child: Expression) = this(child, SQLConf.get.ansiEnabled)

- override def inputTypes: Seq[AbstractDataType] = Seq(NumericType)
+ override def inputTypes: Seq[AbstractDataType] = Seq(TypeCollection.NumericAndInterval)
Contributor

@cloud-fan cloud-fan Oct 5, 2021


We shouldn't use TypeCollection.NumericAndInterval, as it includes the legacy interval type. We would hit a runtime exception if we used a legacy interval as input, whereas an analysis exception is preferred.

Member Author


@cloud-fan Thanks for the comment. Here is the fix: #34183

MaxGekk added a commit that referenced this pull request Oct 5, 2021
…ANSI intervals

### What changes were proposed in this pull request?
Change allowed input types of `Abs()` from:
```
NumericType + CalendarIntervalType + YearMonthIntervalType + DayTimeIntervalType
```
to
```
NumericType + YearMonthIntervalType + DayTimeIntervalType
```

### Why are the changes needed?
The changes make the error message clearer.

Before changes:
```sql
spark-sql> set spark.sql.legacy.interval.enabled=true;
spark.sql.legacy.interval.enabled	true
spark-sql> select abs(interval -10 days -20 minutes);
21/10/05 09:11:30 ERROR SparkSQLDriver: Failed in [select abs(interval -10 days -20 minutes)]
java.lang.ClassCastException: org.apache.spark.sql.types.CalendarIntervalType$ cannot be cast to org.apache.spark.sql.types.NumericType
	at org.apache.spark.sql.catalyst.util.TypeUtils$.getNumeric(TypeUtils.scala:77)
	at org.apache.spark.sql.catalyst.expressions.Abs.numeric$lzycompute(arithmetic.scala:172)
	at org.apache.spark.sql.catalyst.expressions.Abs.numeric(arithmetic.scala:169)
```

After:
```sql
spark.sql.legacy.interval.enabled	true
spark-sql> select abs(interval -10 days -20 minutes);
Error in query: cannot resolve 'abs(INTERVAL '-10 days -20 minutes')' due to data type mismatch: argument 1 requires (numeric or interval day to second or interval year to month) type, however, 'INTERVAL '-10 days -20 minutes'' is of interval type.; line 1 pos 7;
'Project [unresolvedalias(abs(-10 days -20 minutes, false), None)]
+- OneRowRelation
```

### Does this PR introduce _any_ user-facing change?
No, because the original changes of #34169 haven't been released yet.

### How was this patch tested?
Manually checked in the command line, see examples above.

Closes #34183 from MaxGekk/fix-abs-input-types.

Authored-by: Max Gekk <max.gekk@gmail.com>
Signed-off-by: Max Gekk <max.gekk@gmail.com>
a0x8o added a commit to a0x8o/spark that referenced this pull request Oct 5, 2021