[SPARK-35916][SQL] Support subtraction among Date/Timestamp/TimestampWithoutTZ #33115
Conversation
sql/catalyst/src/main/scala/org/apache/spark/sql/types/AbstractDataType.scala
```diff
@@ -386,8 +386,8 @@ class Analyzer(override val catalogManager: CatalogManager)
         DatetimeSub(l, r, DateAddInterval(l, UnaryMinus(r, f), ansiEnabled = f))
       case (_, CalendarIntervalType | _: DayTimeIntervalType) =>
         Cast(DatetimeSub(l, r, TimeAdd(l, UnaryMinus(r, f))), l.dataType)
-      case (TimestampType, _) => SubtractTimestamps(l, r)
-      case (_, TimestampType) => SubtractTimestamps(l, r)
+      case (TimestampType | TimestampWithoutTZType, _) => SubtractTimestamps(l, r)
```
Can this be `case (AnyTimestampType(), _)`?
No. But we can simplify it with:

```scala
case _ if AnyTimestampType.unapply(l) || AnyTimestampType.unapply(r) =>
```
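The simplification above can be illustrated with a minimal, self-contained sketch. Note this uses toy type definitions, not Spark's real classes: in Spark, `AnyTimestampType.unapply` takes an `Expression` and inspects its `dataType`, while here the extractor takes the type directly for brevity. A Boolean-returning `unapply` can be called directly in a guard, letting one case cover both the `(timestamp, _)` and `(_, timestamp)` patterns:

```scala
// Toy stand-ins for Spark's data types (assumed names, not Spark internals).
sealed trait DataType
case object DateType extends DataType
case object TimestampType extends DataType
case object TimestampWithoutTZType extends DataType

object AnyTimestampType {
  // Boolean-returning unapply: usable both as `case AnyTimestampType()`
  // and as a plain predicate inside a guard, as suggested above.
  def unapply(dt: DataType): Boolean =
    dt == TimestampType || dt == TimestampWithoutTZType
}

def resolve(l: DataType, r: DataType): String = (l, r) match {
  // One guarded case replaces the two symmetric cases from the diff.
  case _ if AnyTimestampType.unapply(l) || AnyTimestampType.unapply(r) =>
    "SubtractTimestamps"
  case _ => "unresolved"
}
```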
Merging to master
### What changes were proposed in this pull request?
Replace the type collection `AllTimestampTypes` with the new data type `AnyTimestampType`.

### Why are the changes needed?
As discussed in #33115 (comment), it is more convenient to have a new data type `AnyTimestampType` instead of using the type collection `AllTimestampTypes`:
1. It simplifies the pattern match.
2. In the default type coercion rules, when implicitly casting a type to a TypeCollection type, Spark chooses the first convertible data type as the result. If we are going to make the default timestamp type configurable, having `AnyTimestampType` is better.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
Existing UT

Closes #33129 from gengliangwang/allTimestampTypes.

Authored-by: Gengliang Wang <gengliang@apache.org>
Signed-off-by: Gengliang Wang <gengliang@apache.org>
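Point 2 of the commit message can be sketched as follows. This is a hedged toy model, not Spark's actual `TypeCoercion` code: implicit casting to a type collection picks the first convertible member, so the collection's hardcoded order decides the result type, whereas a single abstract type can defer to a configurable default.

```scala
// Toy stand-ins (assumed names) for a concrete type and a type collection.
case class Concrete(name: String)
case class TypeCollection(types: Seq[Concrete])

// Simplified implicit-cast lookup: return the FIRST member the source
// type can be converted to, mirroring the behavior described above.
def implicitCast(target: TypeCollection, canCastTo: Concrete => Boolean): Option[Concrete] =
  target.types.find(canCastTo)

val allTimestampTypes =
  TypeCollection(Seq(Concrete("TimestampType"), Concrete("TimestampWithoutTZType")))

// If the source can cast to both members, the first one always wins here,
// so the preference is frozen into the collection's order.
val chosen = implicitCast(allTimestampTypes, _ => true)
```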
What changes were proposed in this pull request?
Support the following operations: for subtraction between `TimestampWithoutTZ` and `Timestamp`, the `Timestamp` column is cast as `TimestampWithoutTZType`.

Why are the changes needed?
Support basic subtraction among Date/Timestamp/TimestampWithoutTZ.

Does this PR introduce any user-facing change?
No, the timestamp without time zone type is not released yet.

How was this patch tested?
Unit tests
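The casting rule stated in the description can be sketched as below. This is a minimal illustration with assumed toy types, not Spark's analyzer code: when one operand is `TimestampWithoutTZ` and the other is `Timestamp` (in either order), the `Timestamp` side is cast so both operands share `TimestampWithoutTZType` before the subtraction is resolved.

```scala
// Toy type tags (assumed names, not Spark internals).
sealed trait DT
case object DateType extends DT
case object TimestampType extends DT
case object TimestampWithoutTZType extends DT

// Widen the operand types for subtraction per the rule in the description:
// the Timestamp operand is cast to TimestampWithoutTZType.
def widenForSubtraction(l: DT, r: DT): (DT, DT) = (l, r) match {
  case (TimestampWithoutTZType, TimestampType) => (l, TimestampWithoutTZType)
  case (TimestampType, TimestampWithoutTZType) => (TimestampWithoutTZType, r)
  case _ => (l, r) // other combinations are left unchanged in this sketch
}
```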