[SPARK-35982][SQL] Allow from_json/to_json for map types where value types are year-month intervals #33181
Conversation
- Kubernetes integration test unable to build dist. exiting with code: 1
- Test build #140555 has finished for PR 33181 at commit
- retest this please.
- Kubernetes integration test starting
- Test build #140568 has finished for PR 33181 at commit
- retest this please.
- Test build #140574 has finished for PR 33181 at commit
- retest this please.
- cc @MaxGekk FYI
- Kubernetes integration test status success
- Kubernetes integration test starting
- Kubernetes integration test status success
- Kubernetes integration test unable to build dist. exiting with code: 1
- Test build #140578 has finished for PR 33181 at commit
MaxGekk left a comment:
+1, LGTM. Merging to master/3.2.
Thank you, @sarutak.
[SPARK-35982][SQL] Allow from_json/to_json for map types where value types are year-month intervals
### What changes were proposed in this pull request?
This PR fixes two issues. One is that `to_json` doesn't support `map` types where value types are `year-month` interval types like:
```
spark-sql> select to_json(map('a', interval '1-2' year to month));
21/07/02 11:38:15 ERROR SparkSQLDriver: Failed in [select to_json(map('a', interval '1-2' year to month))]
java.lang.RuntimeException: Failed to convert value 14 (class of class java.lang.Integer) with the type of YearMonthIntervalType(0,1) to JSON.
```
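For reference, here is a minimal Scala sketch (not taken from the PR) of the same call through the DataFrame API; the session name is an assumption, and before this fix the `to_json` call is the step that fails:
```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Hypothetical session name; any existing SparkSession works.
val spark = SparkSession.builder().appName("ym-interval-json-sketch").getOrCreate()

// Build a map whose value type is a year-month interval.
val df = spark.sql("SELECT map('a', interval '1-2' year to month) AS m")

// Before this change, serializing the map fails at runtime as shown above.
// After the change, it should produce a JSON string whose value is the interval
// rendered as text (the exact formatting depends on the Spark version).
df.select(to_json(col("m")).as("json")).show(false)
```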
The other issue is that even if the `to_json` issue is resolved, `from_json` doesn't support converting year-month interval strings back from JSON, so the result of the following query will be `null`.
```
spark-sql> select from_json(to_json(map('a', interval '1-2' year to month)), 'a interval year to month');
{"a":null}
```
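A hedged sketch of the same round trip through the DataFrame API, reusing the `spark` session assumed in the previous sketch; before this change the parsed value comes back as `null`, afterwards the interval should be recovered:
```scala
import org.apache.spark.sql.functions._

// Serialize the map to JSON, then parse it back with an explicit DDL schema.
// The empty options map selects the three-argument from_json overload.
val roundTrip = spark
  .sql("SELECT map('a', interval '1-2' year to month) AS m")
  .select(from_json(to_json(col("m")), "a interval year to month",
    Map.empty[String, String]).as("parsed"))

// Before the fix: {null}; after the fix the year-month interval is recovered.
roundTrip.show(false)
```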
### Why are the changes needed?
There should be no reason why year-month intervals cannot be used as map value types; `CalendarIntervalType` can already be used that way.
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
New tests.
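Not the actual test code from this PR, but a sketch of the kind of round-trip check such tests might make, again assuming the `spark` session from the sketches above:
```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.functions._

// Round-trip the interval through to_json/from_json and compare with the original.
val checked = spark
  .sql("SELECT map('a', interval '1-2' year to month) AS m")
  .select(from_json(to_json(col("m")), "a interval year to month",
    Map.empty[String, String]).as("parsed"))
  .selectExpr("parsed.a = interval '1-2' year to month AS round_trips")

// With the fix in place the comparison should hold.
assert(checked.collect().sameElements(Array(Row(true))))
```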
Closes #33181 from sarutak/map-json-yminterval.
Authored-by: Kousuke Saruta <sarutak@oss.nttdata.com>
Signed-off-by: Max Gekk <max.gekk@gmail.com>
(cherry picked from commit 6474226)
Signed-off-by: Max Gekk <max.gekk@gmail.com>