Review comment fix
iRakson committed Feb 12, 2020
1 parent 1a066e1 commit e873527
Showing 2 changed files with 5 additions and 6 deletions.
2 changes: 1 addition & 1 deletion docs/sql-migration-guide.md
@@ -216,7 +216,7 @@ license: |

- Since Spark 3.0, the `size` function returns `NULL` for the `NULL` input. In Spark version 2.4 and earlier, this function gives `-1` for the same input. To restore the behavior before Spark 3.0, you can set `spark.sql.legacy.sizeOfNull` to `true`.

- - Since Spark 3.0, when the `array`/`map` function is called without any parameters, it returns an empty collection with `NullType` as element type. In Spark version 2.4 and earlier, it returns an empty collection with `StringType` as element type. To restore the behavior before Spark 3.0, you can set `spark.sql.legacy.createEmptyCollectionUsingStringType.enabled` to `true`.
+ - Since Spark 3.0, when the `array`/`map` function is called without any parameters, it returns an empty collection with `NullType` as element type. In Spark version 2.4 and earlier, it returns an empty collection with `StringType` as element type. To restore the behavior before Spark 3.0, you can set `spark.sql.legacy.createEmptyCollectionUsingStringType` to `true`.

- Since Spark 3.0, the interval literal syntax does not allow multiple from-to units anymore. For example, `SELECT INTERVAL '1-1' YEAR TO MONTH '2-2' YEAR TO MONTH'` throws parser exception.

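For readers who want to see the documented behavior change in practice, a minimal sketch (not part of this commit) is shown below. It assumes a running Spark 3.0 `SparkSession` named `spark`, for example in `spark-shell`, and exercises both the renamed flag and the related `spark.sql.legacy.sizeOfNull` flag from the same guide section:

```scala
// Sketch only: observing the Spark 3.0 defaults and the legacy flags described
// in the migration-guide hunk above. Assumes `spark` is an active SparkSession.

// Default in 3.0: array()/map() called without arguments use NullType elements.
spark.sql("SELECT array()").printSchema()   // element type is null (NullType)

// Restore the 2.4 behavior via the conf name this commit settles on (no ".enabled"):
spark.conf.set("spark.sql.legacy.createEmptyCollectionUsingStringType", "true")
spark.sql("SELECT array()").printSchema()   // element type is string (StringType)

// Related legacy flag from the same section: size() on NULL input.
spark.sql("SELECT size(NULL)").show()       // NULL in Spark 3.0
spark.conf.set("spark.sql.legacy.sizeOfNull", "true")
spark.sql("SELECT size(NULL)").show()       // -1, the Spark 2.4 behavior
```

Note that the flag is marked `.internal()` in the hunk below, so it is an escape hatch for migration rather than a documented user-facing option.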
@@ -2008,12 +2008,11 @@ object SQLConf {
.createWithDefault(false)

val LEGACY_CREATE_EMPTY_COLLECTION_USING_STRING_TYPE =
buildConf("spark.sql.legacy.createEmptyCollectionUsingStringType.enabled")
buildConf("spark.sql.legacy.createEmptyCollectionUsingStringType")
.internal()
.doc("When set to true, it returns an empty array of string type and an empty map with " +
"string type as key/value type when `array` and `map` functions are called without any " +
"parameters, respectively. Otherwise, it returns an empty array of `NullType` and " +
"empty map with `NullType` as key/value type, respectively. ")
.doc("When set to true, Spark returns an empty collection with `StringType` as element " +
"type if the `array`/`map` function is called without any parameters. Otherwise, Spark " +
"returns an empty collection with `NullType` as element type.")
.booleanConf
.createWithDefault(false)

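As a side note on how an entry defined with this builder is consumed elsewhere in Spark, a hedged sketch follows (illustrative only, not code from this commit; the val name is an assumption):

```scala
import org.apache.spark.sql.internal.SQLConf

// Sketch only: a boolean entry built with .booleanConf.createWithDefault(false)
// is read back through the active session's SQLConf; SQLConf.get resolves the
// conf bound to the current session/thread inside Spark's own code paths.
val useStringTypeForEmptyCollections: Boolean =
  SQLConf.get.getConf(SQLConf.LEGACY_CREATE_EMPTY_COLLECTION_USING_STRING_TYPE)

// An expression implementing array()/map() with no children can then pick
// StringType (legacy) or NullType (the new default) as the element type.
```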
