[SPARK-30234][SQL][FOLLOWUP] Add .enabled in the suffix of the ADD FILE legacy option

### What changes were proposed in this pull request?

This PR renames `spark.sql.legacy.addDirectory.recursive` to `spark.sql.legacy.addDirectory.recursive.enabled`.

### Why are the changes needed?

For consistency with other boolean option names such as `spark.sql.legacy.mssqlserver.numericMapping.enabled`, which end with `.enabled`.

### Does this PR introduce any user-facing change?

No.

### How was this patch tested?

N/A

Closes #27372 from maropu/SPARK-30234-FOLLOWUP.

Authored-by: Takeshi Yamamuro <yamamuro@apache.org>
Signed-off-by: HyukjinKwon <gurwls223@apache.org>
maropu authored and HyukjinKwon committed Jan 29, 2020
1 parent 298d0a5 commit ec1fb6b
Showing 3 changed files with 10 additions and 8 deletions.
docs/sql-migration-guide.md (2 changes: 1 addition & 1 deletion)
@@ -324,7 +324,7 @@ license: |
</tr>
</table>

-- Since Spark 3.0, `ADD FILE` can be used to add file directories as well. Earlier only single files can be added using this command. To restore the behaviour of earlier versions, set `spark.sql.legacy.addDirectory.recursive` to false.
+- Since Spark 3.0, `ADD FILE` can be used to add file directories as well. Earlier only single files can be added using this command. To restore the behaviour of earlier versions, set `spark.sql.legacy.addDirectory.recursive.enabled` to false.

- Since Spark 3.0, `SHOW TBLPROPERTIES` will cause `AnalysisException` if the table does not exist. In Spark version 2.4 and earlier, this scenario caused `NoSuchTableException`. Also, `SHOW TBLPROPERTIES` on a temporary view will cause `AnalysisException`. In Spark version 2.4 and earlier, it returned an empty result.

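To make the migration-guide note above concrete, here is a minimal Scala sketch; the directory path is a placeholder and `spark` is assumed to be an existing `SparkSession`:

```scala
// In Spark 3.0 the default behaviour lets ADD FILE take a directory.
spark.sql("ADD FILE /tmp/my_input_dir")

// Restore the pre-3.0 behaviour with the renamed legacy flag; after this,
// passing a directory to ADD FILE fails and only single files are accepted.
spark.sql("SET spark.sql.legacy.addDirectory.recursive.enabled=false")
spark.sql("ADD FILE /tmp/my_input_dir/part-00000.txt")
```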
@@ -2116,11 +2116,13 @@ object SQLConf {
.booleanConf
.createWithDefault(false)

-  val LEGACY_ADD_DIRECTORY_USING_RECURSIVE = buildConf("spark.sql.legacy.addDirectory.recursive")
-    .doc("When true, users can add directory by passing path of a directory to ADD FILE " +
-      "command of SQL. If false, then only a single file can be added.")
-    .booleanConf
-    .createWithDefault(true)
+  val LEGACY_ADD_DIRECTORY_USING_RECURSIVE =
+    buildConf("spark.sql.legacy.addDirectory.recursive.enabled")
+      .internal()
+      .doc("When true, users can add directory by passing path of a directory to ADD FILE " +
+        "command of SQL. If false, then only a single file can be added.")
+      .booleanConf
+      .createWithDefault(true)

val LEGACY_MSSQLSERVER_NUMERIC_MAPPING_ENABLED =
buildConf("spark.sql.legacy.mssqlserver.numericMapping.enabled")
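As a rough illustration of how a command implementation might consult the renamed flag, the sketch below guards a directory add on the config value. `addPath`, its error message, and the default fallback are assumptions for illustration, not Spark's actual `ADD FILE` code path:

```scala
import java.io.File

import org.apache.spark.SparkException
import org.apache.spark.sql.SparkSession

// Hypothetical helper mimicking the guard that the legacy flag controls.
def addPath(spark: SparkSession, path: String): Unit = {
  // The conf defaults to true; "false" restores the single-file-only behaviour.
  val recursiveAllowed =
    spark.conf.get("spark.sql.legacy.addDirectory.recursive.enabled", "true").toBoolean
  if (!recursiveAllowed && new File(path).isDirectory) {
    throw new SparkException(s"$path is a directory and the legacy flag is set to false")
  }
  // SparkContext.addFile accepts a recursive flag for directories.
  spark.sparkContext.addFile(path, recursiveAllowed)
}
```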
@@ -2966,14 +2966,14 @@ abstract class DDLSuite extends QueryTest with SQLTestUtils {
}
}

test("Add a directory when spark.sql.legacy.addDirectory.recursive set to true") {
test("Add a directory when spark.sql.legacy.addDirectory.recursive.enabled set to true") {
val directoryToAdd = Utils.createTempDir("/tmp/spark/addDirectory/")
val testFile = File.createTempFile("testFile", "1", directoryToAdd)
spark.sql(s"ADD FILE $directoryToAdd")
assert(new File(SparkFiles.get(s"${directoryToAdd.getName}/${testFile.getName}")).exists())
}

test("Add a directory when spark.sql.legacy.addDirectory.recursive not set to true") {
test("Add a directory when spark.sql.legacy.addDirectory.recursive.enabled not set to true") {
withTempDir { testDir =>
withSQLConf(SQLConf.LEGACY_ADD_DIRECTORY_USING_RECURSIVE.key -> "false") {
val msg = intercept[SparkException] {
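The last hunk is truncated, so the full body of the negative test is not visible. Below is a hedged sketch of how such a test can finish; it assumes the `DDLSuite` context above (`withTempDir`, `withSQLConf`, `intercept`, and `spark` come from the suite and its test utilities), and the final assertion is an assumption rather than the committed code:

```scala
// Sketch only: the committed test body is cut off in the diff above.
test("Add a directory when spark.sql.legacy.addDirectory.recursive.enabled not set to true") {
  withTempDir { testDir =>
    withSQLConf(SQLConf.LEGACY_ADD_DIRECTORY_USING_RECURSIVE.key -> "false") {
      val msg = intercept[SparkException] {
        spark.sql(s"ADD FILE $testDir")
      }.getMessage
      // The real test most likely checks a specific error message; asserting
      // non-emptiness keeps this sketch from guessing the exact wording.
      assert(msg.nonEmpty)
    }
  }
}
```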
