[SPARK-20198] [SQL] Remove the inconsistency in table/function name conventions in SparkSession.Catalog APIs #17518
Conversation
Test build #75473 has finished for PR 17518 at commit
```scala
// Find an unqualified table using the current database
assert(!spark.catalog.tableExists("tbl_y"))
spark.catalog.setCurrentDatabase(db)
assert(spark.catalog.tableExists("tbl_y"))

// Unable to find the table, although the temp view with the given name exists
assert(!spark.catalog.tableExists(db, "tbl_x"))
```
This new test case is not related to this PR. Just to improve the existing test case coverage.
```scala
// Find an unqualified function using the current database
assert(!spark.catalog.functionExists("fn2"))
spark.catalog.setCurrentDatabase(db)
assert(spark.catalog.functionExists("fn2"))

// Unable to find the function, although the temp function with the given name exists
assert(!spark.catalog.functionExists(db, "fn1"))
```
This new test case is not related to this PR. Just to improve the existing test case coverage.
Test build #75474 has finished for PR 17518 at commit
Is this an API change or just a documentation change? The title suggests you are changing public-facing APIs?
@rxin This has an API change. After this PR, the following APIs will accept the qualified table/function names.
For example, before this PR, the single-argument forms did not recognize a qualified name like `databaseName.tableName`.
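A hypothetical sketch of the difference, with made-up names (`my_db`, `my_tbl`) and assuming a Spark build that includes this change:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").appName("catalog-names").getOrCreate()
spark.sql("CREATE DATABASE IF NOT EXISTS my_db")
spark.sql("CREATE TABLE IF NOT EXISTS my_db.my_tbl (id INT) USING parquet")

// The two-argument form has always resolved an explicit database/table pair.
assert(spark.catalog.tableExists("my_db", "my_tbl"))

// Before this PR, "my_db.my_tbl" was treated as a single unqualified name in the
// current database and was not found; after this PR the qualified form is parsed.
assert(spark.catalog.tableExists("my_db.my_tbl"))
```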
```diff
@@ -383,36 +439,48 @@ abstract class Catalog {
    * preserved database `global_temp`, and we must use the qualified name to refer a global temp
    * view, e.g. `SELECT * FROM global_temp.view1`.
    *
-   * @param viewName the name of the view to be dropped.
+   * @param viewName the unqualified name of the temporary view to be dropped.
    * @return true if the view is dropped successfully, false otherwise.
    * @since 2.1.0
    */
   def dropGlobalTempView(viewName: String): Boolean
```
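For context, a small sketch of the global temp view lifecycle this doc comment describes; the view name and SparkSession setup are illustrative:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").appName("global-temp-view").getOrCreate()

// Global temp views live in the reserved database `global_temp` and must be
// queried with the qualified name.
spark.range(5).createGlobalTempView("view1")
spark.sql("SELECT * FROM global_temp.view1").show()

// dropGlobalTempView takes the unqualified view name and returns true on success.
assert(spark.catalog.dropGlobalTempView("view1"))
```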
hmm, just realized we don't have `dropTable` in `Catalog` ...
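A minimal sketch of the current workaround, assuming the drop has to go through SQL since `Catalog` exposes no `dropTable`; the table name is made up:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").appName("drop-table").getOrCreate()
spark.sql("CREATE TABLE IF NOT EXISTS tmp_tbl (id INT) USING parquet")

// Catalog has dropTempView / dropGlobalTempView, but a persistent table is
// dropped via SQL for now.
spark.sql("DROP TABLE IF EXISTS tmp_tbl")
assert(!spark.catalog.tableExists("tmp_tbl"))
```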
@rxin I think these API changes make sense; it's more intuitive to return columns for a qualified name.
LGTM. Merging to master!
What changes were proposed in this pull request?

Observed by @felixcheung: in the `SparkSession.Catalog` APIs, we have different conventions/rules for table/function identifiers/names. Most APIs accept a qualified name (i.e., `databaseName.tableName` or `databaseName.functionName`); however, the following five APIs do not accept it. To make them consistent with the other Catalog APIs, this PR makes those changes, updates the function/API comments, and adds `@param` tags to clarify the inputs we allow.
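A hedged sketch of the resulting convention, using made-up database/table names and `tableExists`/`listColumns` as assumed representatives of the updated APIs:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().master("local[*]").appName("catalog-convention").getOrCreate()
spark.sql("CREATE DATABASE IF NOT EXISTS my_db")
spark.sql("CREATE TABLE IF NOT EXISTS my_db.my_tbl (id INT) USING parquet")

// An unqualified name is resolved against the current database ...
spark.catalog.setCurrentDatabase("my_db")
assert(spark.catalog.tableExists("my_tbl"))

// ... and the qualified `databaseName.tableName` form is accepted as well.
assert(spark.catalog.tableExists("my_db.my_tbl"))

// listColumns is assumed here to be among the updated APIs.
spark.catalog.listColumns("my_db.my_tbl").show()
```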
How was this patch tested?

Added the test cases.