[SPARK-31999][SQL] Add REFRESH FUNCTION command #28840

Closed
wants to merge 46 commits

Commits
69a47a1
init
ulysses-you Jun 16, 2020
a95dcb6
update doc
ulysses-you Jun 17, 2020
3fc807e
fix typo
ulysses-you Jun 17, 2020
b282348
update doc
ulysses-you Jun 17, 2020
f677a4a
update doc again
ulysses-you Jun 17, 2020
a6c5d8b
use v2 command
ulysses-you Jun 17, 2020
de54470
fix
ulysses-you Jun 17, 2020
9e09875
fix mistake
ulysses-you Jun 17, 2020
63695c0
use v2 commnd analyze
ulysses-you Jun 17, 2020
9e9d5ce
add line
ulysses-you Jun 17, 2020
c434821
update doc
ulysses-you Jun 18, 2020
35fd44b
update doc
ulysses-you Jun 18, 2020
f83fd8b
fix child
ulysses-you Jun 18, 2020
e444943
fix children
ulysses-you Jun 18, 2020
afd510b
add comment
ulysses-you Jun 18, 2020
1241bde
fix copy error
ulysses-you Jun 18, 2020
93f5d71
update doc
ulysses-you Jun 18, 2020
dc684b5
update comment
ulysses-you Jun 18, 2020
0ea7dd6
fix LookupCatalog
ulysses-you Jun 18, 2020
643969c
merge to ResolveFunctions
ulysses-you Jun 18, 2020
6cb2edd
remove ignoreIfNotExists
ulysses-you Jun 19, 2020
cffc207
fix ut
ulysses-you Jun 22, 2020
4b6408d
fix resolve
ulysses-you Jun 22, 2020
5d5fe71
brush functions
ulysses-you Jun 22, 2020
4ba345b
fix
ulysses-you Jun 22, 2020
6765395
use catalogfunction
ulysses-you Jun 22, 2020
dc86b82
fix
ulysses-you Jun 23, 2020
a38d656
fix comment
ulysses-you Jun 23, 2020
cdea55b
ut nit
ulysses-you Jun 24, 2020
5e227d7
fix nit
ulysses-you Jun 24, 2020
703ad47
nit
ulysses-you Jun 24, 2020
a79f72b
update ResolvedFunc
ulysses-you Jun 24, 2020
a4d144a
Merge branch 'master' of https://github.com/apache/spark into SPARK-3…
ulysses-you Jul 6, 2020
3bd8d23
update doc
ulysses-you Jul 6, 2020
60ac2a0
fix doc
ulysses-you Jul 6, 2020
b36b760
update comment
ulysses-you Jul 6, 2020
c5937a2
rewrite RefreshFunctionCommand
ulysses-you Jul 6, 2020
56ec5ea
update doc
ulysses-you Jul 13, 2020
c129a54
fix functions
ulysses-you Jul 14, 2020
a956144
fix
ulysses-you Jul 14, 2020
711656d
remove unnecessary param
ulysses-you Jul 14, 2020
5d4c152
simplify
ulysses-you Jul 16, 2020
94fa132
fix
ulysses-you Jul 17, 2020
fc4789f
simplify
ulysses-you Jul 17, 2020
e83194f
address comment
ulysses-you Jul 21, 2020
b18437c
fix
ulysses-you Jul 21, 2020
2 changes: 2 additions & 0 deletions docs/_data/menu-sql.yaml
@@ -208,6 +208,8 @@
url: sql-ref-syntax-aux-cache-clear-cache.html
- text: REFRESH TABLE
url: sql-ref-syntax-aux-cache-refresh-table.html
- text: REFRESH FUNCTION
url: sql-ref-syntax-aux-cache-refresh-function.html
- text: REFRESH
url: sql-ref-syntax-aux-cache-refresh.html
- text: DESCRIBE
1 change: 1 addition & 0 deletions docs/sql-ref-syntax-aux-cache-cache-table.md
@@ -80,3 +80,4 @@ CACHE TABLE testCache OPTIONS ('storageLevel' 'DISK_ONLY') SELECT * FROM testDat
* [UNCACHE TABLE](sql-ref-syntax-aux-cache-uncache-table.html)
* [REFRESH TABLE](sql-ref-syntax-aux-cache-refresh-table.html)
* [REFRESH](sql-ref-syntax-aux-cache-refresh.html)
* [REFRESH FUNCTION](sql-ref-syntax-aux-cache-refresh-function.html)
1 change: 1 addition & 0 deletions docs/sql-ref-syntax-aux-cache-clear-cache.md
@@ -41,3 +41,4 @@ CLEAR CACHE;
* [UNCACHE TABLE](sql-ref-syntax-aux-cache-uncache-table.html)
* [REFRESH TABLE](sql-ref-syntax-aux-cache-refresh-table.html)
* [REFRESH](sql-ref-syntax-aux-cache-refresh.html)
* [REFRESH FUNCTION](sql-ref-syntax-aux-cache-refresh-function.html)
60 changes: 60 additions & 0 deletions docs/sql-ref-syntax-aux-cache-refresh-function.md
@@ -0,0 +1,60 @@
---
layout: global
title: REFRESH FUNCTION
displayTitle: REFRESH FUNCTION
license: |
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
---

### Description

The `REFRESH FUNCTION` statement invalidates the cached function entry, which includes the class name
and resource location of the given function. The invalidated cache is repopulated right away.
Note that `REFRESH FUNCTION` only works for permanent functions. Refreshing a native (built-in) function or a temporary function throws an exception.

### Syntax

```sql
REFRESH FUNCTION function_identifier
```

### Parameters

* **function_identifier**

Specifies a function name, which is either a qualified or unqualified name. If no database identifier is provided, the function is resolved from the current database.

**Syntax:** `[ database_name. ] function_name`

### Examples

```sql
-- The cached entry of the function will be refreshed
-- The function is resolved from the current database as the function name is unqualified.
REFRESH FUNCTION func1;

-- The cached entry of the function will be refreshed
-- The function is resolved from the db1 database as the function name is qualified.
REFRESH FUNCTION db1.func1;
```
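As a rough illustration of the name-resolution rule described under Parameters, the following Python sketch (not Spark source; `resolve_function_identifier` and its defaults are hypothetical) shows how an unqualified name picks up the current database while a qualified name keeps its own:

```python
# Hypothetical helper, not part of Spark: models how REFRESH FUNCTION
# resolves its function_identifier argument per the doc's
# [ database_name. ] function_name syntax.

def resolve_function_identifier(identifier: str, current_database: str = "default"):
    """Split a [database_name.]function_name string into (database, name)."""
    parts = identifier.split(".")
    if len(parts) == 1:
        # Unqualified name: fall back to the current database.
        return (current_database, parts[0])
    if len(parts) == 2:
        # Qualified name: keep the explicit database part.
        return (parts[0], parts[1])
    raise ValueError(f"Unsupported function name '{identifier}'")

resolve_function_identifier("func1")      # ('default', 'func1')
resolve_function_identifier("db1.func1")  # ('db1', 'func1')
```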

### Related Statements

* [CACHE TABLE](sql-ref-syntax-aux-cache-cache-table.html)
* [CLEAR CACHE](sql-ref-syntax-aux-cache-clear-cache.html)
* [UNCACHE TABLE](sql-ref-syntax-aux-cache-uncache-table.html)
* [REFRESH TABLE](sql-ref-syntax-aux-cache-refresh-table.html)
* [REFRESH](sql-ref-syntax-aux-cache-refresh.html)
1 change: 1 addition & 0 deletions docs/sql-ref-syntax-aux-cache-refresh-table.md
@@ -57,3 +57,4 @@ REFRESH TABLE tempDB.view1;
* [CLEAR CACHE](sql-ref-syntax-aux-cache-clear-cache.html)
* [UNCACHE TABLE](sql-ref-syntax-aux-cache-uncache-table.html)
* [REFRESH](sql-ref-syntax-aux-cache-refresh.html)
* [REFRESH FUNCTION](sql-ref-syntax-aux-cache-refresh-function.html)
1 change: 1 addition & 0 deletions docs/sql-ref-syntax-aux-cache-refresh.md
@@ -54,3 +54,4 @@ REFRESH "hdfs://path/to/table";
* [CLEAR CACHE](sql-ref-syntax-aux-cache-clear-cache.html)
* [UNCACHE TABLE](sql-ref-syntax-aux-cache-uncache-table.html)
* [REFRESH TABLE](sql-ref-syntax-aux-cache-refresh-table.html)
* [REFRESH FUNCTION](sql-ref-syntax-aux-cache-refresh-function.html)
1 change: 1 addition & 0 deletions docs/sql-ref-syntax-aux-cache-uncache-table.md
@@ -50,3 +50,4 @@ UNCACHE TABLE t1;
* [CLEAR CACHE](sql-ref-syntax-aux-cache-clear-cache.html)
* [REFRESH TABLE](sql-ref-syntax-aux-cache-refresh-table.html)
* [REFRESH](sql-ref-syntax-aux-cache-refresh.html)
* [REFRESH FUNCTION](sql-ref-syntax-aux-cache-refresh-function.html)
3 changes: 2 additions & 1 deletion docs/sql-ref-syntax-aux-cache.md
@@ -23,4 +23,5 @@ license: |
* [UNCACHE TABLE statement](sql-ref-syntax-aux-cache-uncache-table.html)
* [CLEAR CACHE statement](sql-ref-syntax-aux-cache-clear-cache.html)
* [REFRESH TABLE statement](sql-ref-syntax-aux-cache-refresh-table.html)
* [REFRESH statement](sql-ref-syntax-aux-cache-refresh.html)
* [REFRESH statement](sql-ref-syntax-aux-cache-refresh.html)
* [REFRESH FUNCTION statement](sql-ref-syntax-aux-cache-refresh-function.html)
1 change: 1 addition & 0 deletions docs/sql-ref-syntax.md
@@ -83,6 +83,7 @@ Spark SQL is Apache Spark's module for working with structured data. The SQL Syn
* [LIST JAR](sql-ref-syntax-aux-resource-mgmt-list-jar.html)
* [REFRESH](sql-ref-syntax-aux-cache-refresh.html)
* [REFRESH TABLE](sql-ref-syntax-aux-cache-refresh-table.html)
* [REFRESH FUNCTION](sql-ref-syntax-aux-cache-refresh-function.html)
* [RESET](sql-ref-syntax-aux-conf-mgmt-reset.html)
* [SET](sql-ref-syntax-aux-conf-mgmt-set.html)
* [SHOW COLUMNS](sql-ref-syntax-aux-show-columns.html)
@@ -229,6 +229,7 @@ statement
comment=(STRING | NULL) #commentNamespace
| COMMENT ON TABLE multipartIdentifier IS comment=(STRING | NULL) #commentTable
| REFRESH TABLE multipartIdentifier #refreshTable
| REFRESH FUNCTION multipartIdentifier #refreshFunction
| REFRESH (STRING | .*?) #refreshResource
| CACHE LAZY? TABLE multipartIdentifier
(OPTIONS options=tablePropertyList)? (AS? query)? #cacheTable
@@ -1886,11 +1886,17 @@ class Analyzer(
}

/**
* Replaces [[UnresolvedFunc]]s with concrete [[LogicalPlan]]s.
* Replaces [[UnresolvedFunction]]s with concrete [[Expression]]s.
*/
object ResolveFunctions extends Rule[LogicalPlan] {
val trimWarningEnabled = new AtomicBoolean(true)
def apply(plan: LogicalPlan): LogicalPlan = plan.resolveOperatorsUp {
// Resolve functions with concrete relations from v2 catalog.
case UnresolvedFunc(multipartIdent) =>
val funcIdent = parseSessionCatalogFunctionIdentifier(multipartIdent, "function lookup")
ResolvedFunc(Identifier.of(funcIdent.database.toArray, funcIdent.funcName))

case q: LogicalPlan =>
q transformExpressions {
case u if !u.childrenResolved => u // Skip until children are resolved.
@@ -17,6 +17,7 @@

package org.apache.spark.sql.catalyst.analysis

import org.apache.spark.sql.catalyst.catalog.CatalogFunction
import org.apache.spark.sql.catalyst.expressions.Attribute
import org.apache.spark.sql.catalyst.plans.logical.{LeafNode, LogicalPlan}
import org.apache.spark.sql.connector.catalog.{CatalogPlugin, Identifier, SupportsNamespaces, Table, TableCatalog}
@@ -50,6 +51,15 @@ case class UnresolvedTableOrView(multipartIdentifier: Seq[String]) extends LeafN
override def output: Seq[Attribute] = Nil
}

/**
* Holds the name of a function that has yet to be looked up in a catalog. It will be resolved to
* [[ResolvedFunc]] during analysis.
*/
case class UnresolvedFunc(multipartIdentifier: Seq[String]) extends LeafNode {
override lazy val resolved: Boolean = false
override def output: Seq[Attribute] = Nil
}

/**
* A plan containing resolved namespace.
*/
@@ -74,3 +84,13 @@ case class ResolvedTable(catalog: TableCatalog, identifier: Identifier, table: T
case class ResolvedView(identifier: Identifier) extends LeafNode {
override def output: Seq[Attribute] = Nil
}

/**
* A plan containing a resolved function.
*/
// TODO: create a generic representation for v1 and v2 functions, after we add function
// support to the v2 catalog. For now we only need the identifier to fall back to the v1 command.
case class ResolvedFunc(identifier: Identifier)
extends LeafNode {
override def output: Seq[Attribute] = Nil
}
@@ -1341,6 +1341,14 @@ class SessionCatalog(
functionRegistry.registerFunction(func, info, builder)
}

/**
* Unregisters a temporary or permanent function from the session-specific [[FunctionRegistry]].
* Returns true if the function existed.
*/
def unregisterFunction(name: FunctionIdentifier): Boolean = {
functionRegistry.dropFunction(name)
}

/**
* Drop a temporary function.
*/
@@ -3642,6 +3642,11 @@ class AstBuilder(conf: SQLConf) extends SqlBaseBaseVisitor[AnyRef] with Logging
ctx.REPLACE != null)
}

override def visitRefreshFunction(ctx: RefreshFunctionContext): LogicalPlan = withOrigin(ctx) {
val functionIdentifier = visitMultipartIdentifier(ctx.multipartIdentifier)
RefreshFunction(UnresolvedFunc(functionIdentifier))
}

override def visitCommentNamespace(ctx: CommentNamespaceContext): LogicalPlan = withOrigin(ctx) {
val comment = ctx.comment.getType match {
case SqlBaseParser.NULL => ""
@@ -516,3 +516,10 @@ case class CommentOnNamespace(child: LogicalPlan, comment: String) extends Comma
case class CommentOnTable(child: LogicalPlan, comment: String) extends Command {
override def children: Seq[LogicalPlan] = child :: Nil
}

/**
* The logical plan of the REFRESH FUNCTION command that works for v2 catalogs.
*/
case class RefreshFunction(child: LogicalPlan) extends Command {
override def children: Seq[LogicalPlan] = child :: Nil
}
@@ -18,7 +18,7 @@
package org.apache.spark.sql.connector.catalog

import org.apache.spark.sql.AnalysisException
import org.apache.spark.sql.catalyst.TableIdentifier
import org.apache.spark.sql.catalyst.{FunctionIdentifier, TableIdentifier}
import org.apache.spark.sql.catalyst.catalog.BucketSpec
import org.apache.spark.sql.catalyst.parser.CatalystSqlParser
import org.apache.spark.sql.connector.expressions.{BucketTransform, IdentityTransform, LogicalExpressions, Transform}
@@ -107,6 +107,14 @@ private[sql] object CatalogV2Implicits {
throw new AnalysisException(
s"$quoted is not a valid TableIdentifier as it has more than 2 name parts.")
}

def asFunctionIdentifier: FunctionIdentifier = ident.namespace() match {
case ns if ns.isEmpty => FunctionIdentifier(ident.name())
case Array(dbName) => FunctionIdentifier(ident.name(), Some(dbName))
case _ =>
throw new AnalysisException(
s"$quoted is not a valid FunctionIdentifier as it has more than 2 name parts.")
}
}
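The `asFunctionIdentifier` conversion above can be modeled in a few lines of Python (an illustrative sketch, not the Spark implementation; the returned tuple is a stand-in for `FunctionIdentifier`):

```python
# Sketch of asFunctionIdentifier: an empty namespace yields an unqualified
# name, one namespace part becomes the database, and deeper namespaces
# are rejected, mirroring the AnalysisException in the Scala code above.

def as_function_identifier(namespace, name):
    if len(namespace) == 0:
        return (name, None)          # like FunctionIdentifier(name)
    if len(namespace) == 1:
        return (name, namespace[0])  # like FunctionIdentifier(name, Some(db))
    quoted = ".".join(list(namespace) + [name])
    raise ValueError(
        f"{quoted} is not a valid FunctionIdentifier as it has more than 2 name parts.")
```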

implicit class MultipartIdentifierHelper(parts: Seq[String]) {
@@ -19,7 +19,7 @@ package org.apache.spark.sql.connector.catalog

import org.apache.spark.internal.Logging
import org.apache.spark.sql.AnalysisException
import org.apache.spark.sql.catalyst.TableIdentifier
import org.apache.spark.sql.catalyst.{FunctionIdentifier, TableIdentifier}
import org.apache.spark.sql.internal.{SQLConf, StaticSQLConf}

/**
@@ -155,4 +155,31 @@ private[sql] trait LookupCatalog extends Logging {
None
}
}

// TODO: move function related v2 statements to the new framework.
Contributor: @imback82 do you have time to work on this TODO?

Contributor: Yea, I will get to it once this PR is merged.

Contributor: Created #29198

def parseSessionCatalogFunctionIdentifier(
nameParts: Seq[String],
sql: String): FunctionIdentifier = {
if (nameParts.length == 1 && catalogManager.v1SessionCatalog.isTempFunction(nameParts.head)) {
return FunctionIdentifier(nameParts.head)
}

nameParts match {
case SessionCatalogAndIdentifier(_, ident) =>
if (nameParts.length == 1) {
// If there is only one name part, it means the current catalog is the session catalog.
// Here we don't fill the default database, to keep the error message unchanged for
// v1 commands.
FunctionIdentifier(nameParts.head, None)
} else {
ident.namespace match {
case Array(db) => FunctionIdentifier(ident.name, Some(db))
case _ =>
throw new AnalysisException(s"Unsupported function name '$ident'")
}
}

case _ => throw new AnalysisException(s"$sql is only supported in v1 catalog")
}
}
}
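A rough Python model of `parseSessionCatalogFunctionIdentifier` (a sketch of the branching only, not Spark code; the `temp_functions` set and the literal `spark_catalog` name stand in for the catalog-manager state):

```python
# Models the branching above: a single-part name matching a temp function
# resolves as-is; otherwise the name must land in the session catalog, with
# at most one database part; anything else is rejected.

def parse_session_catalog_function_identifier(name_parts, sql,
                                              temp_functions=frozenset(),
                                              current_catalog_is_session=True):
    if len(name_parts) == 1 and name_parts[0] in temp_functions:
        return (name_parts[0], None)

    parts = list(name_parts)
    in_session = current_catalog_is_session
    if parts and parts[0] == "spark_catalog":  # explicit session-catalog prefix
        parts = parts[1:]
        in_session = True
    if not in_session:
        raise ValueError(f"{sql} is only supported in v1 catalog")
    if len(parts) == 1:
        # Leave the database unset to keep v1 error messages unchanged.
        return (parts[0], None)
    if len(parts) == 2:
        return (parts[1], parts[0])            # (funcName, db)
    raise ValueError(f"Unsupported function name '{'.'.join(name_parts)}'")
```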
@@ -20,7 +20,7 @@ package org.apache.spark.sql.catalyst.parser
import java.util.Locale

import org.apache.spark.sql.AnalysisException
import org.apache.spark.sql.catalyst.analysis.{AnalysisTest, GlobalTempView, LocalTempView, PersistedView, UnresolvedAttribute, UnresolvedNamespace, UnresolvedRelation, UnresolvedStar, UnresolvedTable, UnresolvedTableOrView}
import org.apache.spark.sql.catalyst.analysis.{AnalysisTest, GlobalTempView, LocalTempView, PersistedView, UnresolvedAttribute, UnresolvedFunc, UnresolvedNamespace, UnresolvedRelation, UnresolvedStar, UnresolvedTable, UnresolvedTableOrView}
import org.apache.spark.sql.catalyst.catalog.{ArchiveResource, BucketSpec, FileResource, FunctionResource, FunctionResourceType, JarResource}
import org.apache.spark.sql.catalyst.expressions.{EqualTo, Literal}
import org.apache.spark.sql.catalyst.plans.logical._
@@ -2109,6 +2109,15 @@ class DDLParserSuite extends AnalysisTest {
"Operation not allowed: CREATE FUNCTION with resource type 'other'")
}

test("REFRESH FUNCTION") {
parseCompare("REFRESH FUNCTION c",
RefreshFunction(UnresolvedFunc(Seq("c"))))
parseCompare("REFRESH FUNCTION b.c",
RefreshFunction(UnresolvedFunc(Seq("b", "c"))))
parseCompare("REFRESH FUNCTION a.b.c",
RefreshFunction(UnresolvedFunc(Seq("a", "b", "c"))))
}

private case class TableSpec(
name: Seq[String],
schema: Option[StructType],
@@ -611,33 +611,11 @@ class ResolveSessionCatalog(
CreateFunctionCommand(database, function, className, resources, isTemp, ignoreIfExists,
replace)
}
}

// TODO: move function related v2 statements to the new framework.
private def parseSessionCatalogFunctionIdentifier(
Contributor (Author): Move this method to LookupCatalog.CatalogAndFunctionIdentifier and drop the sql param.

Member: This PR needs the change?

nameParts: Seq[String],
sql: String): FunctionIdentifier = {
if (nameParts.length == 1 && isTempFunction(nameParts.head)) {
return FunctionIdentifier(nameParts.head)
}

nameParts match {
case SessionCatalogAndIdentifier(_, ident) =>
if (nameParts.length == 1) {
// If there is only one name part, it means the current catalog is the session catalog.
// Here we don't fill the default database, to keep the error message unchanged for
// v1 commands.
FunctionIdentifier(nameParts.head, None)
} else {
ident.namespace match {
case Array(db) => FunctionIdentifier(ident.name, Some(db))
case _ =>
throw new AnalysisException(s"Unsupported function name '$ident'")
}
}

case _ => throw new AnalysisException(s"$sql is only supported in v1 catalog")
}
case RefreshFunction(ResolvedFunc(identifier)) =>
// Fall back to the v1 command
val funcIdentifier = identifier.asFunctionIdentifier
RefreshFunctionCommand(funcIdentifier.database, funcIdentifier.funcName)
}

private def parseV1Table(tableName: Seq[String], sql: String): Seq[String] = tableName match {
@@ -236,6 +236,45 @@ case class ShowFunctionsCommand(
}
}


/**
* A command for users to refresh a persistent function.
* The syntax of using this command in SQL is:
* {{{
* REFRESH FUNCTION functionName
* }}}
*/
case class RefreshFunctionCommand(
databaseName: Option[String],
functionName: String)
extends RunnableCommand {

override def run(sparkSession: SparkSession): Seq[Row] = {
val catalog = sparkSession.sessionState.catalog
if (FunctionRegistry.builtin.functionExists(FunctionIdentifier(functionName))) {
Member: We still can create persistent function with the same name as the built-in function. For example,

CREATE FUNCTION rand AS 'org.apache.spark.sql.catalyst.expressions.Abs'
DESC function default.rand

I think we should still allow this case.

Contributor (Author): It seems no meaning to refresh a persistent function whose name is same as a built-in function.

Yes, we can create a persistent function with the same name as the built-in function, but just create in metastore. The actual function we used is the built-in function. The reason is built-in functions are pre-cached in registry and we lookup cached function first.

e.g., CREATE FUNCTION rand AS 'xxx', DESC FUNCTION rand will always return Class: org.apache.spark.sql.catalyst.expressions.Rand.

BTW, maybe it's the reason why we create function and load it lazy that just be a Hive client, otherwise we can't create such function like rand, md5 in metastore. @cloud-fan

Contributor: how about

CREATE FUNCTION rand AS 'xxx';
DESC FUNCTION default.rand;

I think this is similar to table and temp views. Spark will try to look up temp view first, so if the name conflicts, temp view is preferred. But users can still use a qualified table name to read the table explicitly.

Contributor (Author): You are right. Missed qualified name case, I will fix this in followup.

throw new AnalysisException(s"Cannot refresh builtin function $functionName")
Member: Nit: built-in

Contributor (Author): get it.

}
if (catalog.isTemporaryFunction(FunctionIdentifier(functionName, databaseName))) {
throw new AnalysisException(s"Cannot refresh temporary function $functionName")
}

val identifier = FunctionIdentifier(
functionName, Some(databaseName.getOrElse(catalog.getCurrentDatabase)))
// We only refresh permanent functions.
if (catalog.isPersistentFunction(identifier)) {
// Re-register the function, overwriting the cached entry.
val func = catalog.getFunctionMetadata(identifier)
catalog.registerFunction(func, true)
} else {
// Clear the cached function and throw an exception.
catalog.unregisterFunction(identifier)
throw new NoSuchFunctionException(identifier.database.get, identifier.funcName)
}

Seq.empty[Row]
}
}
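The control flow of `RefreshFunctionCommand.run` above can be sketched as a small Python model (the sets and dictionaries are stand-ins for `FunctionRegistry` and `SessionCatalog` state; this is not Spark code):

```python
# Models RefreshFunctionCommand.run: built-in and temporary functions are
# rejected; a permanent function is re-registered from the metastore,
# overwriting the cached entry; a missing function has any stale cached
# entry cleared before the "undefined function" error is raised.

class FunctionRegistryModel:
    def __init__(self, builtin, temporary, metastore):
        self.builtin = set(builtin)      # pre-cached built-in function names
        self.temporary = set(temporary)  # session-scoped functions
        self.metastore = dict(metastore) # name -> class name (persistent)
        self.cache = {}                  # registered (cached) entries

    def refresh_function(self, name):
        if name in self.builtin:
            raise ValueError(f"Cannot refresh built-in function {name}")
        if name in self.temporary:
            raise ValueError(f"Cannot refresh temporary function {name}")
        if name in self.metastore:
            # Re-register from the metastore, overwriting any cached entry.
            self.cache[name] = self.metastore[name]
        else:
            # Clear any stale cached entry, then report the missing function.
            self.cache.pop(name, None)
            raise LookupError(f"Undefined function: {name}")

reg = FunctionRegistryModel(
    builtin={"rand"}, temporary={"tmp_f"},
    metastore={"func1": "com.example.Func1"})
reg.refresh_function("func1")
```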

object FunctionsCommand {
// operators that do not have corresponding functions.
// They should be handled `DescribeFunctionCommand`, `ShowFunctionsCommand`