[SPARK-20033][SQL] support hive permanent function #17362
Conversation
Test build #74885 has finished for PR 17362 at commit
Test build #74889 has finished for PR 17362 at commit
Test build #74890 has finished for PR 17362 at commit
Test build #74912 has finished for PR 17362 at commit
I think we already support Hive UDFs. See the test case for permanent Hive UDFs.
  functionResourceLoader.loadResource(resource)
} else {
  val sessionState = SessionState.get()
  val localPath = sessionState.add_resource(resourceType, resource.uri)
Are you trying to create a Hive UDF via Spark, then call it through Hive?
No, I just use sessionState.add_resource to download the resources.
@gatorsmile Hi, Spark only supports Hive UDFs whose resources are on the local disk, or added via spark-sql --jars xxx.jar, etc. But I think Spark doesn't support Hive permanent functions whose resources are on HDFS.
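A minimal, self-contained sketch of the branching discussed above (all names here are hypothetical stand-ins; the real code path delegates to Hive's SessionState.add_resource to localize a remote resource):

```scala
import java.net.URI

// Hypothetical model of a function resource (type + location).
case class FunctionResource(resourceType: String, uri: String)

// A resource is "local" when its URI has no scheme or a file:// scheme;
// anything else (e.g. hdfs://) must be downloaded first.
def isLocal(uri: String): Boolean = {
  val scheme = Option(new URI(uri).getScheme)
  scheme.isEmpty || scheme.contains("file")
}

// Stand-in for Hive's SessionState.add_resource, which localizes a
// remote resource and returns the local path (hypothetical stub).
def downloadToLocal(resource: FunctionResource): String =
  s"/tmp/${new URI(resource.uri).getPath.split('/').last}"

// Local resources load directly; remote ones are localized first.
def resolveLocalPath(resource: FunctionResource): String =
  if (isLocal(resource.uri)) resource.uri
  else downloadToLocal(resource)
```

This is only an illustration of why the else-branch above exists: the plain functionResourceLoader can handle local paths, while HDFS-backed resources need the extra localization step.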
@weiqingy is working on "Allow adding jars from hdfs".
Ok, I think @weiqingy's PR will resolve this problem.
What changes were proposed in this pull request?
Support Hive permanent functions whose resources (e.g. jars) are stored on HDFS rather than on the local disk.
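As a hedged illustration of the kind of statement this PR targets, the snippet below builds a Hive CREATE FUNCTION ... USING JAR statement pointing at an HDFS jar (the function name, class, and path are made up for the example):

```scala
// Hypothetical DDL for a permanent Hive function whose jar lives on
// HDFS rather than on the local disk.
val ddl =
  """CREATE FUNCTION my_upper AS 'com.example.MyUpperUDF'
    |USING JAR 'hdfs://namenode:8020/udfs/my-udfs.jar'""".stripMargin

// In a Hive-enabled Spark session one would run, e.g., spark.sql(ddl);
// with this change the jar should be fetched from HDFS at load time.
println(ddl)
```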
How was this patch tested?