
[SPARK-20033][SQL] support hive permanent function #17362

Closed · wants to merge 4 commits

Conversation

cenyuhai (Contributor)

What changes were proposed in this pull request?

Support Hive permanent functions whose resources are stored on HDFS.
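For context, a Hive permanent function registers a UDF in the metastore together with the jar that implements it, and that jar may live on HDFS. An illustrative example (class name and jar path are hypothetical):

```sql
-- Register a permanent function whose implementation jar is on HDFS
CREATE FUNCTION my_udf AS 'com.example.MyUDF'
USING JAR 'hdfs:///user/udf/my-udf.jar';
```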

How was this patch tested?


SparkQA commented Mar 20, 2017

Test build #74885 has finished for PR 17362 at commit 703d23e.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.


SparkQA commented Mar 20, 2017

Test build #74889 has finished for PR 17362 at commit a668bb5.

  • This patch fails to build.
  • This patch merges cleanly.
  • This patch adds no public classes.


SparkQA commented Mar 20, 2017

Test build #74890 has finished for PR 17362 at commit 9325a7c.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.


SparkQA commented Mar 21, 2017

Test build #74912 has finished for PR 17362 at commit 92c077d.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@gatorsmile (Member)

I think we already support Hive UDFs. See the test cases for permanent Hive UDFs.

```scala
  functionResourceLoader.loadResource(resource)
} else {
  val sessionState = SessionState.get()
  val localPath = sessionState.add_resource(resourceType, resource.uri)
```

Are you trying to create a Hive UDF via Spark, then call it through Hive?

cenyuhai (Contributor Author)

No, I just use sessionState.add_resource to download the resources.

@cenyuhai (Contributor Author)

@gatorsmile Hi, Spark only supports Hive UDFs whose resources are on the local disk, or passed via spark-sql --jars xxx.jar or similar. I don't think Spark supports Hive permanent functions whose resources are on HDFS, e.g.:
CREATE FUNCTION hdfs_udf AS 'xxx.udf' USING JAR 'hdfs:///user/udf/.jar';
My PR just downloads the HDFS resources and adds them to the classpath.
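The mechanism described here can be sketched outside of Spark/Hive: resolve a resource URI to a local path, fetching the bytes first if the URI is remote, then hand the local path to whatever loader puts it on the classpath. A minimal Python analogy (all names are hypothetical; the HDFS download is stood in for by an injectable `fetch` callback, since a real implementation would use an HDFS client):

```python
import shutil
from pathlib import Path
from urllib.parse import urlparse

def add_resource(uri: str, cache_dir: str, fetch=shutil.copy) -> str:
    """Resolve a resource URI to a local path, fetching it if remote.

    Plays the role of SessionState.add_resource in this sketch: remote
    jars are materialized into a local cache so they can later be added
    to the classpath. `fetch(src, dst)` would be an HDFS download in the
    real case; the default (shutil.copy) only works for local paths.
    """
    parsed = urlparse(uri)
    if parsed.scheme in ("", "file"):
        # Already local: nothing to download.
        return parsed.path or uri
    # Remote scheme (e.g. hdfs://...): materialize into the cache,
    # keyed by the resource's basename.
    local = Path(cache_dir) / Path(parsed.path).name
    fetch(parsed.path, str(local))  # stand-in for the HDFS download
    return str(local)
```

Local URIs are returned as-is, while remote ones are copied down first; the caller then only ever deals in local paths.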


wangyum commented Mar 24, 2017

@weiqingy is working on "Allow adding jars from hdfs".

@cenyuhai (Contributor Author)

OK, I think @weiqingy's PR will resolve this problem.

@gatorsmile (Member)

@wangyum Thank you!

@cenyuhai Maybe you can close this PR now. Please also try the PR #17342 and see whether it resolves your issue. Thanks!
