Conversation

@adrian-wang
Contributor

What changes were proposed in this pull request?

This adds support for a custom context for the bin/spark-sql and sbin/start-thriftserver commands. Any context derived from HiveContext is acceptable. Users need to set the class name of the custom context in the spark.sql.context.class config and make sure the class is on the classpath. This provides a more elegant way for infrastructure teams to apply custom configurations and changes.
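
For illustration, a minimal sketch (not part of the PR) of what a custom context and its configuration could look like; the package com.mycompany, the class name MyCompanyContext, and the extra setting are hypothetical, and the only parts taken from this PR are the spark.sql.context.class key and the requirement that the class extend HiveContext.

  package com.mycompany

  import org.apache.spark.SparkContext
  import org.apache.spark.sql.hive.HiveContext

  // Hypothetical custom context: any subclass of HiveContext would be accepted.
  // An infrastructure team could bake default settings or setup logic in here.
  class MyCompanyContext(sc: SparkContext) extends HiveContext(sc) {
    setConf("spark.sql.shuffle.partitions", "400")
  }

The class would then be placed on the classpath (for example via --jars) and selected with --conf spark.sql.context.class=com.mycompany.MyCompanyContext when launching bin/spark-sql or sbin/start-thriftserver.sh.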

How was this patch tested?

Added a unit test in CliSuite to track this.

cc @chenghao-intel

@SparkQA

SparkQA commented Mar 19, 2016

Test build #53613 has finished for PR 11843 at commit 654397e.

  • This patch fails Scala style tests.
  • This patch does not merge cleanly.
  • This patch adds the following public classes (experimental):
    • log.warn(s"Configured context class $className is not a subclass of HiveContext," +

@SparkQA

SparkQA commented Mar 19, 2016

Test build #53614 has finished for PR 11843 at commit 864e1b6.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • log.warn(s"Configured context class $className is not a subclass of HiveContext," +


private def readContextClassFromConf(sparkConf: SparkConf): Class[_ <: HiveContext] = {
  val className =
    sparkConf.get("spark.sql.context.class", "org.apache.spark.sql.hive.HiveContext")
Contributor


Instead of hard-coding the default, can we use classOf[HiveContext].getName()?
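
A sketch of how that suggestion might look, assuming the method continues roughly along the lines of the log.warn call quoted by SparkQA above; the reflection lookup via Utils.classForName and the fallback branch are assumptions about the rest of the method, not code confirmed by this PR:

  import org.apache.spark.SparkConf
  import org.apache.spark.sql.hive.HiveContext
  import org.apache.spark.util.Utils

  private def readContextClassFromConf(sparkConf: SparkConf): Class[_ <: HiveContext] = {
    // Use the class literal instead of a hard-coded string for the default.
    val className =
      sparkConf.get("spark.sql.context.class", classOf[HiveContext].getName)
    val clazz = Utils.classForName(className)
    if (classOf[HiveContext].isAssignableFrom(clazz)) {
      clazz.asSubclass(classOf[HiveContext])
    } else {
      // Assumed fallback: warn and use the plain HiveContext.
      log.warn(s"Configured context class $className is not a subclass of HiveContext, " +
        "falling back to org.apache.spark.sql.hive.HiveContext.")
      classOf[HiveContext]
    }
  }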

@SparkQA

SparkQA commented Mar 24, 2016

Test build #54020 has finished for PR 11843 at commit 35c7a8f.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@adrian-wang changed the title from "[SPARK-14021][SQL][WIP] custom context support for SparkSQLEnv" to "[SPARK-14021][SQL] custom context support for SparkSQLEnv" on Mar 24, 2016
@chenghao-intel
Contributor

cc @rxin @liancheng

@chenghao-intel
Contributor

cc @yhuai, this is critical for our own customized HiveContext; can you please merge this?

@rxin
Contributor

rxin commented Mar 28, 2016

HiveContext is going to be deprecated in Spark 2.0. Please see https://issues.apache.org/jira/browse/SPARK-13485

I don't think it makes sense to merge this at this point. If you need this, you should do something once SparkSession is introduced.

@andrewor14
Contributor

Let's close this PR since HiveContext is now removed. If there is interest, feel free to re-open it and make the changes based on SessionState instead.
