[SPARK-12855][MINOR][SQL][DOC][TEST] remove spark.sql.dialect from doc and test

## What changes were proposed in this pull request?

Since the developer API for a pluggable parser was removed in #10801, the docs should be updated accordingly.
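
For context, here is a rough sketch of the `spark.sql.dialect` usage that the removed documentation paragraph and `HiveQuerySuite` test described (assuming a Spark 1.x `spark-shell` built with Hive support, where `sqlContext` is a `HiveContext`); it is illustrative only and not part of this patch:

```scala
// Rough sketch, not part of this change: how the now-removed dialect option was used.

// Switch to the simple Spark SQL parser -- the behavior exercised by the test removed below.
sqlContext.setConf("spark.sql.dialect", "sql")
sqlContext.sql("SELECT 1").collect()            // Array(Row(1)), per the removed test

// Switch back to the default HiveQL parser, either programmatically...
sqlContext.setConf("spark.sql.dialect", "hiveql")
// ...or with a SET command, as the removed doc paragraph described.
sqlContext.sql("SET spark.sql.dialect=hiveql")
```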

## How was this patch tested?

This patch does not affect any real code path; it only updates documentation, removes an obsolete test, and fixes a comment typo.

Author: Daoyuan Wang <daoyuan.wang@intel.com>

Closes #11758 from adrian-wang/spark12855.
adrian-wang authored and rxin committed Mar 17, 2016
1 parent c890c35 commit d1c193a
Showing 3 changed files with 1 addition and 14 deletions.
7 changes: 0 additions & 7 deletions docs/sql-programming-guide.md
@@ -122,13 +122,6 @@ Spark build. If these dependencies are not a problem for your application then u
is recommended for the 1.3 release of Spark. Future releases will focus on bringing `SQLContext` up
to feature parity with a `HiveContext`.

The specific variant of SQL that is used to parse queries can also be selected using the
`spark.sql.dialect` option. This parameter can be changed using either the `setConf` method on
a `SQLContext` or by using a `SET key=value` command in SQL. For a `SQLContext`, the only dialect
available is "sql" which uses a simple SQL parser provided by Spark SQL. In a `HiveContext`, the
default is "hiveql", though "sql" is also available. Since the HiveQL parser is much more complete,
this is recommended for most use cases.


## Creating DataFrames

@@ -80,7 +80,7 @@ class SQLContext private[sql](
def this(sparkContext: JavaSparkContext) = this(sparkContext.sc)

// If spark.sql.allowMultipleContexts is true, we will throw an exception if a user
// wants to create a new root SQLContext (a SLQContext that is not created by newSession).
// wants to create a new root SQLContext (a SQLContext that is not created by newSession).
private val allowMultipleContexts =
sparkContext.conf.getBoolean(
SQLConf.ALLOW_MULTIPLE_CONTEXTS.key,
@@ -270,12 +270,6 @@ class HiveQuerySuite extends HiveComparisonTest with BeforeAndAfter {
"SELECT 11 % 10, IF((101.1 % 100.0) BETWEEN 1.01 AND 1.11, \"true\", \"false\"), " +
"(101 / 2) % 10 FROM src LIMIT 1")

test("Query expressed in SQL") {
setConf("spark.sql.dialect", "sql")
assert(sql("SELECT 1").collect() === Array(Row(1)))
setConf("spark.sql.dialect", "hiveql")
}

test("Query expressed in HiveQL") {
sql("FROM src SELECT key").collect()
}
