[SPARK-12855][MINOR][SQL][DOC][TEST] remove spark.sql.dialect from doc and test #11758

Closed · wants to merge 1 commit
docs/sql-programming-guide.md (7 changes: 0 additions & 7 deletions)
@@ -122,13 +122,6 @@
 is recommended for the 1.3 release of Spark. Future releases will focus on bringing `SQLContext` up
 to feature parity with a `HiveContext`.
 
-The specific variant of SQL that is used to parse queries can also be selected using the
-`spark.sql.dialect` option. This parameter can be changed using either the `setConf` method on
-a `SQLContext` or by using a `SET key=value` command in SQL. For a `SQLContext`, the only dialect
-available is "sql" which uses a simple SQL parser provided by Spark SQL. In a `HiveContext`, the
-default is "hiveql", though "sql" is also available. Since the HiveQL parser is much more complete,
-this is recommended for most use cases.
-
 
 ## Creating DataFrames
 
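For context, the deleted paragraph was the place the docs explained how to switch parsers. A rough sketch of the Spark 1.x usage it described is below; it is shown for orientation only, since SPARK-12855 removes the `spark.sql.dialect` option itself, and the app name and local master are illustrative.

```scala
// Sketch of the pre-removal usage described by the deleted paragraph (context only;
// spark.sql.dialect is being removed, so this no longer applies going forward).
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val sc = new SparkContext(new SparkConf().setAppName("dialect-sketch").setMaster("local[*]"))
val hiveContext = new HiveContext(sc)

// Switch the parser programmatically via setConf...
hiveContext.setConf("spark.sql.dialect", "sql")
// ...or through a SET key=value command in SQL.
hiveContext.sql("SET spark.sql.dialect=hiveql")
```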
@@ -79,7 +79,7 @@ class SQLContext private[sql](
   def this(sparkContext: JavaSparkContext) = this(sparkContext.sc)
 
   // If spark.sql.allowMultipleContexts is true, we will throw an exception if a user
-  // wants to create a new root SQLContext (a SLQContext that is not created by newSession).
+  // wants to create a new root SQLContext (a SQLContext that is not created by newSession).
   private val allowMultipleContexts =
     sparkContext.conf.getBoolean(
       SQLConf.ALLOW_MULTIPLE_CONTEXTS.key,
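The context lines around this typo fix read the `spark.sql.allowMultipleContexts` flag, and the corrected comment distinguishes a root `SQLContext` from sessions created by `newSession()`. A minimal sketch of that distinction against the pre-2.0 API (names and the local master are illustrative):

```scala
// Sketch: a root SQLContext is built directly from a SparkContext; newSession()
// derives an additional, isolated session without creating another root context.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val conf = new SparkConf()
  .setAppName("allow-multiple-contexts-sketch")
  .setMaster("local[*]")
  .set("spark.sql.allowMultipleContexts", "false") // disallow extra root contexts

val sc = new SparkContext(conf)
val rootContext = new SQLContext(sc)    // the single root SQLContext
val session = rootContext.newSession()  // allowed: not a new root context
// Constructing a second `new SQLContext(sc)` here is the case the guarded code targets.
```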
@@ -270,12 +270,6 @@ class HiveQuerySuite extends HiveComparisonTest with BeforeAndAfter {
     "SELECT 11 % 10, IF((101.1 % 100.0) BETWEEN 1.01 AND 1.11, \"true\", \"false\"), " +
     "(101 / 2) % 10 FROM src LIMIT 1")
 
-  test("Query expressed in SQL") {
-    setConf("spark.sql.dialect", "sql")
-    assert(sql("SELECT 1").collect() === Array(Row(1)))
-    setConf("spark.sql.dialect", "hiveql")
-  }
-
   test("Query expressed in HiveQL") {
     sql("FROM src SELECT key").collect()
   }
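For context, the deleted test only verified that a trivial query parses after switching to the plain Spark SQL parser, then restored the HiveContext default. An equivalent interactive check under the old API would have looked roughly like this (a `HiveContext` named `hiveContext` is assumed):

```scala
// Hypothetical interactive form of the deleted check (Spark 1.x, hiveContext assumed);
// shown only to illustrate what the removed test covered before the dialect went away.
import org.apache.spark.sql.Row

hiveContext.setConf("spark.sql.dialect", "sql")    // use the plain Spark SQL parser
assert(hiveContext.sql("SELECT 1").collect().toSeq == Seq(Row(1)))
hiveContext.setConf("spark.sql.dialect", "hiveql") // restore the HiveContext default
```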