
[SPARK-20204][SQL][Followup] SQLConf should react to change in default timezone settings #17537

Status: Closed · 4 commits
@@ -147,6 +147,14 @@ private[spark] class TypedConfigBuilder[T](
}
}

  /** Creates a [[ConfigEntry]] with a function has a default value */
Member commented:

Creates a [[ConfigEntry]] with a function has a default value
->
Creates a [[ConfigEntry]] with a function to determine the default value

  def createWithDefaultFunction(defaultFunc: () => T): ConfigEntry[T] = {
    val entry = new ConfigEntryWithDefaultFunction[T](parent.key, defaultFunc, converter,
      stringConverter, parent._doc, parent._public)
Contributor commented:

nit

val entry = new XXX(
  param1, param2, ...)

    parent._onCreate.foreach(_ (entry))
    entry
  }
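The point of taking a `() => T` rather than a plain `T` is that the default is re-evaluated on every read instead of being frozen when the entry is built. A self-contained sketch (hypothetical names, not the Spark API) contrasting the two styles:

```scala
// Sketch contrasting a default captured once with a default recomputed per read.
object DefaultStyles {
  // Eager: the default value is fixed when the entry is built.
  final class EagerEntry[T](val default: T)
  // Function-based: the default is a thunk, re-evaluated on each access.
  final class FuncEntry[T](val default: () => T)

  def main(args: Array[String]): Unit = {
    var tz = "GMT" // stands in for TimeZone.getDefault().getID()
    val eager = new EagerEntry(tz)      // snapshots "GMT" right now
    val func  = new FuncEntry(() => tz) // will observe later changes

    tz = "America/Los_Angeles"          // e.g. after TimeZone.setDefault(...)
    println(eager.default)              // prints GMT
    println(func.default())             // prints America/Los_Angeles
  }
}
```

This is exactly the difference the PR exploits for `spark.sql.session.timeZone`: the old eager default baked in whatever the JVM timezone was at class-load time.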

  /**
   * Creates a [[ConfigEntry]] that has a default value. The default value is provided as a
   * [[String]] and must be a valid value for the entry.
@@ -78,7 +78,24 @@ private class ConfigEntryWithDefault[T] (
  def readFrom(reader: ConfigReader): T = {
    reader.get(key).map(valueConverter).getOrElse(_defaultValue)
  }
}

private class ConfigEntryWithDefaultFunction[T] (
    key: String,
    _defaultFunction: () => T,
    valueConverter: String => T,
    stringConverter: T => String,
    doc: String,
    isPublic: Boolean)
  extends ConfigEntry(key, valueConverter, stringConverter, doc, isPublic) {

  override def defaultValue: Option[T] = Some(_defaultFunction())

  override def defaultValueString: String = stringConverter(_defaultFunction())

  def readFrom(reader: ConfigReader): T = {
    reader.get(key).map(valueConverter).getOrElse(_defaultFunction())
  }
}
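The `readFrom` contract above is worth spelling out: an explicitly configured value always wins, and the default function is only invoked when the key is absent. A hedged standalone sketch (hypothetical helper, not the Spark API):

```scala
// Sketch of the readFrom fallback: a set value is converted and returned;
// only on a missing key does the default function supply the value.
object ReadFromSketch {
  def readFrom[T](raw: Option[String], convert: String => T, default: () => T): T =
    raw.map(convert).getOrElse(default())

  def main(args: Array[String]): Unit = {
    // Key present: converter runs, default function is never called.
    println(readFrom(Some("42"), (s: String) => s.toInt, () => sys.error("not called")))
    // Key absent: default function supplies the value.
    println(readFrom(None, (s: String) => s.toInt, () => 7))
  }
}
```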

private class ConfigEntryWithDefaultString[T] (
@@ -17,6 +17,7 @@

package org.apache.spark.internal.config

import java.util.TimeZone
Member commented:
Remove it?

import java.util.concurrent.TimeUnit

import org.apache.spark.{SparkConf, SparkFunSuite}
@@ -51,6 +52,26 @@ class ConfigEntrySuite extends SparkFunSuite {
    assert(conf.get(dConf) === 20.0)
  }

  test("conf entry: timezone") {
Contributor commented:

we are testing ConfigEntryWithDefaultFunction, not timezone

Contributor commented:

also move this test to the end, it's weird to put this test in the middle of different config types tests.

    val tzStart = TimeZone.getDefault().getID()
Contributor commented:

The test can be as simple as:

var data = 0
val conf = new SparkConf()
val iConf = ConfigBuilder(testKey("int")).intConf.createWithDefaultFunction(() => data)
assert(conf.get(iConf) === 0)
data = 2
assert(conf.get(iConf) === 2)

Contributor (author) replied:

@cloud-fan Thank you :-). It's so much simpler. I was somehow trying to reuse the exact same function that we use in the main code.

    val conf = new SparkConf()
    val dConf = ConfigBuilder(testKey("tz"))
      .stringConf
      .createWithDefaultFunction(() => TimeZone.getDefault().getID())

    val tzConf = conf.get(dConf)
    assert(tzStart === tzConf)

    // Pick a timezone which is not the current timezone
    val availableTzs: Seq[String] = TimeZone.getAvailableIDs()
    val newTz = availableTzs.find(_ != tzStart).getOrElse(tzStart)
    TimeZone.setDefault(TimeZone.getTimeZone(newTz))

    val tzChanged = conf.get(dConf)
    assert(tzChanged === newTz)
    TimeZone.setDefault(TimeZone.getTimeZone(tzStart))
  }
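Stripped of the Spark plumbing, what this test exercises can be sketched standalone: a default computed through a function tracks changes to the JVM default timezone (object name here is hypothetical; `java.util.TimeZone` calls are the real API):

```scala
import java.util.TimeZone

// A function-valued default observes TimeZone.setDefault, where a value
// captured once would not.
object TzDefaultSketch {
  val defaultTz: () => String = () => TimeZone.getDefault.getID

  def main(args: Array[String]): Unit = {
    val start = defaultTz()
    val other = TimeZone.getAvailableIDs.find(_ != start).getOrElse(start)
    TimeZone.setDefault(TimeZone.getTimeZone(other))
    try {
      println(defaultTz() == other) // prints true: the new default is seen
    } finally {
      TimeZone.setDefault(TimeZone.getTimeZone(start)) // restore for other code
    }
  }
}
```

Note the `try/finally` restore: mutating the process-wide default timezone without restoring it would leak into other tests, which is why the suite's test above also resets it at the end.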

Member commented:
Nit: two more extra spaces

  test("conf entry: boolean") {
    val conf = new SparkConf()
    val bConf = ConfigBuilder(testKey("boolean")).booleanConf.createWithDefault(false)
@@ -752,7 +752,7 @@ object SQLConf {
  buildConf("spark.sql.session.timeZone")
    .doc("""The ID of session local timezone, e.g. "GMT", "America/Los_Angeles", etc.""")
    .stringConf
-   .createWithDefault(TimeZone.getDefault().getID())
+   .createWithDefaultFunction(() => TimeZone.getDefault().getID())
Member commented:
TimeZone.getDefault().getID()
->
TimeZone.getDefault.getID


  val WINDOW_EXEC_BUFFER_SPILL_THRESHOLD =
    buildConf("spark.sql.windowExec.buffer.spill.threshold")