[SPARK-26489][CORE] Use ConfigEntry for hardcoded configs for python/r categories #23428
Conversation
Test build #100640 has finished for PR 23428 at commit
@@ -733,4 +733,45 @@ package object config {
    .stringConf
    .toSequence
    .createWithDefault(Nil)

  private[spark] val PYTHON_WORKER_REUSE = ConfigBuilder("spark.python.worker.reuse")
I'd rather have separate files for these (e.g. Python.scala, R.scala) to avoid polluting this object. See other examples in the config package.
Thanks for the nice suggestion. I was thinking the config package object is already all-in-one (except History, Kafka, Status), so adding these configs here would be OK and we might get a chance to sort everything out later, but I agree we can do this earlier for clear cases. Will address.
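As a rough sketch of what the separate-file approach could look like (the file path, object layout, and the second entry's name and defaults are assumptions for illustration, not the code actually committed):

```scala
// Hypothetical core/src/main/scala/org/apache/spark/internal/config/Python.scala
// sketching how Python-related entries could live in their own object.
package org.apache.spark.internal.config

import java.util.concurrent.TimeUnit

private[spark] object Python {
  // Entry from the diff above; the default value here is assumed.
  val PYTHON_WORKER_REUSE = ConfigBuilder("spark.python.worker.reuse")
    .booleanConf
    .createWithDefault(true)

  // Illustrative second entry showing how related configs would be grouped together.
  val PYTHON_TASK_KILL_TIMEOUT = ConfigBuilder("spark.python.task.killTimeout")
    .timeConf(TimeUnit.MILLISECONDS)
    .createWithDefaultString("2s")
}
```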
Test build #100670 has finished for PR 23428 at commit
Test build #100672 has finished for PR 23428 at commit
Force-pushed from 85ea372 to d0a345d.
Test build #100675 has finished for PR 23428 at commit
retest this please
@@ -47,10 +48,8 @@ private[spark] class RBackend {

  def init(): (Int, RAuthHelper) = {
    val conf = new SparkConf()
    val backendConnectionTimeout = conf.getInt(
      "spark.r.backendConnectionTimeout", SparkRDefaults.DEFAULT_CONNECTION_TIMEOUT)
The whole file SparkRDefaults.scala no longer looks to be referenced anywhere. Can we delete this file?
Nice finding! Will remove.
@@ -733,4 +729,5 @@ package object config {
    .stringConf
    .toSequence
    .createWithDefault(Nil)
Not a big deal at all, but I'd remove this line.
OK, will address.
Hm, shouldn't we move other Python/R configurations into Python.scala and R.scala? For instance, I'm seeing spark.pyspark.driver.python and spark.executor.pyspark.memory.
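If those were moved as well, the entries might look something like the sketch below; the value types and the use of `createOptional` are assumptions for illustration, not the actual definitions:

```scala
// Hypothetical entries for the two configs mentioned above, grouped into the
// Python config object; types and optionality are assumed.
import org.apache.spark.network.util.ByteUnit

private[spark] val PYSPARK_DRIVER_PYTHON = ConfigBuilder("spark.pyspark.driver.python")
  .stringConf
  .createOptional

private[spark] val PYSPARK_EXECUTOR_MEMORY = ConfigBuilder("spark.executor.pyspark.memory")
  .bytesConf(ByteUnit.MiB)
  .createOptional
```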
Test build #100677 has finished for PR 23428 at commit
Test build #100691 has finished for PR 23428 at commit
Looks good. Merging to master.
Thanks @vanzin and @HyukjinKwon for reviewing and merging!
LGTM too
[SPARK-26489][CORE] Use ConfigEntry for hardcoded configs for python/r categories

## What changes were proposed in this pull request?

The PR makes hardcoded configs below to use ConfigEntry.

* spark.pyspark
* spark.python
* spark.r

This patch doesn't change configs which are not relevant to SparkConf (e.g. system properties, python source code)

## How was this patch tested?

Existing tests.

Closes apache#23428 from HeartSaVioR/SPARK-26489.

Authored-by: Jungtaek Lim (HeartSaVioR) <kabhwan@gmail.com>
Signed-off-by: Marcelo Vanzin <vanzin@cloudera.com>
What changes were proposed in this pull request?
The PR makes the hardcoded configs below use ConfigEntry.

* spark.pyspark
* spark.python
* spark.r

This patch doesn't change configs which are not relevant to SparkConf (e.g. system properties, python source code).
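As an illustration of the overall pattern this change applies, shown below as a hedged sketch (the key and entry names are examples, not necessarily the exact entries added by the PR):

```scala
// Before: raw string key with an inline default repeated at each call site.
// val reuse = conf.getBoolean("spark.python.worker.reuse", defaultValue = true)

// After: a typed ConfigEntry defined once in the config package and read by name.
val reuse = conf.get(PYTHON_WORKER_REUSE)
```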
How was this patch tested?
Existing tests.