add spark.driver.memory to config docs #2410

Conversation

Can one of the admins verify this patch?

docs/configuration.md (outdated)
Can you call this SparkContext to be consistent with other places in the docs?

LGTM. I'm surprised this isn't documented.

One note about this setting: I'm not sure it works in all cases -- if you pass it to the driver as a parameter, it's too late to take effect (the JVM has already started). Maybe the docs should mention where it does and doesn't work?
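
To make the timing issue concrete, here is a minimal sketch (not from this PR; the class and app names are hypothetical) of the pattern that does not work in client mode, with the alternatives that do work in the trailing comments:

```scala
// Sketch of the timing problem discussed above (names are hypothetical).
// In client mode the driver JVM is already running by the time this code
// executes, so setting spark.driver.memory here cannot change its heap.
import org.apache.spark.{SparkConf, SparkContext}

object DriverMemoryExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("driver-memory-example")
      .set("spark.driver.memory", "2g") // too late: the heap was fixed at JVM launch
    val sc = new SparkContext(conf)
    // ... job logic ...
    sc.stop()
  }
}

// Setting the value before the driver JVM starts does work, e.g.:
//   spark-submit --driver-memory 2g --class DriverMemoryExample app.jar
// or in conf/spark-defaults.conf:
//   spark.driver.memory  2g
```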

using

@ash211 I believe it'll work actually, because we treat

@andrewor14 the

Ah sorry, you're right, it will work only if it's
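
As a hedged way to check which value actually took effect (assuming the app is launched through spark-submit), one can compare the configured property against the driver JVM's real maximum heap:

```scala
// Sketch: report the configured spark.driver.memory alongside the driver
// JVM's actual max heap. If the property was set too late (e.g. through
// SparkConf in client mode), the two values will disagree.
import org.apache.spark.{SparkConf, SparkContext}

object CheckDriverMemory {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("check-driver-memory"))
    val configured = sc.getConf.getOption("spark.driver.memory").getOrElse("(unset)")
    val actualHeapMb = Runtime.getRuntime.maxMemory() / (1024L * 1024L)
    println(s"spark.driver.memory = $configured, actual driver max heap ~ ${actualHeapMb}MB")
    sc.stop()
  }
}
```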

Hey @nartz, can you also add

I can, but it seems the scope of this pull request may be too small, as it would need to encapsulate and explain a bunch of new parameters popping up, plus the nuances of when to use spark.driver.memory (I see many people having problems on the mailing list). Maybe we should just turn it into an issue that can be more comprehensive?

Ok, fair enough. Though none of these configs are actually new; they have been around since 1.0 but were just never documented. I can merge this and file a separate JIRA for these other configs. Thanks @nartz.

Also could you add

Ok I merged this into master.

It took me a minute to track this down, so I thought it could be useful to have it in the docs. I'm unsure whether 512mb is the default for spark.driver.memory. Also, there could be a better value for the 'description' to differentiate it from spark.executor.memory.
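
For reference, a sketch of what the docs/configuration.md entry could look like, following that file's existing HTML-table convention (the 512m default and the SparkContext wording are taken from the discussion above, so treat the exact values as assumptions to verify):

```html
<tr>
  <td><code>spark.driver.memory</code></td>
  <td>512m</td>
  <td>
    Amount of memory to use for the driver process, i.e. where SparkContext is
    initialized. Note that in client mode this must be set before the driver
    JVM starts (e.g. via --driver-memory or spark-defaults.conf), not through
    SparkConf in the application.
  </td>
</tr>
```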