add spark.driver.memory to config docs #2410

Closed

nartz wants to merge 2 commits into apache:master from nartz:docs/add-spark-driver-memory-to-config-docs

Conversation

@nartz
Contributor

@nartz nartz commented Sep 16, 2014

It took me a minute to track this down, so I thought it could be useful to have it in the docs.

I'm unsure if 512mb is the default for spark.driver.memory? Also - there could be a better value for the 'description' to differentiate it from spark.executor.memory.

@SparkQA

SparkQA commented Sep 16, 2014

Can one of the admins verify this patch?

Contributor

Can you call this SparkContext to be consistent with other places in the docs?

@andrewor14
Contributor

LGTM. I'm surprised this isn't documented.

@ash211
Contributor

ash211 commented Sep 17, 2014

One note about this setting: I'm not sure it works in all cases -- if you pass it to the driver as a parameter, it's too late to take effect (the JVM has already started).

./bin/spark-shell --driver-java-options "-Dspark.driver.memory=1576m" doesn't actually change driver memory.

Maybe the docs should mention where it does and doesn't work?
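
For illustration, roughly what the launch script ends up executing in that case (the 512m heap, the elided arguments, and the main class shown are placeholders, not taken from this thread); the heap is fixed by -Xmx before the -D property is ever read:

# Illustrative sketch only: the -D flag just lands in System.getProperties() of a
# JVM whose heap has already been sized by -Xmx, so it cannot change driver memory.
java -Xms512m -Xmx512m ... -Dspark.driver.memory=1576m org.apache.spark.repl.Main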

@nartz
Contributor Author

nartz commented Sep 17, 2014

Using ./bin/spark-submit --driver-memory 3g myscript.py on the command line works for me.

@andrewor14
Contributor

@ash211 I believe it'll work actually, because we treat --driver-* arguments specially in bash before we launch the driver JVM.

@ash211
Contributor

ash211 commented Sep 17, 2014

@andrewor14 the --driver-memory parameter works but --driver-java-options "-Dspark.driver.memory=3g" doesn't. Do you expect both to work?

@andrewor14
Contributor

Ah sorry, you're right: it will only work as --driver-java-options "-Xms3g -Xmx3g", because those flags are passed directly to the java command when launching the JVM. Though in general --driver-java-options (and likewise spark.driver.extraJavaOptions) should not be used to set Spark properties, so I'm not sure that's worth a mention in the docs.
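
Pulling the thread together, a quick sketch of the invocations discussed so far (the 3g value and myscript.py are illustrative):

# Too late: the driver JVM only sees this as a system property after it has started.
./bin/spark-shell --driver-java-options "-Dspark.driver.memory=3g"

# Works: raw JVM flags are passed straight through to the java command that launches the driver.
./bin/spark-shell --driver-java-options "-Xms3g -Xmx3g"

# Works: --driver-memory is handled specially by the launch scripts before the JVM starts.
./bin/spark-submit --driver-memory 3g myscript.py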

@andrewor14
Contributor

Hey @nartz, can you also add spark.driver.extraClassPath, spark.driver.extraLibraryPath, and spark.driver.extraJavaOptions? These were originally intended for cluster mode, but we very recently added support for them in client mode as well. That's probably why they weren't documented.
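
For context, a hedged sketch of how those three properties are usually supplied through their matching spark-submit flags (the jar, library path, and GC option below are placeholders; client-mode support depends on the Spark version, per the note above):

# Each flag maps onto the corresponding spark.driver.* property:
#   --driver-class-path    -> spark.driver.extraClassPath
#   --driver-library-path  -> spark.driver.extraLibraryPath
#   --driver-java-options  -> spark.driver.extraJavaOptions
./bin/spark-submit \
  --driver-class-path /path/to/extra.jar \
  --driver-library-path /path/to/native/libs \
  --driver-java-options "-XX:+UseG1GC" \
  myscript.py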

@nartz
Contributor Author

nartz commented Oct 8, 2014

I can, but it seems the scope of this pull request may be too small: it would need to cover and explain a bunch of new parameters, plus the nuances of when to use spark.driver.memory (I see many people having problems with this on the mailing list). Maybe we should just turn it into an issue that can be more comprehensive?

@andrewor14
Contributor

Ok, fair enough. Though none of these configs are actually new; they have been around since 1.0 but were just never documented. I can merge this and file a separate JIRA for the other configs. Thanks @nartz.

@andrewor14
Contributor

Also could you add [Docs] to the title? It will help us organize our PRs.

@andrewor14
Contributor

Ok I merged this into master.

@asfgit asfgit closed this in 13cab5b Oct 9, 2014