SPARK-1576 (Allow JAVA_OPTS to be passed as a command line parameter to YARN client) #492
Conversation
Can one of the admins verify this patch?
see the comment on the jira. Also things should generally flow through the spark-submit script now.
I think this is (now) already available as "--driver-java-options" (works both for yarn-client and yarn-cluster modes).
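For context, an invocation using the flag mentioned above might look like the following. The `--driver-java-options` flag and the `yarn-cluster` master are real spark-submit options; the class name, jar, and property values are hypothetical placeholders:

```shell
spark-submit \
  --master yarn-cluster \
  --driver-java-options "-XX:+UseG1GC -Dmy.app.property=value" \
  --class org.example.MyApp \
  myapp.jar
```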
--driver-java-options would only work through the spark-submit script. If the developer invokes yarn.deploy.Client directly (as has been common practice thus far), they would simply pass Java options (using -D), and the following code would allow Spark options to be accepted: `for ((k, v) <- sys.props.filterKeys(_.startsWith("spark")))`. This too was checked in by Patrick fairly recently (last Monday, I think). I would prefer the creation of an explicit flag (--spark-java-options or --driver-java-options) for yarn.deploy.Client in order to be consistent with the rest of the Spark UI, but then Patrick has already checked in the code above and I guess he intends on keeping it.
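A minimal sketch of the property-forwarding pattern quoted above (this is not the actual yarn.deploy.Client source; the object name is hypothetical): any JVM system property whose key starts with "spark", e.g. one set on the command line via `-Dspark.executor.memory=2g`, is picked up from `sys.props` and can be forwarded into the Spark configuration.

```scala
// Sketch of the -D property pickup pattern; assumes only the Scala
// standard library.
object SparkPropsSketch {
  def main(args: Array[String]): Unit = {
    // Simulate a -Dspark.executor.memory=2g command-line flag. The key
    // name is a real Spark setting, but setting it programmatically
    // here is only for illustration.
    sys.props("spark.executor.memory") = "2g"

    // The filtering step from the quoted snippet: keep only keys that
    // start with "spark", ignoring unrelated JVM properties.
    val sparkProps = sys.props.toMap.filter { case (k, _) => k.startsWith("spark") }

    // A real client would copy these into its Spark configuration;
    // here we just print them.
    sparkProps.foreach { case (k, v) => println(s"$k=$v") }
  }
}
```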
@nishkamravi2 hey, the older yarn client is being deprecated, so we probably won't add new features to it. That is, if users want to take advantage of new options they will need to switch over to spark-submit. Is what you are trying to do not supported by spark-submit? If so, mind elaborating a bit more (because setting Java options is supported for both the driver and executors)?
Correct, and as Patrick mentions, it should be the way to submit Spark jobs. Marcelo
Spark-submit looks neat, and it does make sense to encourage use of a single primary interface for Spark invocation. Whether or not there is value in cleaning up the secondary interfaces may be subject to debate. However, if we plan on deprecating (and potentially retiring) the secondary interfaces, then I can see your point of view.
@nishkamravi2 have you tried the spark-submit script and can we close this?
@tgravescs Sorry, missed this one somehow. Yes, spark-submit is now fully tested. We could keep this one open for 0.9.2 potentially.
can we close this now? We have a 1.x branch now with spark-submit and will have a 1.1 release relatively soon. If someone really needs this in the 0.9.x line they can reopen. |
Ok, closed. |