[SPARK-9180] fix spark-shell to accept --name option #7512
Conversation
According to
@srowen I think
Yes but https://github.com/apache/spark/blob/v1.4.1/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala#L434 suggests that
This is a different thing. As
@@ -1008,7 +1008,7 @@ class SparkILoop(
     val jars = SparkILoop.getAddedJars
     val conf = new SparkConf()
       .setMaster(getMaster())
-      .setAppName("Spark shell")
+      .setAppName(sys.props.getOrElse("spark.app.name", "Spark shell"))
You probably could avoid all changes to the shell scripts and just say:
.setIfMissing("spark.app.name", "Spark shell")
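The difference between the two approaches can be sketched with a plain Python dict standing in for `SparkConf`/system properties (illustrative only; `setIfMissing` and `getOrElse` are the real Scala APIs, and this is not Spark code):

```python
# Illustrative sketch: a dict stands in for SparkConf / sys.props.
# It only shows read-time vs write-time defaulting, nothing Spark-specific.

props = {}  # pretend spark-submit did not set spark.app.name

# getOrElse-style: fall back at read time; props itself is unchanged
app_name = props.get("spark.app.name", "Spark shell")

# setIfMissing-style: install the default once; later reads need no fallback
props.setdefault("spark.app.name", "Spark shell")

# An explicit --name would have populated the key already,
# and a setIfMissing-style write would then leave it alone:
named = {"spark.app.name": "whatever"}
named.setdefault("spark.app.name", "Spark shell")
```

Either way the resolved name is the same; the difference is whether the default is also written back into the configuration.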
@vanzin If `--name` is not specified on the spark-shell command line, the main class name (`org.apache.spark.repl.Main`) is set for `spark.app.name`, so `spark.app.name` will not be missing when executed via the spark-shell command line.
The reason I use `getOrElse` here is for backward compatibility with special use cases (e.g., directly executing the Main class of spark-shell instead of using the spark-shell command line; in that case `spark.app.name` may not be set).
Ok, but you could still use `setIfMissing` instead of what you have here.
OK fixed as suggested.
ok to test.
LGTM pending tests (which shouldn't have a problem passing).
Test build #37972 has finished for PR 7512 at commit
Jenkins, retest this please.
Test build #45 has finished for PR 7512 at commit
Test build #37995 has finished for PR 7512 at commit
I think these failures are caused by flaky tests.
retest this please.
Test build #38007 timed out for PR 7512 at commit
Wow, builds are really flaky. Jenkins, retest this please.
retest this please.
Test build #38103 has finished for PR 7512 at commit
@JoshRosen any idea why this test keeps failing? It doesn't seem in any way related to the change here. I wonder if other PRs are also flaky.
Jenkins, retest this please.
Test build #64 has finished for PR 7512 at commit
LGTM. @vanzin I also think this is ready to merge.
Merged into master. Thanks!
Test build #38110 has finished for PR 7512 at commit
This is a continuation of #7512, which added the `--name` option to spark-shell. This PR adds the same option to pyspark. Note that `--conf spark.app.name` on the command line has no effect in spark-shell and pyspark; `--name` must be used instead. This is in fact inconsistent with spark-sql, which doesn't accept the `--name` option but does accept `--conf spark.app.name`. I am not fixing this inconsistency in this PR. IMO, only one of `--name` and `--conf spark.app.name` is needed, not both, but since I cannot decide which to choose, I am not making any change here. Author: Cheolsoo Park <cheolsoop@netflix.com> Closes #7610 from piaozhexiu/SPARK-9270 and squashes the following commits: 763e86d [Cheolsoo Park] Update windows script 400b7f9 [Cheolsoo Park] Allow --name option to pyspark
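The precedence described in this thread (an explicit `--name` wins; otherwise whatever is already in `spark.app.name` is kept; otherwise the built-in default applies) can be sketched as a small helper. `resolve_app_name` is hypothetical and not part of Spark or pyspark; it only illustrates the resolution order:

```python
def resolve_app_name(cli_name, props, default="Spark shell"):
    """Hypothetical helper illustrating the precedence discussed above:
    --name (CLI) > existing spark.app.name property > built-in default."""
    if cli_name is not None:
        return cli_name
    return props.get("spark.app.name", default)

# --name always wins over an already-set property
assert resolve_app_name("whatever", {"spark.app.name": "ignored"}) == "whatever"
# without --name, an already-set property (e.g. the main class name) is kept
assert resolve_app_name(None, {"spark.app.name": "org.apache.spark.repl.Main"}) \
    == "org.apache.spark.repl.Main"
# with neither, fall back to the default
assert resolve_app_name(None, {}) == "Spark shell"
```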
This patch fixes [SPARK-9180]. Users can now set the app name of spark-shell using `spark-shell --name "whatever"`.