
[SPARK-15665] [CORE] spark-submit --kill and --status are not working#13407

Closed
devaraj-kavali wants to merge 2 commits into apache:master from devaraj-kavali:SPARK-15665

Conversation

@devaraj-kavali

@devaraj-kavali devaraj-kavali commented May 31, 2016

What changes were proposed in this pull request?

--kill and --status were not handled in OptionParser, which caused these commands to fail. This change handles the --kill and --status options as part of OptionParser.handle.
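A simplified sketch of the idea behind the change (hypothetical code; the real OptionParser lives in the launcher module, uses different internals, and handles many more options):

```java
// Minimal, hypothetical parser illustrating how --kill and --status can be
// recognized in a handle() method instead of falling through unhandled.
class MiniOptionParser {
    String submissionToKill;
    String submissionToRequestStatusFor;

    /** Returns true if the option was recognized and consumed. */
    boolean handle(String opt, String value) {
        switch (opt) {
            case "--kill":
                // Store the submission ID to kill; no app jar is needed.
                submissionToKill = value;
                return true;
            case "--status":
                // Store the submission ID whose status is requested.
                submissionToRequestStatusFor = value;
                return true;
            default:
                // Unrecognized options are handled elsewhere.
                return false;
        }
    }
}
```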

How was this patch tested?

Added a test, org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testCliKillAndStatus(), and also verified manually by running the --kill and --status commands.

@srowen
Member

srowen commented May 31, 2016

CC @vanzin

@srowen
Member

srowen commented May 31, 2016

Jenkins test this please

@SparkQA

SparkQA commented May 31, 2016

Test build #59647 has finished for PR 13407 at commit 62ebc7a.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@vanzin
Contributor

vanzin commented May 31, 2016

LGTM but can you add unit tests so that these aren't inadvertently broken again?

@vanzin
Contributor

vanzin commented May 31, 2016

Actually, looking at it again, I think the fix makes the code a little confusing. Why is appResource being used to store the argument to an option? I know it all aligns in the end, but it looks weird, and it might be possible to break this.

I think it would be better to keep another flag saying whether appResource is required or not, and only checking that (around L159) depending on the value of the flag. Then the existing cases (no args and help) could be changed to set that flag too.
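The flag-based approach described above might look roughly like this (a hypothetical sketch with invented names, not the actual SparkSubmitCommandBuilder code):

```java
// Hypothetical sketch: track whether an appResource is required with a flag,
// and only validate appResource when the flag is set.
class SubmitArgs {
    String appResource;
    boolean isAppResourceReq = true; // cleared for --help, --kill, --status

    void handle(String opt, String value) {
        switch (opt) {
            case "--kill":
            case "--status":
                // These commands act on an existing submission;
                // no application jar is required.
                isAppResourceReq = false;
                break;
            case "--help":
                isAppResourceReq = false;
                break;
            default:
                // Other options handled elsewhere.
                break;
        }
    }

    void validate() {
        // Demand an appResource only when the flag says one is required.
        if (isAppResourceReq && appResource == null) {
            throw new IllegalArgumentException("Missing application resource.");
        }
    }
}
```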

@andrewor14
Contributor

or just add a special appResource like PYSPARK_SHELL or SPARK_SHELL

@vanzin
Contributor

vanzin commented May 31, 2016

or just add a special appResource like PYSPARK_SHELL or SPARK_SHELL

If SparkSubmit can still process --kill and --status with those, then that's fine too (just use SparkLauncher.NO_RESOURCE).

@vanzin
Contributor

vanzin commented Jun 1, 2016

@devaraj-kavali did you have a chance to look at the feedback?

@devaraj-kavali
Author

Thanks @vanzin and @andrewor14 for looking into this, sorry for the delay.

If SparkSubmit can still process --kill and --status with those, then that's fine too (just use SparkLauncher.NO_RESOURCE).

I tried this, but it fails with the error below:

[devaraj@server2 spark-master]$ ./bin/spark-submit --kill driver-20160531171222-0000
Error: Cannot load main class from JAR spark-internal with URI null. Please specify a class through --class.
Run with --help for usage help or --verbose for debug output

I have renamed the printInfo flag to isAppResourceReq and use it for the --kill and --status cases as well.

Please review and let me know your feedback.

return newCommandBuilder(args).buildCommand(env);
}

private void testCLIOpts(String opt) throws Exception {
Contributor


nit: the naming sounds generic but this is testing options that take values, not any option... maybe tweak the name a little bit, or pass the option value as an argument (null == no value)?
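One way to act on this nit, passing the option value as an argument (null meaning the option takes no value), could look like the following (hypothetical helper, simplified and detached from SparkSubmitCommandBuilder):

```java
import java.util.ArrayList;
import java.util.List;

class CliOptHelper {
    /**
     * Builds the CLI argument list for a test; a null value means the
     * option takes no value.
     */
    static List<String> cliArgs(String opt, String value) {
        List<String> args = new ArrayList<>();
        args.add(opt);
        if (value != null) {
            args.add(value);
        }
        return args;
    }
}
```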

@vanzin
Contributor

vanzin commented Jun 2, 2016

A couple of nits, but looks ok pending tests, which I have no idea why they haven't run. retest this please

@SparkQA

SparkQA commented Jun 2, 2016

Test build #59862 has finished for PR 13407 at commit 30dfb0f.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@vanzin
Contributor

vanzin commented Jun 2, 2016

I think these tests are flaky, but... retest this please

@SparkQA

SparkQA commented Jun 2, 2016

Test build #59875 has finished for PR 13407 at commit 30dfb0f.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@vanzin
Contributor

vanzin commented Jun 3, 2016

Merging to master / 2.0. If the names become an issue we can change later.

@asfgit asfgit closed this in efd3b11 Jun 3, 2016
asfgit pushed a commit that referenced this pull request Jun 3, 2016
## What changes were proposed in this pull request?
--kill and --status were not handled in OptionParser, which caused these commands to fail. This change handles the --kill and --status options as part of OptionParser.handle.

## How was this patch tested?
Added a test, org.apache.spark.launcher.SparkSubmitCommandBuilderSuite.testCliKillAndStatus(), and also verified manually by running the --kill and --status commands.

Author: Devaraj K <devaraj@apache.org>

Closes #13407 from devaraj-kavali/SPARK-15665.

(cherry picked from commit efd3b11)
Signed-off-by: Marcelo Vanzin <vanzin@cloudera.com>
@devaraj-kavali
Author

Thanks @vanzin for review and merging.
