
Provide same info as in spark-submit --help #10890

Closed
wants to merge 3 commits into from

Conversation

jimlohse
Contributor

This is stated for --packages and --repositories. Without stating it for --jars, people expect a standard Java classpath to work, with directory expansion and a different delimiter than a comma. Currently this is only stated in the --help for spark-submit: "Comma-separated list of local jars to include on the driver and executor classpaths."

@jimlohse
Contributor Author

I don't know if I have write access, about to find out, I suspect I don't.

@jimlohse jimlohse closed this Jan 24, 2016
@jimlohse jimlohse reopened this Jan 24, 2016
@@ -177,8 +177,9 @@ debugging information by running `spark-submit` with the `--verbose` option.

 # Advanced Dependency Management
 When using `spark-submit`, the application jar along with any jars included with the `--jars` option
-will be automatically transferred to the cluster. Spark uses the following URL scheme to allow
-different strategies for disseminating jars:
+will be automatically transferred to the cluster. URLs supplied after --jars must be separated by commas. Each entry points to a specific jar file, resulting in a comma-separated list of local jars. That list is included on the driver and executor classpaths. Directory expansion does not work with --jars.
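The comma-separated requirement described in the added text can be sketched in shell. Since directory expansion does not work with --jars, the list has to be built explicitly from individual jar files; the paths and jar names below are made up for illustration, and the spark-submit invocation itself is only shown in a comment:

```shell
# Hypothetical sketch: --jars takes a comma-separated list, not a
# colon-separated Java classpath, and directories are not expanded.
# Build the list explicitly from individual jar files.
mkdir -p /tmp/demo-jars
touch /tmp/demo-jars/a.jar /tmp/demo-jars/b.jar
JARS=$(ls /tmp/demo-jars/*.jar | paste -sd, -)
echo "$JARS"
# A spark-submit invocation would then look like (not executed here):
#   spark-submit --jars "$JARS" --class my.Main my-app.jar
```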
Member

This seems OK, but do they have to be local JARs (I think so)? In which case, are they really URLs? The second sentence you added seems to say the same thing as the first, then. You could back-tick `--jars` too for consistency.

Contributor Author

Oh, I didn't catch the first point of your question until now. I think they are URLs because I think they need `file://` before them; that's a URL too?


Member

It was a dumb question since it's answered just below, yes. I didn't even realize this. They're really URIs not URLs but this is minor.

Actually, spark-submit --help refers to local JARs, but yeah they're not necessarily local? I think it'd be fine to also fix the help text on all of the spark-* scripts while we're here.

Thanks for the feedback, good point. Re: verifying this, I would be interested to see what you come up with. In my implementation, sitting with my boss, who has years of Java experience, he kept trying to use a standard classpath, which didn't work. We also discovered that directory expansion doesn't work.
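The mistake described above, passing a standard colon-separated Java classpath where --jars expects commas, can be worked around mechanically. A minimal sketch, with made-up paths, converting one form to the other:

```shell
# Hypothetical sketch: converting a Java-style colon-separated classpath
# into the comma-separated form that --jars expects.
CP="/opt/libs/a.jar:/opt/libs/b.jar"      # standard classpath (made-up paths)
JARS=$(printf '%s' "$CP" | tr ':' ',')    # replace colons with commas
echo "$JARS"
```

Note this simple substitution assumes no entry itself contains a colon, which holds for ordinary absolute jar paths.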
@srowen
Member

srowen commented Jan 27, 2016

Jenkins, test this please

@srowen
Member

srowen commented Jan 27, 2016

LGTM

@srowen
Member

srowen commented Jan 28, 2016

Jenkins, retest this please

@SparkQA

SparkQA commented Jan 28, 2016

Test build #50271 has finished for PR 10890 at commit 73bf7e9.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@srowen
Member

srowen commented Jan 28, 2016

Merged to master

@asfgit asfgit closed this in c220443 Jan 28, 2016