8 changes: 4 additions & 4 deletions docs/programming-guide.md
@@ -182,7 +182,7 @@ variable called `sc`. Making your own SparkContext will not work. You can set wh
context connects to using the `--master` argument, and you can add JARs to the classpath
by passing a comma-separated list to the `--jars` argument. You can also add dependencies
(e.g. Spark Packages) to your shell session by supplying a comma-separated list of maven coordinates
-to the `--packages` argument. Any additional repositories where dependencies might exist (e.g. SonaType)
+to the `--packages` argument. Any additional repositories where dependencies might exist (e.g. Sonatype)
Member Author

OK, these changes are just typo fixes I saw while looking at other references to --repositories

can be passed to the `--repositories` argument. For example, to run `bin/spark-shell` on exactly
four cores, use:

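The collapsed example below this paragraph is not shown in the diff; a minimal sketch of how these flags combine (the package coordinate and repository URL here are illustrative, not from the diff):

```shell
# Run spark-shell on four local cores; local[4] requests four worker threads.
./bin/spark-shell --master local[4]

# Hypothetical example of adding a package and an extra Maven repository:
./bin/spark-shell --master local[4] \
  --packages com.example:some-package:1.0.0 \
  --repositories https://repo.example.com/maven
```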
@@ -214,9 +214,9 @@ variable called `sc`. Making your own SparkContext will not work. You can set wh
context connects to using the `--master` argument, and you can add Python .zip, .egg or .py files
to the runtime path by passing a comma-separated list to `--py-files`. You can also add dependencies
(e.g. Spark Packages) to your shell session by supplying a comma-separated list of maven coordinates
-to the `--packages` argument. Any additional repositories where dependencies might exist (e.g. SonaType)
-can be passed to the `--repositories` argument. Any python dependencies a Spark Package has (listed in
-the requirements.txt of that package) must be manually installed using pip when necessary.
+to the `--packages` argument. Any additional repositories where dependencies might exist (e.g. Sonatype)
+can be passed to the `--repositories` argument. Any Python dependencies a Spark package has (listed in
+the requirements.txt of that package) must be manually installed using `pip` when necessary.
For example, to run `bin/pyspark` on exactly four cores, use:

{% highlight bash %}
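# The body of this block is collapsed in the diff; a hedged sketch of the
# command the surrounding text describes (not taken from the diff itself):
./bin/pyspark --master local[4]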
2 changes: 2 additions & 0 deletions docs/submitting-applications.md
@@ -190,6 +190,8 @@ is handled automatically, and with Spark standalone, automatic cleanup can be co
Users may also include any other dependencies by supplying a comma-delimited list of maven coordinates
with `--packages`. All transitive dependencies will be handled when using this command. Additional
repositories (or resolvers in SBT) can be added in a comma-delimited fashion with the flag `--repositories`.
+(Note that credentials for password-protected repositories can be supplied in some cases in the repository URI,
+such as in `https://user:password@host/...`. Be careful when supplying credentials this way.)
These commands can be used with `pyspark`, `spark-shell`, and `spark-submit` to include Spark Packages.
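
A hedged sketch of the flags this paragraph describes, including the in-URI credential form the new note mentions (the coordinate, host, and credentials are illustrative placeholders):

```shell
# Submit an application with an extra package pulled from a
# password-protected resolver; all names and URLs here are hypothetical.
./bin/spark-submit \
  --packages com.example:my-lib:0.1.0 \
  --repositories https://user:password@repo.example.com/maven \
  my_app.py
```

Prefer a credential mechanism outside the command line where possible, since the URI form can leak the password into shell history and process listings.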

For Python, the equivalent `--py-files` option can be used to distribute `.egg`, `.zip` and `.py` libraries