From f98dad249fda970679f1ef91bb4bdb399f03dd24 Mon Sep 17 00:00:00 2001
From: panbingkun
Date: Sat, 11 May 2024 15:22:22 +0800
Subject: [PATCH] [SPARK-48240][DOCS] Replace `Local[..]` with `"Local[...]"`
 in the docs

---
 docs/configuration.md           |  4 ++--
 docs/quick-start.md             |  6 +++---
 docs/rdd-programming-guide.md   | 12 ++++++------
 docs/submitting-applications.md |  2 +-
 4 files changed, 12 insertions(+), 12 deletions(-)

diff --git a/docs/configuration.md b/docs/configuration.md
index c018b9f1fb7c0..7884a2af60b23 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -91,7 +91,7 @@ Then, you can supply configuration values at runtime:
 ```sh
 ./bin/spark-submit \
   --name "My app" \
-  --master local[4] \
+  --master "local[4]" \
   --conf spark.eventLog.enabled=false \
   --conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps" \
   myApp.jar
@@ -3750,7 +3750,7 @@ Also, you can modify or add configurations at runtime:
 {% highlight bash %}
 ./bin/spark-submit \
   --name "My app" \
-  --master local[4] \
+  --master "local[4]" \
   --conf spark.eventLog.enabled=false \
   --conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps" \
   --conf spark.hadoop.abc.def=xyz \
diff --git a/docs/quick-start.md b/docs/quick-start.md
index 366970cf66c71..5a03af98cd832 100644
--- a/docs/quick-start.md
+++ b/docs/quick-start.md
@@ -286,7 +286,7 @@ We can run this application using the `bin/spark-submit` script:
 {% highlight bash %}
 # Use spark-submit to run your application
 $ YOUR_SPARK_HOME/bin/spark-submit \
-  --master local[4] \
+  --master "local[4]" \
   SimpleApp.py
 ...
 Lines with a: 46, Lines with b: 23
@@ -371,7 +371,7 @@ $ sbt package
 # Use spark-submit to run your application
 $ YOUR_SPARK_HOME/bin/spark-submit \
   --class "SimpleApp" \
-  --master local[4] \
+  --master "local[4]" \
   target/scala-{{site.SCALA_BINARY_VERSION}}/simple-project_{{site.SCALA_BINARY_VERSION}}-1.0.jar
 ...
 Lines with a: 46, Lines with b: 23
@@ -452,7 +452,7 @@ $ mvn package
 # Use spark-submit to run your application
 $ YOUR_SPARK_HOME/bin/spark-submit \
   --class "SimpleApp" \
-  --master local[4] \
+  --master "local[4]" \
   target/simple-project-1.0.jar
 ...
 Lines with a: 46, Lines with b: 23
diff --git a/docs/rdd-programming-guide.md b/docs/rdd-programming-guide.md
index f75bda0ffafb0..cbbce4c082060 100644
--- a/docs/rdd-programming-guide.md
+++ b/docs/rdd-programming-guide.md
@@ -214,13 +214,13 @@ can be passed to the `--repositories` argument. For example, to run
 `bin/pyspark` on exactly four cores, use:
 
 {% highlight bash %}
-$ ./bin/pyspark --master local[4]
+$ ./bin/pyspark --master "local[4]"
 {% endhighlight %}
 
 Or, to also add `code.py` to the search path (in order to later be able to `import code`), use:
 
 {% highlight bash %}
-$ ./bin/pyspark --master local[4] --py-files code.py
+$ ./bin/pyspark --master "local[4]" --py-files code.py
 {% endhighlight %}
 
 For a complete list of options, run `pyspark --help`. Behind the scenes,
@@ -260,19 +260,19 @@ can be passed to the `--repositories` argument. For example, to run `bin/spark-s
 four cores, use:
 
 {% highlight bash %}
-$ ./bin/spark-shell --master local[4]
+$ ./bin/spark-shell --master "local[4]"
 {% endhighlight %}
 
 Or, to also add `code.jar` to its classpath, use:
 
 {% highlight bash %}
-$ ./bin/spark-shell --master local[4] --jars code.jar
+$ ./bin/spark-shell --master "local[4]" --jars code.jar
 {% endhighlight %}
 
 To include a dependency using Maven coordinates:
 
 {% highlight bash %}
-$ ./bin/spark-shell --master local[4] --packages "org.example:example:0.1"
+$ ./bin/spark-shell --master "local[4]" --packages "org.example:example:0.1"
 {% endhighlight %}
 
 For a complete list of options, run `spark-shell --help`. Behind the scenes,
@@ -781,7 +781,7 @@ One of the harder things about Spark is understanding the scope and life cycle o
 
 #### Example
 
-Consider the naive RDD element sum below, which may behave differently depending on whether execution is happening within the same JVM. A common example of this is when running Spark in `local` mode (`--master = local[n]`) versus deploying a Spark application to a cluster (e.g. via spark-submit to YARN):
+Consider the naive RDD element sum below, which may behave differently depending on whether execution is happening within the same JVM. A common example of this is when running Spark in `local` mode (`--master = "local[n]"`) versus deploying a Spark application to a cluster (e.g. via spark-submit to YARN):
diff --git a/docs/submitting-applications.md b/docs/submitting-applications.md
index bf02ec137e200..3a99151768a12 100644
--- a/docs/submitting-applications.md
+++ b/docs/submitting-applications.md
@@ -91,7 +91,7 @@ run it with `--help`. Here are a few examples of common options:
 # Run application locally on 8 cores
 ./bin/spark-submit \
   --class org.apache.spark.examples.SparkPi \
-  --master local[8] \
+  --master "local[8]" \
   /path/to/examples.jar \
   100
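
Why the quotes matter: in shells where square brackets are glob characters, an unquoted master URL such as `local[4]` is expanded (or rejected) by the shell before `spark-submit` ever sees it. The sketch below is illustrative rather than part of the patch; it assumes zsh with its default `nomatch` behavior and reuses `myApp.jar` from the examples above as a placeholder:

```sh
# zsh treats local[4] as a filename glob ([4] is a character class matching "4").
# With no file named "local4" in the current directory, zsh aborts the command:
$ ./bin/spark-submit --master local[4] myApp.jar
zsh: no matches found: local[4]

# Quoting (or escaping) the brackets passes the master URL through literally:
$ ./bin/spark-submit --master "local[4]" myApp.jar
$ ./bin/spark-submit --master local\[4\] myApp.jar
```

Bash leaves an unmatched glob in place by default, which is why the unquoted form happened to work there; quoting the master URL in the docs makes the commands portable across shells.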