From bb12b950c149e8ebeb78b047b9bfc37a4313eb76 Mon Sep 17 00:00:00 2001
From: CrazyJvm
Date: Wed, 30 Jul 2014 09:45:31 +0800
Subject: [PATCH] automatically set master according to `spark.master` in `spark-defaults.conf`

---
 docs/spark-standalone.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index ad8b6c0e51a78..afd6f240931e1 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -242,8 +242,8 @@ To run an interactive Spark shell against the cluster, run the following command
 
     ./bin/spark-shell --master spark://IP:PORT
 
-Note that if you are running spark-shell from one of the spark cluster machines, the `bin/spark-shell` script will
-automatically set MASTER from the `SPARK_MASTER_IP` and `SPARK_MASTER_PORT` variables in `conf/spark-env.sh`.
+Note that if you are running spark-shell from one of the Spark cluster machines, the `bin/spark-shell` script
+will automatically set the master according to the `spark.master` property in `conf/spark-defaults.conf`.
 
 You can also pass an option `--cores <numCores>` to control the number of cores that spark-shell uses on the cluster.
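
For context on the wording this patch introduces: `conf/spark-defaults.conf` is a plain whitespace-separated properties file, so the setting the shell would pick up looks like the sketch below. This is an illustration only, reusing the same `spark://IP:PORT` placeholder as the doc rather than a real master address.

    # conf/spark-defaults.conf (whitespace-separated key/value pairs)
    spark.master    spark://IP:PORT

With that property set, launching `./bin/spark-shell` without an explicit `--master` flag should connect to that standalone master, which is the behavior the updated sentence describes.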