Add SPARK_DIST_CLASSPATH to LAUNCH_CLASSPATH
## What changes were proposed in this pull request?
In Databricks, `SPARK_DIST_CLASSPATH` is used for the driver classpath and `SPARK_JARS_DIR` is empty, so we need to add `SPARK_DIST_CLASSPATH` to `LAUNCH_CLASSPATH`. We cannot remove `SPARK_JARS_DIR` because Spark unit tests still use it.
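A minimal sketch of the effect of the patched line, using hypothetical paths (the real values come from the environment or `spark-env.sh`). With an empty `SPARK_JARS_DIR`, the launcher classpath still resolves to the distribution classpath instead of a bare `/*`:

```shell
#!/bin/sh
# Hypothetical values for illustration only.
SPARK_JARS_DIR=""
SPARK_DIST_CLASSPATH="/opt/extra/lib/*"

# Mirrors the patched line in bin/spark-class: the dist classpath is
# appended after the jars-dir wildcard, so launcher classes are still
# found when SPARK_JARS_DIR is unset or empty.
LAUNCH_CLASSPATH="$SPARK_JARS_DIR/*:$SPARK_DIST_CLASSPATH"

echo "$LAUNCH_CLASSPATH"
```

Note the variable stays quoted when passed to `java -cp`, so the `*` entries are expanded by the JVM, not the shell.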

Author: Yin Huai <yhuai@databricks.com>

Closes apache#50 from yhuai/Add-SPARK_DIST_CLASSPATH-toLAUNCH_CLASSPATH.
yhuai committed Aug 9, 2016
1 parent 29a1a05 commit 224fffe
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion bin/spark-class
@@ -47,7 +47,7 @@ if [ ! -d "$SPARK_JARS_DIR" ] && [ -z "$SPARK_TESTING$SPARK_SQL_TESTING" ]; then
echo "You need to build Spark with the target \"package\" before running this program." 1>&2
exit 1
else
-  LAUNCH_CLASSPATH="$SPARK_JARS_DIR/*"
+  LAUNCH_CLASSPATH="$SPARK_JARS_DIR/*:$SPARK_DIST_CLASSPATH"
fi

# Add the launcher build dir to the classpath if requested.
