
[SPARK-14424][BUILD][DOCS] Update the build docs to switch from assembly to package and add a no…#12197

Closed
holdenk wants to merge 2 commits into apache:master from holdenk:SPARK-1424-spark-class-broken-fix-build-docs

Conversation

@holdenk
Contributor

@holdenk holdenk commented Apr 6, 2016

What changes were proposed in this pull request?

Change our build docs & shell scripts so that developers are aware of the change from "assembly" to "package".

How was this patch tested?

Manually ran ./bin/spark-shell after ./build/sbt assembly and verified that the error message is printed; then ran the newly suggested build target and verified that ./bin/spark-shell runs afterwards.
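For reference, the manual test described above boils down to the following command sequence (commands as given in this PR; the inline comments paraphrase the observed behavior):

```shell
# Old workflow: in Spark 2.0 the "assembly" target no longer lays out
# the jars directory that ./bin/spark-shell expects.
./build/sbt assembly
./bin/spark-shell        # prints the "You need to build Spark" error

# New workflow: "package" is the required target.
./build/sbt package
./bin/spark-shell        # starts normally
```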

…te in spark-class if we can't find the required target
@SparkQA

SparkQA commented Apr 6, 2016

Test build #55088 has finished for PR 12197 at commit cb050a0.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

bin/spark-class (outdated)

    if [ ! -d "$SPARK_JARS_DIR" ] && [ -z "$SPARK_TESTING$SPARK_SQL_TESTING" ]; then
      echo "Failed to find Spark jars directory ($SPARK_JARS_DIR)." 1>&2
      echo "You need to build Spark before running this program." 1>&2
      echo "Note: In Spark 2.0 the required build target has changed from \"assembly\" to \"package\"" 1>&2
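The diff excerpt cuts off before the block closes. As a rough, self-contained sketch, the guard amounts to the following; it is rewritten here as a function (the function name is invented for illustration) so it can be exercised standalone, with the message text mirroring the diff:

```shell
# Sketch of the jars-directory guard in bin/spark-class, wrapped in a
# hypothetical check_spark_jars function for standalone use. It fails
# when the given directory is missing, unless a testing variable is set.
check_spark_jars() {
  SPARK_JARS_DIR="$1"
  if [ ! -d "$SPARK_JARS_DIR" ] && [ -z "$SPARK_TESTING$SPARK_SQL_TESTING" ]; then
    echo "Failed to find Spark jars directory ($SPARK_JARS_DIR)." 1>&2
    echo "You need to build Spark before running this program." 1>&2
    echo "Note: In Spark 2.0 the required build target has changed from \"assembly\" to \"package\"" 1>&2
    return 1
  fi
}
```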
Member

This looks OK, though I suppose this message can be shortened to just directly state that you need to "build Spark with target 'package' before running this program". Looks like mvn package still works, so that's all fine.

Contributor

+1, "assembly" was never valid in Maven.

@SparkQA

SparkQA commented Apr 6, 2016

Test build #55128 has finished for PR 12197 at commit 7ace69c.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

PySpark on YARN is only supported if the jar is built with Maven. Further, there is a known problem
with building this assembly jar on Red Hat based operating systems (see [SPARK-1753](https://issues.apache.org/jira/browse/SPARK-1753)). If you wish to
run PySpark on a YARN cluster with Red Hat installed, we recommend that you build the jar elsewhere,
then ship it over to the cluster. We are investigating the exact cause for this.
Contributor

hah, I guess the investigation never happened.

@andrewor14
Contributor

LGTM, merging into master.

@asfgit asfgit closed this in 457e58b Apr 6, 2016