4 changes: 2 additions & 2 deletions dev/create-release/create-release.sh
@@ -94,13 +94,13 @@ if [[ ! "$@" =~ --package-only ]]; then
   rm -rf $SPARK_REPO
 
   mvn -DskipTests -Dhadoop.version=2.2.0 -Dyarn.version=2.2.0 \
-    -Pyarn -Phive -Phadoop-2.2 -Pspark-ganglia-lgpl -Pkinesis-asl \
+    -Pyarn -Phive -Phadoop-2.2 -Pspark-ganglia-lgpl -Pkinesis-asl -Pexternal-projects \
     clean install
 
   ./dev/change-version-to-2.11.sh
 
   mvn -DskipTests -Dhadoop.version=2.2.0 -Dyarn.version=2.2.0 \
-    -Dscala-2.11 -Pyarn -Phive -Phadoop-2.2 -Pspark-ganglia-lgpl -Pkinesis-asl \
+    -Dscala-2.11 -Pyarn -Phive -Phadoop-2.2 -Pspark-ganglia-lgpl -Pkinesis-asl -Pexternal-projects \
     clean install
 
   ./dev/change-version-to-2.10.sh
3 changes: 2 additions & 1 deletion dev/run-tests
@@ -53,7 +53,8 @@ function handle_error () {
   fi
 }
 
-export SBT_MAVEN_PROFILES_ARGS="$SBT_MAVEN_PROFILES_ARGS -Pkinesis-asl"
+# Add non-default build components
+export SBT_MAVEN_PROFILES_ARGS="$SBT_MAVEN_PROFILES_ARGS -Pkinesis-asl -Pexternal-projects"
 
 # Determine Java path and version.
 {
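The `dev/run-tests` change appends to whatever profiles the caller already exported rather than replacing them. A minimal sketch of that accumulation, in Python for illustration (the `-Pyarn -Phadoop-2.2` starting value is hypothetical):

```python
import os

# Hypothetical profiles already exported by the caller before run-tests starts.
os.environ["SBT_MAVEN_PROFILES_ARGS"] = "-Pyarn -Phadoop-2.2"

# Equivalent of the shell line:
#   export SBT_MAVEN_PROFILES_ARGS="$SBT_MAVEN_PROFILES_ARGS -Pkinesis-asl -Pexternal-projects"
os.environ["SBT_MAVEN_PROFILES_ARGS"] = (
    os.environ.get("SBT_MAVEN_PROFILES_ARGS", "") + " -Pkinesis-asl -Pexternal-projects"
).strip()

print(os.environ["SBT_MAVEN_PROFILES_ARGS"])
# -Pyarn -Phadoop-2.2 -Pkinesis-asl -Pexternal-projects
```

Because the old value is re-expanded on the right-hand side, caller-supplied profiles survive and the new flags are simply concatenated at the end.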
6 changes: 6 additions & 0 deletions docs/building-spark.md
@@ -124,6 +124,12 @@ Scala 2.11 support in Spark is experimental and does not support a few features.
 Specifically, Spark's external Kafka library and JDBC component are not yet
 supported in Scala 2.11 builds.
 
+# Building External Connectors
+Spark's external connectors such as Flume integration can be enabled with the `-Pexternal-projects` flag.
+
+    mvn -Pexternal-projects -DskipTests clean package
+
+Scala 2.11 support in Spark is experimental and does not support a few features.
 
 # Spark Tests in Maven
 
 Tests are run by default via the [ScalaTest Maven plugin](http://www.scalatest.org/user_guide/using_the_scalatest_maven_plugin).
17 changes: 12 additions & 5 deletions pom.xml
@@ -98,11 +98,6 @@
     <module>sql/core</module>
     <module>sql/hive</module>
     <module>assembly</module>
-    <module>external/twitter</module>
-    <module>external/flume</module>
-    <module>external/flume-sink</module>
-    <module>external/mqtt</module>
-    <module>external/zeromq</module>
     <module>examples</module>
     <module>repl</module>
   </modules>

@@ -1201,6 +1196,18 @@
     </dependencies>
   </profile>

+    <!-- External projects are not built in less this flag is enabled. -->
[Contributor review comment on the line above] in less -> unless
+    <profile>
+      <id>external-projects</id>
+      <modules>
+        <module>external/twitter</module>
+        <module>external/flume</module>
+        <module>external/flume-sink</module>
+        <module>external/mqtt</module>
+        <module>external/zeromq</module>
+      </modules>
+    </profile>
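The new profile does nothing except gate the five external module declarations moved out of the default `<modules>` list. As a small illustration of what Maven reads out of such a profile (a sketch using only Python's stdlib XML parser; the snippet is copied from the diff, this is not how Maven itself is implemented):

```python
import xml.etree.ElementTree as ET

# Profile XML copied verbatim from the pom.xml diff above.
profile_xml = """
<profile>
  <id>external-projects</id>
  <modules>
    <module>external/twitter</module>
    <module>external/flume</module>
    <module>external/flume-sink</module>
    <module>external/mqtt</module>
    <module>external/zeromq</module>
  </modules>
</profile>
"""

profile = ET.fromstring(profile_xml)
modules = [m.text for m in profile.findall("./modules/module")]
print(profile.findtext("id"), modules)
# external-projects ['external/twitter', 'external/flume',
#                    'external/flume-sink', 'external/mqtt', 'external/zeromq']
```

When the profile is activated with `-Pexternal-projects`, these five modules are appended to the reactor; when it is not, they are skipped entirely, which is why the default build gets faster.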

     <!-- Ganglia integration is not included by default due to LGPL-licensed code -->
     <profile>
       <id>spark-ganglia-lgpl</id>