Merge remote-tracking branch 'upstream/master' into topByKey
coderxiang committed Mar 12, 2015
commit b10e325 (2 parents: debccad + 8f1bc79)
Showing 304 changed files with 10,309 additions and 3,381 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -6,6 +6,7 @@
*.iml
*.iws
*.pyc
+*.pyo
.idea/
.idea_modules/
build/*.jar
16 changes: 16 additions & 0 deletions LICENSE
@@ -771,6 +771,22 @@ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

+========================================================================
+For TestTimSort (core/src/test/java/org/apache/spark/util/collection/TestTimSort.java):
+========================================================================
+Copyright (C) 2015 Stijn de Gouw
+
+Licensed under the Apache License, Version 2.0 (the "License");
+you may not use this file except in compliance with the License.
+You may obtain a copy of the License at
+
+http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing, software
+distributed under the License is distributed on an "AS IS" BASIS,
+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+See the License for the specific language governing permissions and
+limitations under the License.

========================================================================
For LimitedInputStream
2 changes: 1 addition & 1 deletion README.md
@@ -85,7 +85,7 @@ storage systems. Because the protocols have changed in different versions of
Hadoop, you must build Spark against the same version that your cluster runs.

Please refer to the build documentation at
["Specifying the Hadoop Version"](http://spark.apache.org/docs/latest/building-with-maven.html#specifying-the-hadoop-version)
["Specifying the Hadoop Version"](http://spark.apache.org/docs/latest/building-spark.html#specifying-the-hadoop-version)
for detailed guidance on building for a particular distribution of Hadoop, including
building for particular Hive and Hive Thriftserver distributions. See also
["Third Party Hadoop Distributions"](http://spark.apache.org/docs/latest/hadoop-third-party-distributions.html)
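
The README excerpt above defers to the linked build documentation for pinning a Hadoop version. As a rough sketch only (the command line and the Hadoop version below are illustrative assumptions, not part of this commit), a Maven build against a specific Hadoop release passes that version through the hadoop.version property:

    # Illustrative only: build Spark against one specific Hadoop release.
    # Choose hadoop.version (and any matching build profile) for the cluster you run;
    # see "Specifying the Hadoop Version" in the linked build documentation.
    mvn -Dhadoop.version=2.4.0 -DskipTests clean package
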
12 changes: 1 addition & 11 deletions assembly/pom.xml
@@ -20,7 +20,7 @@
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.apache.spark</groupId>
-<artifactId>spark-parent</artifactId>
+<artifactId>spark-parent_2.10</artifactId>
<version>1.3.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
@@ -114,16 +114,6 @@
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
-<filter>
-<!-- Exclude libgfortran, libgcc for license issues -->
-<artifact>org.jblas:jblas</artifact>
-<excludes>
-<!-- Linux amd64 is OK; not statically linked -->
-<exclude>lib/static/Linux/i386/**</exclude>
-<exclude>lib/static/Mac OS X/**</exclude>
-<exclude>lib/static/Windows/**</exclude>
-</excludes>
-</filter>
</filters>
</configuration>
<executions>
2 changes: 1 addition & 1 deletion bagel/pom.xml
@@ -20,7 +20,7 @@
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.apache.spark</groupId>
-<artifactId>spark-parent</artifactId>
+<artifactId>spark-parent_2.10</artifactId>
<version>1.3.0-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
124 changes: 0 additions & 124 deletions bin/compute-classpath.cmd

This file was deleted.

161 changes: 0 additions & 161 deletions bin/compute-classpath.sh

This file was deleted.

8 changes: 4 additions & 4 deletions bin/load-spark-env.sh
@@ -41,9 +41,9 @@ fi

if [ -z "$SPARK_SCALA_VERSION" ]; then

ASSEMBLY_DIR2="$FWDIR/assembly/target/scala-2.11"
ASSEMBLY_DIR1="$FWDIR/assembly/target/scala-2.10"
ASSEMBLY_DIR2="$SPARK_HOME/assembly/target/scala-2.11"
ASSEMBLY_DIR1="$SPARK_HOME/assembly/target/scala-2.10"

if [[ -d "$ASSEMBLY_DIR2" && -d "$ASSEMBLY_DIR1" ]]; then
echo -e "Presence of build for both scala versions(SCALA 2.10 and SCALA 2.11) detected." 1>&2
echo -e 'Either clean one of them or, export SPARK_SCALA_VERSION=2.11 in spark-env.sh.' 1>&2
@@ -54,5 +54,5 @@ if [ -z "$SPARK_SCALA_VERSION" ]; then
export SPARK_SCALA_VERSION="2.11"
else
export SPARK_SCALA_VERSION="2.10"
fi
fi
fi
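
The load-spark-env.sh hunk above picks SPARK_SCALA_VERSION from whichever assembly directory exists under $SPARK_HOME and refuses to guess when builds for both Scala versions are present. As a hedged usage note taken from the script's own error message (the value shown is just an example), exporting the variable, e.g. in conf/spark-env.sh, resolves that ambiguity:

    # Illustrative only: when assemblies for both Scala 2.10 and 2.11 exist,
    # select one explicitly so load-spark-env.sh does not abort.
    export SPARK_SCALA_VERSION=2.11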
