[SPARK-14613][ML] Add @Since into the matrix and vector classes in spark-mllib-local #12416

Closed
wants to merge 11 commits

2 changes: 1 addition & 1 deletion common/network-common/pom.xml
@@ -66,7 +66,7 @@
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
-<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
+<artifactId>spark-tags_${scala.binary.version}</artifactId>
Contributor (review comment):

Basically, what you have to do is: everywhere you're changing this, you need to also add <classifier>tests</classifier>. In modules where both the test and non-test tags are used, you need both dependencies (with and without classifier).

It might be easier to just leave the test tags in the main source base...
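
For illustration, a minimal sketch (not part of this change set) of what a module that uses both the main tags and the test tags could declare in its pom.xml; the "tests" classifier pulls in the test jar, following the pattern applied to external/docker-integration-tests further down:

<!-- regular dependency: annotations used by main sources -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-tags_${scala.binary.version}</artifactId>
</dependency>
<!-- additional test-scoped dependency on the test jar via the "tests" classifier -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-tags_${scala.binary.version}</artifactId>
  <version>${project.version}</version>
  <classifier>tests</classifier>
  <scope>test</scope>
</dependency>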

</dependency>
<dependency>
<groupId>org.mockito</groupId>
2 changes: 1 addition & 1 deletion common/network-shuffle/pom.xml
@@ -80,7 +80,7 @@
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
-<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
+<artifactId>spark-tags_${scala.binary.version}</artifactId>
</dependency>
<dependency>
<groupId>log4j</groupId>
2 changes: 1 addition & 1 deletion common/network-yarn/pom.xml
@@ -48,7 +48,7 @@
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
-<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
+<artifactId>spark-tags_${scala.binary.version}</artifactId>
</dependency>

<!-- Provided dependencies -->
2 changes: 1 addition & 1 deletion common/sketch/pom.xml
@@ -38,7 +38,7 @@
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
-<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
+<artifactId>spark-tags_${scala.binary.version}</artifactId>
</dependency>
</dependencies>

6 changes: 3 additions & 3 deletions common/tags/pom.xml
@@ -27,12 +27,12 @@
</parent>

<groupId>org.apache.spark</groupId>
-<artifactId>spark-test-tags_2.11</artifactId>
+<artifactId>spark-tags_2.11</artifactId>
<packaging>jar</packaging>
-<name>Spark Project Test Tags</name>
+<name>Spark Project Tags</name>
<url>http://spark.apache.org/</url>
<properties>
-<sbt.project.name>test-tags</sbt.project.name>
+<sbt.project.name>tags</sbt.project.name>
</properties>

<dependencies>
2 changes: 1 addition & 1 deletion common/unsafe/pom.xml
@@ -61,7 +61,7 @@
<!-- Test dependencies -->
<dependency>
<groupId>org.apache.spark</groupId>
-<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
+<artifactId>spark-tags_${scala.binary.version}</artifactId>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
2 changes: 1 addition & 1 deletion core/pom.xml
@@ -317,7 +317,7 @@
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
-<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
+<artifactId>spark-tags_${scala.binary.version}</artifactId>
</dependency>
</dependencies>
<build>
21 changes: 15 additions & 6 deletions dev/sparktestsupport/modules.py
@@ -93,9 +93,18 @@ def __hash__(self):
return hash(self.name)


+tags = Module(
+name="tags",
+dependencies=[],
+source_file_regexes=[
+"common/tags/",
+]
+)


catalyst = Module(
name="catalyst",
-dependencies=[],
+dependencies=[tags],
source_file_regexes=[
"sql/catalyst/",
],
@@ -165,7 +174,7 @@ def __hash__(self):

sketch = Module(
name="sketch",
-dependencies=[],
+dependencies=[tags],
source_file_regexes=[
"common/sketch/",
],
@@ -177,7 +186,7 @@ def __hash__(self):

graphx = Module(
name="graphx",
-dependencies=[],
+dependencies=[tags],
source_file_regexes=[
"graphx/",
],
@@ -189,7 +198,7 @@ def __hash__(self):

streaming = Module(
name="streaming",
-dependencies=[],
+dependencies=[tags],
source_file_regexes=[
"streaming",
],
@@ -205,7 +214,7 @@ def __hash__(self):
# fail other PRs.
streaming_kinesis_asl = Module(
name="streaming-kinesis-asl",
-dependencies=[],
+dependencies=[tags],
source_file_regexes=[
"external/kinesis-asl/",
"external/kinesis-asl-assembly/",
@@ -270,7 +279,7 @@ def __hash__(self):

mllib_local = Module(
name="mllib-local",
-dependencies=[],
+dependencies=[tags],
source_file_regexes=[
"mllib-local",
],
3 changes: 2 additions & 1 deletion external/docker-integration-tests/pom.xml
@@ -128,9 +128,10 @@
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
-<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
+<artifactId>spark-tags_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<scope>test</scope>
+<classifier>tests</classifier>
</dependency>
<dependency>
<groupId>mysql</groupId>
2 changes: 1 addition & 1 deletion external/flume-sink/pom.xml
@@ -92,7 +92,7 @@
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
-<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
+<artifactId>spark-tags_${scala.binary.version}</artifactId>
</dependency>
</dependencies>
<build>
2 changes: 1 addition & 1 deletion external/flume/pom.xml
@@ -68,7 +68,7 @@
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
-<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
+<artifactId>spark-tags_${scala.binary.version}</artifactId>
</dependency>
</dependencies>
<build>
2 changes: 1 addition & 1 deletion external/java8-tests/pom.xml
@@ -72,7 +72,7 @@
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
-<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
+<artifactId>spark-tags_${scala.binary.version}</artifactId>
</dependency>
</dependencies>

2 changes: 1 addition & 1 deletion external/kafka/pom.xml
@@ -88,7 +88,7 @@
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
-<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
+<artifactId>spark-tags_${scala.binary.version}</artifactId>
</dependency>
</dependencies>
<build>
2 changes: 1 addition & 1 deletion external/kinesis-asl/pom.xml
@@ -77,7 +77,7 @@
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
-<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
+<artifactId>spark-tags_${scala.binary.version}</artifactId>
</dependency>
</dependencies>
<build>
2 changes: 1 addition & 1 deletion graphx/pom.xml
@@ -72,7 +72,7 @@
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
-<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
+<artifactId>spark-tags_${scala.binary.version}</artifactId>
</dependency>
</dependencies>
<build>
2 changes: 1 addition & 1 deletion launcher/pom.xml
@@ -65,7 +65,7 @@

<dependency>
<groupId>org.apache.spark</groupId>
-<artifactId>spark-test-tags_${scala.binary.version}</artifactId>
+<artifactId>spark-tags_${scala.binary.version}</artifactId>
</dependency>

<!-- Not needed by the test code, but referenced by SparkSubmit which is used by the tests. -->
4 changes: 4 additions & 0 deletions mllib-local/pom.xml
@@ -57,6 +57,10 @@
<artifactId>mockito-core</artifactId>
<scope>test</scope>
</dependency>
+<dependency>
+<groupId>org.apache.spark</groupId>
+<artifactId>spark-tags_${scala.binary.version}</artifactId>
+</dependency>
</dependencies>
<profiles>
<profile>