Make new build default and combine into dist package #3411

Merged: 113 commits, Sep 10, 2021
Changes shown below are from 71 of the 113 commits.

Commits (113)
53f665b
Update to cudf conditional join change that removes null equality arg…
jlowe Aug 25, 2021
b80cb31
dist initial working
tgravescs Sep 2, 2021
5e1cacb
add classes files
tgravescs Sep 2, 2021
07526a3
Parallel world ServiceLoader
gerashegalov Sep 3, 2021
a8f5949
Implement per-shim parallel world jar classloader
gerashegalov Sep 3, 2021
8ffc90e
working
tgravescs Sep 3, 2021
8cb180e
checkpoint
tgravescs Sep 3, 2021
d2453b7
Merge branch '3381' into builddistnewwithclass
tgravescs Sep 3, 2021
9ee5eeb
working
tgravescs Sep 3, 2021
0d5b11c
review and test fix
gerashegalov Sep 3, 2021
fbacdbe
Adding shuffle manager
gerashegalov Sep 4, 2021
72dd879
Merge remote-tracking branch 'origin/branch-21.10' into shimLoader320
gerashegalov Sep 4, 2021
59503cd
wip
gerashegalov Sep 4, 2021
aca6e8e
test fix
gerashegalov Sep 4, 2021
9fbe630
test fix
gerashegalov Sep 4, 2021
fb0399a
Merge branch '3381' into builddistnewwithclass2
tgravescs Sep 7, 2021
3331515
working
tgravescs Sep 7, 2021
4e24306
really working
tgravescs Sep 7, 2021
34dd324
cleanup old profiles
tgravescs Sep 7, 2021
6ab79ad
cdh building again
tgravescs Sep 7, 2021
4d8df72
format
tgravescs Sep 7, 2021
9ffe27d
Working on 311db
tgravescs Sep 7, 2021
b8513e5
fix int tests
tgravescs Sep 7, 2021
df4d48e
fix individual
tgravescs Sep 7, 2021
662a76f
rename databricks modules to conform
tgravescs Sep 7, 2021
120d359
add snapshot and no snapshot dist
tgravescs Sep 7, 2021
ca832e7
Merge branch 'builddistnewwithclass2' of github.com:tgravescs/spark-r…
tgravescs Sep 7, 2021
c6611ff
add file
tgravescs Sep 7, 2021
64bf62d
301 databricks building
tgravescs Sep 7, 2021
1d7abb3
remove aggregator
tgravescs Sep 7, 2021
ac74331
remove aggregator dir
tgravescs Sep 7, 2021
cdbe13d
remove profiles db
tgravescs Sep 7, 2021
8974e60
update build scripts
tgravescs Sep 7, 2021
8e8d61a
cleanup
tgravescs Sep 7, 2021
b5a30b8
cleanup
tgravescs Sep 7, 2021
6c4c788
format
tgravescs Sep 7, 2021
c1a104f
fix cdh
tgravescs Sep 7, 2021
dd2660a
Fix xdist
tgravescs Sep 7, 2021
43fab8a
fix another xdist
tgravescs Sep 7, 2021
381c624
update if modified and move cloudera repo
tgravescs Sep 7, 2021
6b82784
Merge branch 'builddistnewwithclass2' of github.com:tgravescs/spark-r…
tgravescs Sep 7, 2021
c823f8a
Merge remote-tracking branch 'origin/branch-21.10' into builddistneww…
tgravescs Sep 7, 2021
85418a3
3.1.1
tgravescs Sep 7, 2021
451c293
fiiles for 3.0.1
tgravescs Sep 7, 2021
ee30c59
fix build scripts
tgravescs Sep 7, 2021
3194a65
make files or snapshot and nonsnapshot
tgravescs Sep 7, 2021
2ce74fc
reviews
gerashegalov Sep 7, 2021
34cb2ed
fixes and build updates
tgravescs Sep 7, 2021
0e5ad4e
commens
gerashegalov Sep 7, 2021
e59f44f
Merge remote-tracking branch 'origin/branch-21.10' into shimLoader320
gerashegalov Sep 7, 2021
e47c4fd
dist package updates and cleanup
tgravescs Sep 8, 2021
3746e9c
update comment
tgravescs Sep 8, 2021
67e6c52
add missing files
tgravescs Sep 8, 2021
ae339a7
working and jar complete
tgravescs Sep 8, 2021
ef7f5f3
rename file and include maven in base
tgravescs Sep 8, 2021
7ad7d94
Merge branch '3381' into builddistnewwithclass2
tgravescs Sep 8, 2021
c8a9772
fix comment type
tgravescs Sep 8, 2021
d581ba8
add clean to build script
tgravescs Sep 8, 2021
f6c2980
Add docs to contributing
tgravescs Sep 8, 2021
aa472c9
remove shimmed lcasses
tgravescs Sep 8, 2021
04ba876
doc buildver
tgravescs Sep 8, 2021
bd47113
make sure python files includes in base
tgravescs Sep 8, 2021
317c051
fix premerge build script
tgravescs Sep 8, 2021
32c78c8
add readme for dist
tgravescs Sep 8, 2021
792e83f
quotes
tgravescs Sep 8, 2021
ec1c539
put spark320 into buildall
tgravescs Sep 8, 2021
3bf66f6
add README.md to rat exclude
tgravescs Sep 8, 2021
6057940
rename dist profiles and remove extra plugin
tgravescs Sep 8, 2021
c46a0c4
fix spark320 directory missing
tgravescs Sep 8, 2021
d38bfee
Update contributing doc and fix typo
tgravescs Sep 8, 2021
a041fe6
review comments
tgravescs Sep 8, 2021
56effd6
nightly updates
tgravescs Sep 8, 2021
21873ef
ProxyShuffleManager cannot be dedupped, so undoing
gerashegalov Sep 9, 2021
c89a65c
dir cleanup
gerashegalov Sep 9, 2021
bf79e74
Merge remote-tracking branch 'origin/branch-21.10' into shimLoader320
gerashegalov Sep 9, 2021
871e12d
source code layout doc
gerashegalov Sep 9, 2021
b3a4df7
than
tgravescs Sep 9, 2021
ed24fe9
Merge branch 'builddistnewwithclass2' of github.com:tgravescs/spark-r…
tgravescs Sep 9, 2021
698363e
Merge branch '3381' into builddistnewwithclass2
tgravescs Sep 9, 2021
0da6108
fixes after upmerge
tgravescs Sep 9, 2021
51db94e
changes to match new directory naming and update databricks
tgravescs Sep 9, 2021
5c4b5fb
Fix databricks 311 build
tgravescs Sep 9, 2021
b787b29
Fix databricks 301 build
tgravescs Sep 9, 2021
b6bc59d
upmerge fixes
tgravescs Sep 9, 2021
15d534f
Merge branch 'builddistnewwithclass2' of github.com:tgravescs/spark-r…
tgravescs Sep 9, 2021
1f83a3d
update readme
tgravescs Sep 9, 2021
9a43800
preview
gerashegalov Sep 9, 2021
6ec561a
whitespace
gerashegalov Sep 9, 2021
44fbfac
inclusive
gerashegalov Sep 9, 2021
b70d527
properly generate dependency reduced pom and fix the shuffle class
tgravescs Sep 9, 2021
7d43de1
skip spark 320 tests in nightly since fail
tgravescs Sep 9, 2021
a7db8e7
Merge branch '3381' into builddistnewwithclass2
tgravescs Sep 9, 2021
d905db3
scalastyle violations
gerashegalov Sep 9, 2021
9ab165f
Merge branch 'branch-21.10' into ast-remove-nulleq
jlowe Sep 9, 2021
a094c94
xfail test to unblock build
jlowe Sep 9, 2021
371329f
Merge branch '3381' into builddistnewwithclass2
tgravescs Sep 9, 2021
c4060a7
Merge branch '3303' into builddistnewwithclass2
tgravescs Sep 9, 2021
9411c30
lower case uber and have rat check run in build script
tgravescs Sep 9, 2021
62e7da8
Merge remote-tracking branch 'origin/branch-21.10' into builddistneww…
tgravescs Sep 9, 2021
60a3faa
fix upmerge
tgravescs Sep 9, 2021
5c9936a
Merge remote-tracking branch 'origin/branch-21.10' into builddistneww…
tgravescs Sep 10, 2021
78596ca
revert databrics build version
tgravescs Sep 10, 2021
7dadfb7
Fix databricks after upmerge
tgravescs Sep 10, 2021
ee15d31
comment out failing test to investigate after
tgravescs Sep 10, 2021
9ded15a
switch premerge to 311 so tools test run
tgravescs Sep 10, 2021
48f281e
wip
gerashegalov Sep 10, 2021
2297962
Add RapidsShuffleHeartbeatHandler to list to not shim
tgravescs Sep 10, 2021
b42c158
fix configs.md doc
tgravescs Sep 10, 2021
339d10f
add clean to nightly
tgravescs Sep 10, 2021
895216b
fix inclusion
tgravescs Sep 10, 2021
f20c79d
Merge branch 'ShuffleManager-fix' into builddistnewwithclass2+gerafixes
gerashegalov Sep 10, 2021
0d078a7
Fix classloading via ProxyRapidsShuffleManager
gerashegalov Sep 10, 2021
b13a61c
Merge pull request #3 from gerashegalov/builddistnewwithclass2+gerafixes
tgravescs Sep 10, 2021
44 changes: 38 additions & 6 deletions CONTRIBUTING.md
@@ -39,14 +39,46 @@ mvn verify
```

After a successful build the RAPIDS Accelerator jar will be in the `dist/target/` directory.
This will build the plugin for a single version of Spark. By default this is Apache Spark
3.0.1. To build against other versions of Spark, use the `-Dbuildver=XXX` command line option
to Maven. For instance, to build against Spark 3.1.1 you would use:

```shell script
mvn -Dbuildver=311 verify
```
You can find all available build versions in the top level pom.xml file. If you are building
for Databricks then you should use the `jenkins/databricks/build.sh` script and modify it for
the version you want.

To get an uber jar with more then 1 version you have to `mvn install` each version

Collaborator:
?

Suggested change: `more then 1 version` -> `more than 1 version`

and then use one of the defined profiles in the dist module. See the next section
for more details.

### Building a Distribution for Multiple Versions of Spark

By default the distribution jar only includes code for a single version of Spark. If you want
to create a jar with multiple versions of Spark, we currently have 4 options:

1. Build for all Apache Spark versions and CDH with no SNAPSHOT versions of Spark, only released versions. Use `-PnoSnapshots`.
2. Build for all Apache Spark versions and CDH including the SNAPSHOT versions of Spark we have support for. Use `-Psnapshots`.
3. Build for all Apache Spark versions, CDH and Databricks with no SNAPSHOT versions of Spark, only released versions. Use `-PnoSnaphsotsWithDatabricks`.
4. Build for all Apache Spark versions, CDH and Databricks including the SNAPSHOT versions of Spark we have support for. Use `-PsnapshotsWithDatabricks`.

You must first build and install the plugin for each of the Spark versions, and then build one final time using the profile for the option you want.

There is a build script, `build/buildall`, that builds everything with snapshots; more build options will be added to it later.
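
For example, a minimal sketch of invoking it (this assumes the script is run from the repository root, since it calls Maven on relative module paths; a review comment on the script below raises the same point):

```shell script
# Install shims for all supported Spark versions and build everything with snapshots
bash build/buildall
```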

You can also install the versions manually and build a combined jar. For instance, to build the non-snapshot versions:

```shell script
mvn -Dbuildver=301 clean install -DskipTests

Collaborator:
Is -Drat.skip=true not needed for building 301?

Collaborator (author):
we want it to run with one of the builds

mvn -Dbuildver=302 clean install -Drat.skip=true -DskipTests
mvn -Dbuildver=303 clean install -Drat.skip=true -DskipTests
mvn -Dbuildver=311 clean install -Drat.skip=true -DskipTests
mvn -Dbuildver=312 clean install -Drat.skip=true -DskipTests
mvn -Dbuildver=311cdh clean install -Drat.skip=true -DskipTests
mvn -pl dist -PnoSnapshots package -DskipTests
```

### Building against different CUDA Toolkit versions

145 changes: 145 additions & 0 deletions aggregator/pom.xml
@@ -0,0 +1,145 @@
<?xml version="1.0" encoding="UTF-8"?>
<!--
Copyright (c) 2021, NVIDIA CORPORATION.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>

<parent>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-parent</artifactId>
<version>21.10.0-SNAPSHOT</version>
</parent>
<artifactId>rapids-4-spark-aggregator_2.12</artifactId>
<name>RAPIDS Accelerator for Apache Spark Aggregator</name>
<description>Creates an aggregated shaded package of the RAPIDS plugin for Apache Spark</description>
<version>21.10.0-SNAPSHOT</version>

<properties>
<rapids.shade.package>com.nvidia.shaded.${spark.version.classifier}.spark</rapids.shade.package>
</properties>
<dependencies>
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-sql_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<classifier>${spark.version.classifier}</classifier>
</dependency>
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shuffle_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<classifier>${spark.version.classifier}</classifier>
</dependency>
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-udf_${scala.binary.version}</artifactId>
<version>${project.version}</version>
<classifier>${spark.version.classifier}</classifier>
</dependency>
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims-${spark.version.classifier}_${scala.binary.version}</artifactId>
<version>${project.version}</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<configuration>
<artifactSet>

Collaborator:
nit: Indention appears to be off here.

<excludes>org.slf4j:*</excludes>
</artifactSet>
<transformers>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
</transformers>
<createDependencyReducedPom>true</createDependencyReducedPom>
<shadedArtifactAttached>true</shadedArtifactAttached>
<shadedClassifierName>${spark.version.classifier}</shadedClassifierName>
<relocations>
<relocation>
<pattern>org.apache.orc.</pattern>
<shadedPattern>${rapids.shade.package}.orc.</shadedPattern>
</relocation>
<relocation>
<pattern>org.apache.hadoop.hive.</pattern>
<shadedPattern>${rapids.shade.package}.hadoop.hive.</shadedPattern>
<excludes>
<exclude>org.apache.hadoop.hive.conf.HiveConf</exclude>
<exclude>org.apache.hadoop.hive.ql.exec.UDF</exclude>
<exclude>org.apache.hadoop.hive.ql.udf.generic.GenericUDF</exclude>
</excludes>
</relocation>
<relocation>
<pattern>org.apache.hive.</pattern>
<shadedPattern>${rapids.shade.package}.hive.</shadedPattern>
</relocation>
<relocation>
<pattern>io.airlift.compress.</pattern>
<shadedPattern>${rapids.shade.package}.io.airlift.compress.</shadedPattern>
</relocation>
<relocation>
<pattern>org.apache.commons.codec.</pattern>
<shadedPattern>${rapids.shade.package}.org.apache.commons.codec.</shadedPattern>
</relocation>
<relocation>
<pattern>org.apache.commons.lang.</pattern>
<shadedPattern>${rapids.shade.package}.org.apache.commons.lang.</shadedPattern>
</relocation>
<relocation>
<pattern>com.google</pattern>
<shadedPattern>${rapids.shade.package}.com.google</shadedPattern>
</relocation>
</relocations>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<filters>
<filter>
<artifact>com.nvidia:rapids-4-spark-aggregator_2.12</artifact>
<includes>
<include>META-INF/**</include>
</includes>
<excludes>
<exclude>META-INF/services/**</exclude>
</excludes>
</filter>
</filters>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.apache.rat</groupId>
<artifactId>apache-rat-plugin</artifactId>
</plugin>
</plugins>
</build>

</project>
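
For context, the aggregator module above is built and installed once per Spark version. A minimal sketch of a single-version invocation, modeled on the `build/buildall` commands later in this PR (the flags there are the reference; the version chosen here is just an example):

```shell script
# Build and install the aggregator (plus the modules it depends on) for the Spark 3.1.1 shim only
mvn -U -Dbuildver=311 clean install -Drat.skip=true -DskipTests -Dmaven.javadoc.skip=true -pl aggregator -am
```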
2 changes: 1 addition & 1 deletion api_validation/pom.xml
@@ -135,7 +135,7 @@
</dependency>
<dependency>
<groupId>com.nvidia</groupId>
<artifactId>rapids-4-spark-shims-aggregator_${scala.binary.version}</artifactId>
<artifactId>rapids-4-spark-shims-${spark.version.classifier}_${scala.binary.version}</artifactId>
<version>21.10.0-SNAPSHOT</version>
<scope>provided</scope>
</dependency>
29 changes: 29 additions & 0 deletions build/buildall
@@ -0,0 +1,29 @@
#!/bin/bash
#
# Copyright (c) 2021, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

set -ex

Collaborator:
Should we do something to make sure that the directory is the correct one? This assumes that you are in the root directory calling build/buildall.

# Install all the versions we support
mvn -U -Dbuildver=302 clean install -Drat.skip=true -DskipTests -Dmaven.javadoc.skip=true -Dskip -pl aggregator -am

Collaborator:
I'm not sure if this was discussed before, but if we don't want rat, or tests or javadocs/etc, why do we have them on by default in the maven build at all? I would much rather see a way to pass parameters to the shell script on to the maven build so I can decide what I want and what I don't instead.

Collaborator (author):
yeah the build script is very basic right now just to get started, lots of improvements to it need to be done

mvn -U -Dbuildver=303 clean install -Drat.skip=true -DskipTests -Dmaven.javadoc.skip=true -Dskip -pl aggregator -am
mvn -U -Dbuildver=304 clean install -Drat.skip=true -DskipTests -Dmaven.javadoc.skip=true -Dskip -pl aggregator -am
mvn -U -Dbuildver=311 clean install -Drat.skip=true -DskipTests -Dmaven.javadoc.skip=true -Dskip -pl aggregator -am
mvn -U -Dbuildver=312 clean install -Drat.skip=true -DskipTests -Dmaven.javadoc.skip=true -Dskip -pl aggregator -am
mvn -U -Dbuildver=313 clean install -Drat.skip=true -DskipTests -Dmaven.javadoc.skip=true -Dskip -pl aggregator -am
mvn -U -Dbuildver=311cdh clean install -Drat.skip=true -DskipTests -Dmaven.javadoc.skip=true -Dskip -pl aggregator -am
mvn -U -Dbuildver=320 clean install -Drat.skip=true -DskipTests -Dmaven.javadoc.skip=true -Dskip -pl aggregator -am
mvn -U -Dbuildver=301 clean install -Drat.skip=true -DskipTests -Psnapshots
31 changes: 31 additions & 0 deletions dist/README.md
@@ -0,0 +1,31 @@
---
layout: page
title: Testing
nav_order: 1
parent: Developer Overview
---
# RAPIDS Accelerator for Apache Spark Distribution Packaging

The distribution module combines support for all of the Spark versions you need into a single jar.

See the [CONTRIBUTING.md](../CONTRIBUTING.md) doc for details on building and profiles available to build an Uber jar.

Note that when you use the profiles to build an uber jar there are currently some hardcoded service provider files that get put into place, one for each of the
above profiles. You will need to update these if adding or removing support for a Spark version.

Files are: `com.nvidia.spark.rapids.SparkShimServiceProvider.sparkNonSnapshot`, `com.nvidia.spark.rapids.SparkShimServiceProvider.sparkSnapshot`, `com.nvidia.spark.rapids.SparkShimServiceProvider.sparkNonSnapshotDB`, and `com.nvidia.spark.rapids.SparkShimServiceProvider.sparkSnapshotDB`.

The new Uber jar is structured like:
Member:
Suggested change: `The new Uber jar` -> `The new uber jar`


1. Base common classes are the user-visible classes. For these we use the Spark 3.0.1 versions.
2. META-INF/services. This directory holds a file that lists all the shim versions supported by this jar. The per-profile files described above are put into place here for uber jars.
3. META-INF base files are from 3.0.1: maven, LICENSE, NOTICE, etc.
4. Shaded dependencies for Spark 3.0.1, in case the base common classes need them.
5. A Spark-specific directory for each version of Spark supported in the jar, i.e. spark301/, spark302/, spark311/, etc. (a quick way to inspect this layout is sketched below).
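
One quick way to check this layout is to list the contents of the built jar. This is only a sketch; the jar path below is an assumption (based on the 21.10.0-SNAPSHOT version in this PR and the usual `dist/target/` output location), so substitute whatever your build actually produced:

```shell script
# Show the per-Spark-version "parallel world" directories and the META-INF service files
jar tf dist/target/rapids-4-spark_2.12-21.10.0-SNAPSHOT.jar \
  | grep -E '^(spark3[0-9]+(cdh|db)?/|META-INF/services/)' \
  | sort | head -n 20
```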

If you have to change the contents of the uber jar, the following files control what goes into the base of the jar as classes that are not shaded.

1. `unshimmed-base-classes.txt` - classes that go into the base of the jar with their normal package name (not shaded). This includes user-visible classes (i.e. com/nvidia/spark/SQLPlugin). The Spark 3.0.1 build is used for these in any of the uber jars.
2. `unshimmed-base-extras.txt` - other files from the base version of Spark that stay in the base of the jar and are not put into Spark version specific directories. Note we chose Spark 3.0.1 as the base version to use for the base and unshimmed classes.
3. `unshimmed-extras.txt` - files from all of the other Spark version jars that are put into the base of the jar rather than into the Spark specific directories.

New file `com.nvidia.spark.rapids.SparkShimServiceProvider.sparkNonSnapshot`:
@@ -0,0 +1,6 @@
com.nvidia.spark.rapids.shims.spark301.SparkShimServiceProvider

Collaborator (author):
Note hardcoded these for now, hopefully can make this smarter and generate on the fly later.

com.nvidia.spark.rapids.shims.spark302.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark303.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark311.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark312.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark311cdh.SparkShimServiceProvider

New file `com.nvidia.spark.rapids.SparkShimServiceProvider.sparkNonSnapshotDB`:
@@ -0,0 +1,8 @@
com.nvidia.spark.rapids.shims.spark301.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark302.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark303.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark311.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark312.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark311cdh.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark301db.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark311db.SparkShimServiceProvider

New file `com.nvidia.spark.rapids.SparkShimServiceProvider.sparkSnapshot`:
@@ -0,0 +1,9 @@
com.nvidia.spark.rapids.shims.spark301.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark302.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark303.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark304.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark311.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark312.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark313.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark311cdh.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark320.SparkShimServiceProvider

New file `com.nvidia.spark.rapids.SparkShimServiceProvider.sparkSnapshotDB`:
@@ -0,0 +1,11 @@
com.nvidia.spark.rapids.shims.spark301.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark302.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark303.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark304.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark311.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark312.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark313.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark311cdh.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark320.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark301db.SparkShimServiceProvider
com.nvidia.spark.rapids.shims.spark311db.SparkShimServiceProvider