[SPARK-36166][TESTS] Support Scala 2.13 test in dev/run-tests.py
### What changes were proposed in this pull request?

For Apache Spark 3.2, this PR aims to support Scala 2.13 test in `dev/run-tests.py` by adding `SCALA_PROFILE` and in `dev/run-tests-jenkins.py` by adding `AMPLAB_JENKINS_BUILD_SCALA_PROFILE`.

In addition, `test-dependencies.sh` is skipped for Scala 2.13 because we don't maintain the Scala 2.13 dependency manifests yet. This will be handled after the Apache Spark 3.2.0 release.
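The two new switches can be sketched as follows. This is a minimal illustration rather than the exact script logic; the PR title string is hypothetical, and the `test-scala2.13` tag mirrors the existing `test-hive2.3` convention:

```python
import os

# 1) On AMPLab Jenkins, a tag in the PR title selects the Scala profile.
ghprb_pull_title = "[test-scala2.13] An example pull request title"  # hypothetical title
if "test-scala2.13" in ghprb_pull_title:
    os.environ["AMPLAB_JENKINS_BUILD_SCALA_PROFILE"] = "scala2.13"

# 2) Locally or on GitHub Actions, the SCALA_PROFILE environment variable
#    selects it, defaulting to Scala 2.12 when unset.
scala_version = os.environ.get("SCALA_PROFILE", "scala2.12")

print(os.environ["AMPLAB_JENKINS_BUILD_SCALA_PROFILE"], scala_version)
```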

### Why are the changes needed?

To test Scala 2.13 with `dev/run-tests.py`.

### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Manual. The following is the result. Note that this PR aims to **run** the Scala 2.13 tests rather than to make them **pass**. A daily GitHub Actions job will be added via #33358, and UT failures will be fixed if they exist.
```
$ dev/change-scala-version.sh 2.13

$ SCALA_PROFILE=scala2.13 dev/run-tests.py
...
========================================================================
Running Scala style checks
========================================================================
[info] Checking Scala style using SBT with these profiles:  -Phadoop-3.2 -Phive-2.3 -Pscala-2.13 -Pkubernetes -Phadoop-cloud -Phive -Phive-thriftserver -Pyarn -Pmesos -Pdocker-integration-tests -Pkinesis-asl -Pspark-ganglia-lgpl
...
========================================================================
Building Spark
========================================================================
[info] Building Spark using SBT with these arguments:  -Phadoop-3.2 -Phive-2.3 -Pscala-2.13 -Pspark-ganglia-lgpl -Pmesos -Pyarn -Phive-thriftserver -Pkinesis-asl -Pkubernetes -Pdocker-integration-tests -Phive -Phadoop-cloud test:package streaming-kinesis-asl-assembly/assembly
...

[info] Building Spark assembly using SBT with these arguments:  -Phadoop-3.2 -Phive-2.3 -Pscala-2.13 -Pspark-ganglia-lgpl -Pmesos -Pyarn -Phive-thriftserver -Pkinesis-asl -Pkubernetes -Pdocker-integration-tests -Phive -Phadoop-cloud assembly/package
...

========================================================================
Running Java style checks
========================================================================
[info] Checking Java style using SBT with these profiles:  -Phadoop-3.2 -Phive-2.3 -Pscala-2.13 -Pspark-ganglia-lgpl -Pmesos -Pyarn -Phive-thriftserver -Pkinesis-asl -Pkubernetes -Pdocker-integration-tests -Phive -Phadoop-cloud
...

========================================================================
Building Unidoc API Documentation
========================================================================
[info] Building Spark unidoc using SBT with these arguments:  -Phadoop-3.2 -Phive-2.3 -Pscala-2.13 -Pspark-ganglia-lgpl -Pmesos -Pyarn -Phive-thriftserver -Pkinesis-asl -Pkubernetes -Pdocker-integration-tests -Phive -Phadoop-cloud unidoc
...

========================================================================
Running Spark unit tests
========================================================================
[info] Running Spark tests using SBT with these arguments:  -Phadoop-3.2 -Phive-2.3 -Pscala-2.13 -Pspark-ganglia-lgpl -Pmesos -Pyarn -Phive-thriftserver -Pkinesis-asl -Pkubernetes -Pdocker-integration-tests -Phive -Phadoop-cloud test
...
```

Closes #33376 from dongjoon-hyun/SPARK-36166.

Authored-by: Dongjoon Hyun <dongjoon@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
dongjoon-hyun committed Jul 16, 2021
1 parent c22f7a4 commit f66153d
Showing 3 changed files with 35 additions and 1 deletion.
3 changes: 3 additions & 0 deletions dev/run-tests-jenkins.py
@@ -177,6 +177,9 @@ def main():
    # Switch the Hive profile based on the PR title:
    if "test-hive2.3" in ghprb_pull_title:
        os.environ["AMPLAB_JENKINS_BUILD_HIVE_PROFILE"] = "hive2.3"
    # Switch the Scala profile based on the PR title:
    if "test-scala2.13" in ghprb_pull_title:
        os.environ["AMPLAB_JENKINS_BUILD_SCALA_PROFILE"] = "scala2.13"

    build_display_name = os.environ["BUILD_DISPLAY_NAME"]
    build_url = os.environ["BUILD_URL"]
23 changes: 22 additions & 1 deletion dev/run-tests.py
@@ -294,6 +294,24 @@ def exec_sbt(sbt_args=()):
    exit_from_command_with_retcode(sbt_cmd, retcode)


def get_scala_profiles(scala_version):
    """
    For the given Scala version tag, return a list of Maven/SBT profile flags for
    building and testing against that Scala version.
    """
    sbt_maven_scala_profiles = {
        "scala2.12": ["-Pscala-2.12"],
        "scala2.13": ["-Pscala-2.13"],
    }

    if scala_version in sbt_maven_scala_profiles:
        return sbt_maven_scala_profiles[scala_version]
    else:
        print("[error] Could not find", scala_version, "in the list. Valid options",
              " are", sbt_maven_scala_profiles.keys())
        sys.exit(int(os.environ.get("CURRENT_BLOCK", 255)))


def get_hadoop_profiles(hadoop_version):
"""
For the given Hadoop version tag, return a list of Maven/SBT profile flags for
@@ -629,6 +647,7 @@ def main():
        # if we're on the Amplab Jenkins build servers setup variables
        # to reflect the environment settings
        build_tool = os.environ.get("AMPLAB_JENKINS_BUILD_TOOL", "sbt")
        scala_version = os.environ.get("AMPLAB_JENKINS_BUILD_SCALA_PROFILE", "scala2.12")
        hadoop_version = os.environ.get("AMPLAB_JENKINS_BUILD_PROFILE", "hadoop3.2")
        hive_version = os.environ.get("AMPLAB_JENKINS_BUILD_HIVE_PROFILE", "hive2.3")
        test_env = "amplab_jenkins"
@@ -639,6 +658,7 @@
    else:
        # else we're running locally or GitHub Actions.
        build_tool = "sbt"
        scala_version = os.environ.get("SCALA_PROFILE", "scala2.12")
        hadoop_version = os.environ.get("HADOOP_PROFILE", "hadoop3.2")
        hive_version = os.environ.get("HIVE_PROFILE", "hive2.3")
        if "GITHUB_ACTIONS" in os.environ:
@@ -648,7 +668,8 @@

print("[info] Using build tool", build_tool, "with Hadoop profile", hadoop_version,
"and Hive profile", hive_version, "under environment", test_env)
extra_profiles = get_hadoop_profiles(hadoop_version) + get_hive_profiles(hive_version)
extra_profiles = get_hadoop_profiles(hadoop_version) + get_hive_profiles(hive_version) + \
get_scala_profiles(scala_version)

changed_modules = []
changed_files = []
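With this change, the extra profile list is built from three helpers. The sketch below is a simplified stand-in (each helper is reduced to just the mappings visible in the diff and the build logs, not the full functions), showing how the flags compose:

```python
# Simplified stand-ins for the three profile helpers in dev/run-tests.py;
# each maps a version tag to its Maven/SBT profile flags.
def get_hadoop_profiles(version):
    return {"hadoop3.2": ["-Phadoop-3.2"]}[version]

def get_hive_profiles(version):
    return {"hive2.3": ["-Phive-2.3"]}[version]

def get_scala_profiles(version):
    return {"scala2.12": ["-Pscala-2.12"], "scala2.13": ["-Pscala-2.13"]}[version]

# The three flag lists are concatenated into the extra_profiles passed to SBT.
extra_profiles = (get_hadoop_profiles("hadoop3.2") +
                  get_hive_profiles("hive2.3") +
                  get_scala_profiles("scala2.13"))
print(extra_profiles)  # ['-Phadoop-3.2', '-Phive-2.3', '-Pscala-2.13']
```

These are exactly the `-Phadoop-3.2 -Phive-2.3 -Pscala-2.13` flags that appear at the front of the SBT invocations in the test log above.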
10 changes: 10 additions & 0 deletions dev/test-dependencies.sh
@@ -51,6 +51,16 @@ if [ $? != 0 ]; then
  echo -e "Error while getting version string from Maven:\n$OLD_VERSION"
  exit 1
fi
SCALA_BINARY_VERSION=$($MVN -q \
    -Dexec.executable="echo" \
    -Dexec.args='${scala.binary.version}' \
    --non-recursive \
    org.codehaus.mojo:exec-maven-plugin:1.6.0:exec | grep -E '[0-9]+\.[0-9]+')
if [[ "$SCALA_BINARY_VERSION" != "2.12" ]]; then
  # TODO(SPARK-36168) Support Scala 2.13 in dev/test-dependencies.sh
  echo "Skip dependency testing on $SCALA_BINARY_VERSION"
  exit 0
fi
set -e
TEMP_VERSION="spark-$(python3 -S -c "import random; print(random.randrange(100000, 999999))")"

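The guard added to `dev/test-dependencies.sh` can be exercised in isolation. In the real script `SCALA_BINARY_VERSION` is extracted from Maven's `${scala.binary.version}` property via the exec plugin; here it is hard-coded purely for illustration, and the `exit 0` short-circuit is replaced by an `else` branch so the sketch runs to completion:

```shell
# Simulate the Scala 2.13 short-circuit: dependency testing only runs on 2.12.
SCALA_BINARY_VERSION="2.13"  # stand-in for the Maven exec-plugin lookup
if [[ "$SCALA_BINARY_VERSION" != "2.12" ]]; then
  echo "Skip dependency testing on $SCALA_BINARY_VERSION"
else
  echo "Running dependency tests"
fi
```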
