#40: Refactored the test environment #87

Merged 6 commits on Feb 18, 2021
7 changes: 3 additions & 4 deletions .travis.yml
@@ -17,17 +17,16 @@ scala:

env:
- SPARK_VERSION="2.4.5" EXASOL_DOCKER_VERSION="6.2.12-d1"
- SPARK_VERSION="2.4.5" EXASOL_DOCKER_VERSION="7.0.4"
- SPARK_VERSION="2.4.5" EXASOL_DOCKER_VERSION="7.0.6"
- SPARK_VERSION="3.0.1" EXASOL_DOCKER_VERSION="6.2.12-d1"
- SPARK_VERSION="3.0.1" EXASOL_DOCKER_VERSION="7.0.4"
- SPARK_VERSION="3.0.1" EXASOL_DOCKER_VERSION="7.0.6"

before_install:
- git fetch --tags
- docker pull "exasol/docker-db:$EXASOL_DOCKER_VERSION"
- docker network create -d bridge --subnet 192.168.0.0/24 --gateway 192.168.0.1 dockernet

script:
- travis_wait 30 ./scripts/ci.sh
- ./scripts/ci.sh

after_success:
- bash <(curl -s https://codecov.io/bash)
36 changes: 21 additions & 15 deletions doc/changes/changes_1.0.0.md
@@ -1,33 +1,39 @@
# Spark Exasol Connector 1.0.0, released 2020-12-DD
# Spark Exasol Connector 1.0.0, released 2021-MM-DD

## Features / Improvements

## Refactoring

* #40: Added Exasol testcontainers, refactored test environment (PR #87).

## Documentation

* #85: Updated documentation with configuration for the Databricks cluster (PR #86)
* #85: Updated documentation with configuration for the Databricks cluster (PR #86).

## Dependency Updates

### Runtime Dependency Updates

* Updated to `com.exasol:exasol-jdbc:7.0.4` (was `7.0.0`)
* Updated to `org.apache.spark:spark-core:3.0.1` (was `2.4.5`)
* Updated to `org.apache.spark:spark-sql:3.0.1` (was `2.4.5`)
* Updated `com.exasol:exasol-jdbc:7.0.0` to `7.0.7`
* Updated `org.apache.spark:spark-core:2.4.5` to `3.0.1`
* Updated `org.apache.spark:spark-sql:2.4.5` to `3.0.1`

### Test Dependency Updates

* Updated to `org.scalatest:scalatest:3.2.2` (was `3.2.2`)
* Updated to `org.testcontainers:jdbc:1.15.0` (was `1.14.3`)
* Updated to `com.holdenkarau:spark-testing-base:3.0.1_1.0.0` (was `2.4.5_0.14.0`)
* Updated to `org.mockito:mockito-core:3.6.28` (was `3.5.13`)
* Updated to `com.dimafeng:testcontainers-scala:0.38.7` (was `0.38.4`)
* Added `com.exasol:exasol-testcontainers:3.5.0`
* Added `com.exasol:test-db-builder-java:3.0.0`
* Added `com.exasol:hamcrest-resultset-matcher:1.4.0`
* Removed `org.testcontainers:jdbc`
* Removed `com.dimafeng:testcontainers-scala`
* Updated `org.scalatest:scalatest:3.2.2` to `3.2.4`
* Updated `org.mockito:mockito-core:3.5.13` to `3.7.7`
* Updated `com.holdenkarau:spark-testing-base:2.4.5_0.14.0` to `3.0.1_1.0.0`

### Plugin Updates

* Updated to `sbt.version:1.4.4` (was `1.3.13`)
* Updated to `org.wartremover:sbt-wartremover:2.4.13` (was `2.4.10`)
* Updated to `org.wartremover:sbt-wartremover-contrib:1.3.11` (was `1.3.8`)
* Updated to `com.jsuereth:sbt-pgp:2.0.2` (was `2.0.1`)
* Updated to `org.xerial.sbt:sbt-sonatype:3.9.5` (was `3.9.4`)
* Updated `sbt.version:1.3.13` to `1.4.7`
* Updated `org.wartremover:sbt-wartremover:2.4.10` to `2.4.13`
* Updated `org.wartremover:sbt-wartremover-contrib:1.3.8` to `1.3.11`
* Updated `com.jsuereth:sbt-pgp:2.0.1` to `2.1.1`
* Updated `org.xerial.sbt:sbt-sonatype:3.9.4` to `3.9.5`
* Removed `io.get-coursier:sbt-coursier`
16 changes: 3 additions & 13 deletions doc/development/developer_guide.md
@@ -5,20 +5,10 @@ Please read the general [developer guide for the Scala projects][dev-guide].
## Integration Tests

The integration tests are run using [Docker][docker] containers. The tests use
[exasol/docker-db][exa-docker-db], [testcontainers][testcontainers] and
[exasol-testcontainers][exa-testcontainers] and
[spark-testing-base][spark-testing-base].

To run integration tests, a separate docker network should be created first:

```bash
docker network create -d bridge --subnet 192.168.0.0/24 --gateway 192.168.0.1 dockernet
```

The docker network is required since we connect to the Exasol docker container
using an internal IPv4 address.

[dev-guide]: https://github.com/exasol/import-export-udf-common-scala/blob/master/doc/development/developer_guide.md
[docker]: https://www.docker.com/
[exa-docker-db]: https://hub.docker.com/r/exasol/docker-db/
[testcontainers]: https://www.testcontainers.org/
[exa-testcontainers]: https://github.com/exasol/exasol-testcontainers/
[spark-testing-base]: https://github.com/holdenk/spark-testing-base
[dev-guide]: https://github.com/exasol/import-export-udf-common-scala/blob/master/doc/development/developer_guide.md
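As an illustration of the refactored setup described in the developer guide above, here is a minimal sketch of an integration suite that starts an Exasol docker container via `exasol-testcontainers` and opens a JDBC connection to it. The suite name, the image tag, and the overall structure are assumptions for illustration only and are not taken from this repository's test code:

```scala
import com.exasol.containers.ExasolContainer
import org.scalatest.BeforeAndAfterAll
import org.scalatest.funsuite.AnyFunSuite

// Sketch only: one Exasol docker container is started for the whole suite and
// stopped again after the last test has run.
class ExasolContainerSuite extends AnyFunSuite with BeforeAndAfterAll {

  // The image tag is an example; tests usually take it from the build matrix.
  private val container = new ExasolContainer("exasol/docker-db:7.0.6")

  override def beforeAll(): Unit =
    container.start()

  override def afterAll(): Unit =
    container.stop()

  test("opens a JDBC connection to the container") {
    val connection = container.createConnection("")
    assert(!connection.isClosed)
    connection.close()
  }
}
```

With testcontainers managing the container life cycle and connection details, the manually created docker network from the previous instructions is no longer needed, which matches its removal from `.travis.yml` above.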
18 changes: 10 additions & 8 deletions project/Dependencies.scala
@@ -7,13 +7,14 @@ object Dependencies {

// Versions
private val DefaultSparkVersion = "3.0.1"
private val ExasolJdbcVersion = "7.0.4"
private val ExasolJdbcVersion = "7.0.7"

private val ScalaTestVersion = "3.2.3"
private val ScalaTestVersion = "3.2.4"
private val ScalaTestMockitoVersion = "1.0.0-M2"
private val MockitoVersion = "3.6.28"
private val ContainersJdbcVersion = "1.15.0"
private val ContainersScalaVersion = "0.38.7"
private val MockitoVersion = "3.7.7"
private val ExasolTestContainersVersion = "3.5.0"
private val ExasolTestDBBuilderVersion = "3.0.0"
private val ExasolHamcrestMatcherVersion = "1.4.0"

private val sparkCurrentVersion =
sys.env.getOrElse("SPARK_VERSION", DefaultSparkVersion)
@@ -36,9 +37,10 @@
"org.scalatest" %% "scalatest" % ScalaTestVersion,
"org.scalatestplus" %% "scalatestplus-mockito" % ScalaTestMockitoVersion,
"org.mockito" % "mockito-core" % MockitoVersion,
"org.testcontainers" % "jdbc" % ContainersJdbcVersion,
"com.dimafeng" %% "testcontainers-scala" % ContainersScalaVersion,
"com.holdenkarau" %% "spark-testing-base" % SparkTestingBaseVersion
"com.holdenkarau" %% "spark-testing-base" % SparkTestingBaseVersion,
"com.exasol" % "exasol-testcontainers" % ExasolTestContainersVersion,
"com.exasol" % "test-db-builder-java" % ExasolTestDBBuilderVersion,
"com.exasol" % "hamcrest-resultset-matcher" % ExasolHamcrestMatcherVersion,
).map(_ % Test)

/** The list of all dependencies for the connector */
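The newly added `test-db-builder-java` and `hamcrest-resultset-matcher` test dependencies are typically used together: the former creates schemas and tables for a test, the latter asserts on the contents of a JDBC `ResultSet`. A rough sketch of that pattern follows; the schema, table, and column names are invented, and the exact builder and matcher calls should be double-checked against the libraries' documentation:

```scala
import java.sql.Connection

import com.exasol.dbbuilder.dialects.exasol.ExasolObjectFactory
import com.exasol.matcher.ResultSetStructureMatcher.table
import org.hamcrest.MatcherAssert.assertThat

// Sketch only: creates a small fixture with test-db-builder-java and verifies
// a query result with hamcrest-resultset-matcher.
object FixtureSketch {

  def createAndVerify(connection: Connection): Unit = {
    val factory = new ExasolObjectFactory(connection)
    val schema = factory.createSchema("TEST_SCHEMA")
    schema
      .createTableBuilder("ITEMS")
      .column("ID", "DECIMAL(9,0)")
      .column("NAME", "VARCHAR(20)")
      .build()
      .insert(1, "apple")
      .insert(2, "banana")

    val resultSet = connection
      .createStatement()
      .executeQuery("SELECT * FROM TEST_SCHEMA.ITEMS ORDER BY ID")
    // Depending on the matcher's type checking, the expected values may need
    // to be BigDecimal instead of plain integers.
    assertThat(resultSet, table().row(1, "apple").row(2, "banana").matches())
  }
}
```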
2 changes: 1 addition & 1 deletion project/build.properties
@@ -1 +1 @@
sbt.version=1.4.4
sbt.version=1.4.7
2 changes: 1 addition & 1 deletion project/plugins.sbt
@@ -45,7 +45,7 @@ addSbtPlugin("org.xerial.sbt" % "sbt-sonatype" % "3.9.5")

// Adds a `gnu-pgp` plugin
// https://github.com/sbt/sbt-pgp
addSbtPlugin("com.jsuereth" % "sbt-pgp" % "2.0.2")
addSbtPlugin("com.jsuereth" % "sbt-pgp" % "2.1.1")

// Adds a `git` plugin
// https://github.com/sbt/sbt-git
17 changes: 12 additions & 5 deletions sbtx
@@ -34,8 +34,8 @@

set -o pipefail

declare -r sbt_release_version="1.4.4"
declare -r sbt_unreleased_version="1.4.4"
declare -r sbt_release_version="1.4.7"
declare -r sbt_unreleased_version="1.4.7"

declare -r latest_213="2.13.4"
declare -r latest_212="2.12.12"
@@ -48,7 +48,7 @@ declare -r buildProps="project/build.properties"

declare -r sbt_launch_ivy_release_repo="https://repo.typesafe.com/typesafe/ivy-releases"
declare -r sbt_launch_ivy_snapshot_repo="https://repo.scala-sbt.org/scalasbt/ivy-snapshots"
declare -r sbt_launch_mvn_release_repo="https://repo.scala-sbt.org/scalasbt/maven-releases"
declare -r sbt_launch_mvn_release_repo="https://repo1.maven.org/maven2"
declare -r sbt_launch_mvn_snapshot_repo="https://repo.scala-sbt.org/scalasbt/maven-snapshots"

declare -r default_jvm_opts_common="-Xms512m -Xss2m -XX:MaxInlineLevel=18"
@@ -167,7 +167,7 @@ make_url() {
0.10.*) echo "$base/org.scala-tools.sbt/sbt-launch/$version/sbt-launch.jar" ;;
0.11.[12]) echo "$base/org.scala-tools.sbt/sbt-launch/$version/sbt-launch.jar" ;;
0.*) echo "$base/org.scala-sbt/sbt-launch/$version/sbt-launch.jar" ;;
*) echo "$base/org/scala-sbt/sbt-launch/$version/sbt-launch-${version}.jar" ;;
*) echo "$base/org/scala-sbt/sbt-launch/$version/sbt-launch.jar" ;;
esac
}

@@ -247,11 +247,18 @@ java_version() {
echo "$version"
}

is_apple_silicon() { [[ "$(uname -s)" == "Darwin" && "$(uname -m)" == "arm64" ]]; }

# MaxPermSize critical on pre-8 JVMs but incurs noisy warning on 8+
default_jvm_opts() {
local -r v="$(java_version)"
if [[ $v -ge 10 ]]; then
echo "$default_jvm_opts_common -XX:+UnlockExperimentalVMOptions -XX:+UseJVMCICompiler"
if is_apple_silicon; then
# As of Dec 2020, JVM for Apple Silicon (M1) doesn't support JVMCI
echo "$default_jvm_opts_common"
else
echo "$default_jvm_opts_common -XX:+UnlockExperimentalVMOptions -XX:+UseJVMCICompiler"
fi
elif [[ $v -ge 8 ]]; then
echo "$default_jvm_opts_common"
else

This file was deleted.

This file was deleted.
