feat(integration-tests): Add subproject for integration tests (#1285)
Add a new subproject which contains an integration test suite to
run black-box web API tests against a running jobserver deployment
(to test the full stack).

* Add new subproject job-server-integration-tests
* Add basic api tests
* Add corner cases tests
* Add tests for the synchronization between two active-active job
  servers
* Add test for an HA failover scenario
* Abstract a DeploymentController interface to generically start/
  stop instances within a test (see the sketch after this list)
* Make the tests highly configurable (which jobserver API addresses,
  SSL, which tests to run, how to start/stop jobservers)
* Add documentation and example configuration
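
A rough sketch of what such a DeploymentController abstraction could look like (the trait and method names here are illustrative assumptions, not necessarily the exact interface added by this commit):

```scala
package spark.jobserver.integrationtests.util

// Hypothetical sketch: lets a test start/stop jobserver instances without
// knowing how the deployment is managed (local processes, docker, ssh, ...).
trait DeploymentController {
  // Stop the jobserver listening on the given "host:port" address.
  def stopJobserver(address: String): Boolean
  // (Re)start the jobserver listening on the given "host:port" address.
  def startJobserver(address: String): Boolean
  // True once the jobserver at the given address answers requests again.
  def isJobserverUp(address: String): Boolean
}
```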

Change-Id: I245cbcfc4d7bbd9dbb871a8dbd9bdfdeec1d353f
Nibooor committed Mar 10, 2020
1 parent 61598af commit d10522c
Showing 12 changed files with 1,218 additions and 0 deletions.
9 changes: 9 additions & 0 deletions build.sbt
@@ -86,6 +86,10 @@ lazy val jobServerPython = Project(id = "job-server-python", base = file("job-se
  .dependsOn(jobServerApi, akkaApp % "test")
  .disablePlugins(SbtScalariform)

lazy val jobserverIntegrationTests = Project(id = "job-server-integration-tests", base = file("job-server-integration-tests"))
  .settings(commonSettings)
  .settings(jobserverIntegrationTestsSettings)

lazy val root = Project(id = "root", base = file("."))
  .settings(commonSettings)
  .settings(Release.settings)
@@ -124,6 +128,11 @@ lazy val jobServerPythonSettings = revolverSettings ++ Assembly.settings ++ publ
  assembly := assembly.dependsOn(buildPython).value
)

lazy val jobserverIntegrationTestsSettings = Seq(
  libraryDependencies ++= integrationTestDeps,
  mainClass in Compile := Some("spark.jobserver.integrationtests.IntegrationTests"),
)

lazy val jobServerTestJarSettings = Seq(
  libraryDependencies ++= sparkDeps ++ apiDeps,
  description := "Test jar for Spark Job Server",
38 changes: 38 additions & 0 deletions job-server-integration-tests/README.md
@@ -0,0 +1,38 @@
# Jobserver Integration Tests
This project contains a set of integration tests that can be run against a running jobserver deployment.
They verify that the whole stack (including configuration, Spark, DAO, network) works as intended.

## Usage
Integration tests can be run locally with a regular test runner or by invoking the provided main class `IntegrationTests`.
To run the tests against an arbitrary jobserver from a remote environment:
```shell
# Assemble a fat jar (from root dir)
sbt job-server-integration-tests/assembly

# Move the fat jar to a remote location
scp target/scala-*/job-server-integration-tests-assembly-*.jar <remote-path>/integration-tests.jar

# Invoke the test at the remote location
java -jar integration-tests.jar # displays usage
java -jar integration-tests.jar </path/to/config/file> # executes tests on a specific deployment

```
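
The same suite can also be launched locally through sbt (a sketch; this assumes the default `run` task picks up the `mainClass` configured in `build.sbt`):
```shell
# From the repository root: run the integration tests against a local deployment
sbt "job-server-integration-tests/run /path/to/config/file"
```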

The integration tests are configured by supplying a config file as a parameter.
Within the config file you can specify:
* Address(es) of the jobserver deployment(s) to be tested
* Names of the tests to be run
* Which deployment controller to use for HA tests (i.e. how to stop and restart jobservers during a test)
* Any additional fields required by specific tests or the deployment controller

Here is an example config file:
```javascript
{
// Addresses of the jobservers to be tested (host:port)
jobserverAddresses: ["localhost:8090", "localhost:8091"]
// Set to true if the jobserver communicates via HTTPS
useSSL: true
// Specify which tests to run (list of concrete tests)
runTests: ["BasicApiTests", "CornerCasesTests", "TwoJobserverTests"]
}
```
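
The runner (shown further below) hands this configuration to each suite through ScalaTest's `ConfigMap`. A hypothetical test suite could pick it up like this (class name and assertion are illustrative, not one of the actual suites in this subproject):
```scala
package spark.jobserver.integrationtests.tests

import com.typesafe.config.Config
import org.scalatest.{BeforeAndAfterAllConfigMap, ConfigMap, FunSpec}

// Hypothetical suite showing how the config passed in via ConfigMap
// could be accessed from within a test.
class ExampleApiTest extends FunSpec with BeforeAndAfterAllConfigMap {

  private var config: Config = _

  override def beforeAll(configMap: ConfigMap): Unit = {
    // The runner stores the parsed Typesafe Config under the key "config".
    config = configMap("config").asInstanceOf[Config]
  }

  describe("a jobserver deployment") {
    it("has at least one configured address") {
      val address = config.getStringList("jobserverAddresses").get(0)
      assert(address.nonEmpty)
    }
  }
}
```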
@@ -0,0 +1,74 @@
package spark.jobserver.integrationtests

import java.io.File

import org.scalatest.ConfigMap

import com.typesafe.config.ConfigException
import com.typesafe.config.ConfigFactory
import com.typesafe.config.ConfigRenderOptions

import spark.jobserver.integrationtests.util.TestHelper

object IntegrationTests extends App {

  // Parse config
  if (args.length != 1) {
    printUsage()
    sys.exit(-1)
  }
  val file = new File(args(0))
  if (!file.exists()) {
    println(s"Could not find a config file for path ${file.getAbsolutePath}")
    sys.exit(-1)
  }
  val config = try {
    ConfigFactory.parseFile(file)
  } catch {
    case t: Throwable =>
      println("Could not parse config file: ")
      t.printStackTrace()
      sys.exit(-1)
  }

  // Validate config
  try {
    val addresses = config.getStringList("jobserverAddresses")
    if (addresses.isEmpty()) {
      println("The list of jobserverAddresses is empty. Not running any tests.")
      sys.exit(-1)
    }
    val testsToRun = config.getStringList("runTests")
    if (testsToRun.isEmpty()) {
      println("The list of tests to run is empty. Not running any tests.")
      sys.exit(-1)
    }
  } catch {
    case e: ConfigException =>
      println("Invalid configuration file: " + e.getMessage)
      sys.exit(-1)
  }

  // In case HTTPS is used, just disable verification
  if (config.hasPath("useSSL") && config.getBoolean("useSSL")) {
    TestHelper.disableSSLVerification()
  }

  // Run selected integration tests
  println("Running integration tests with the following configuration:")
  println(config.root().render(ConfigRenderOptions.concise().setFormatted(true).setJson(true)))
  val testsToRun = config.getStringList("runTests").toArray()
  testsToRun.foreach { t =>
    val testName = s"spark.jobserver.integrationtests.tests.$t"
    val clazz = Class.forName(testName)
    val test = clazz.getDeclaredConstructor()
      .newInstance().asInstanceOf[org.scalatest.Suite]
    test.execute(configMap = ConfigMap(("config", config)))
  }

  // Usage
  def printUsage(): Unit = {
    println("Usage: IntegrationTests </path/to/config/file>")
  }

}
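
`TestHelper.disableSSLVerification()` is not part of this diff; a common way to implement such a test-only helper is to install a trust-all SSLContext and hostname verifier, roughly as sketched below (an assumption about how it might work, not the actual TestHelper code):

```scala
package spark.jobserver.integrationtests.util

import java.security.cert.X509Certificate
import javax.net.ssl.{HostnameVerifier, HttpsURLConnection, SSLContext, SSLSession, TrustManager, X509TrustManager}

object TestHelper {
  // Test-only: accept any certificate and any hostname so that the
  // integration tests can talk to a jobserver with a self-signed cert.
  def disableSSLVerification(): Unit = {
    val trustAll: Array[TrustManager] = Array(new X509TrustManager {
      override def getAcceptedIssuers: Array[X509Certificate] = Array.empty
      override def checkClientTrusted(chain: Array[X509Certificate], authType: String): Unit = {}
      override def checkServerTrusted(chain: Array[X509Certificate], authType: String): Unit = {}
    })
    val context = SSLContext.getInstance("TLS")
    context.init(null, trustAll, new java.security.SecureRandom())
    HttpsURLConnection.setDefaultSSLSocketFactory(context.getSocketFactory)
    HttpsURLConnection.setDefaultHostnameVerifier(new HostnameVerifier {
      override def verify(hostname: String, session: SSLSession): Boolean = true
    })
  }
}
```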
