Update readme (#21)
foxish committed Jan 8, 2018
1 parent 83544dd commit d5e7edb
# Running the Kubernetes Integration Tests

Note that the integration test framework is currently being heavily revised and
is subject to change. Currently, the integration tests only run with Java 8.

As shorthand to run the tests against any given cluster, you can use the `e2e/runner.sh` script.
The script assumes that you have a functioning Kubernetes cluster (1.6+) with `kubectl`
configured to access it. The master URL of the currently configured cluster on your
machine can be discovered as follows:

```
$ kubectl cluster-info
Kubernetes master is running at https://xyz
```
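If you want the URL itself in a variable (for example, to pass to the `-m` flag shown later), one generic way to extract it from output like the above is with `sed`. This is only an illustrative sketch using the sample line, not part of the test framework:

```shell
# Extract the https URL from a "Kubernetes master is running at ..." line.
# The sample output line is hard-coded here purely for illustration.
INFO="Kubernetes master is running at https://xyz"
MASTER=$(printf '%s\n' "$INFO" | sed -n 's#.*running at \(https://[^ ]*\).*#\1#p')
echo "$MASTER"   # prints: https://xyz
```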

If you want to use a local [minikube](https://github.com/kubernetes/minikube) cluster,
the minimum tested version is 0.23.0. The kube-dns addon must be enabled, and the
recommended configuration is 3 CPUs and 4G of memory. There is also a wrapper script,
`e2e/e2e-minikube.sh`, for running the tests on minikube specifically against the
apache/spark repository.

```
$ minikube start --memory 4000 --cpus 3
```

If you're using a non-local cluster, you must provide an image repository to which
you have write access, using the `-i` option, so that the Docker images generated
during the test can be stored.

Example usages of the script:

```
$ ./e2e/runner.sh -m https://xyz -i docker.io/foxish -d cloud
$ ./e2e/runner.sh -m https://xyz -i test -d minikube
$ ./e2e/runner.sh -m https://xyz -i test -r https://github.com/my-spark/spark -d minikube
```
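The flags above follow ordinary `getopts` conventions (`-m` master URL, `-i` image repository, `-r` Spark repository, `-d` deployment mode). As a hypothetical sketch of how such flags can be parsed — this is not the actual `runner.sh` implementation, and the defaults shown are assumptions:

```shell
# Hypothetical sketch of runner.sh-style flag parsing; not the real script.
parse_runner_flags() {
  MASTER=""
  IMAGE_REPO=""
  SPARK_REPO="https://github.com/apache/spark"  # assumed default
  DEPLOY_MODE="minikube"                        # assumed default
  OPTIND=1
  while getopts "m:i:r:d:" opt "$@"; do
    case "$opt" in
      m) MASTER="$OPTARG" ;;
      i) IMAGE_REPO="$OPTARG" ;;
      r) SPARK_REPO="$OPTARG" ;;
      d) DEPLOY_MODE="$OPTARG" ;;
    esac
  done
}

parse_runner_flags -m https://xyz -i docker.io/foxish -d cloud
echo "$MASTER $IMAGE_REPO $DEPLOY_MODE"   # prints: https://xyz docker.io/foxish cloud
```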

# Detailed Documentation

## Running the tests using maven

Running the integration tests requires a Spark distribution package tarball that
contains Spark jars, submission clients, etc. You can download a tarball from
[…]

```
$ mvn clean integration-test \
-Dspark-distro-tgz=spark/spark-2.3.0-SNAPSHOT-bin.tgz
```
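Before pointing the build at a tarball, it can save time to sanity-check that the archive exists and is a readable gzipped tar. This is a generic shell check, not part of the project; the path is the example filename used above:

```shell
# Generic sanity check for a distribution tarball (example path from above).
TARBALL="spark/spark-2.3.0-SNAPSHOT-bin.tgz"
if [ -f "$TARBALL" ] && tar -tzf "$TARBALL" >/dev/null 2>&1; then
  echo "tarball OK"
else
  echo "tarball missing or unreadable: $TARBALL" >&2
fi
```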

## Running against an arbitrary cluster

In order to run against any cluster, use the following:
```sh
$ mvn clean integration-test \
-DextraScalaTestArgs="-Dspark.kubernetes.test.master=k8s://https://<master> -Dspark.docker.test.driverImage=<driver-image> -Dspark.docker.test.executorImage=<executor-image>"
```

## Preserve the Minikube VM

The integration tests make use of
[Minikube](https://github.com/kubernetes/minikube), which fires up a virtual
[…]

```
$ mvn clean integration-test \
-DextraScalaTestArgs=-Dspark.docker.test.persistMinikube=true
```

## Reuse the previous Docker images

The integration tests build a number of Docker images, which takes some time.
By default, the images are built every time the tests run. You may want to skip
