Commit

update readme (#17)
Signed-off-by: Mathew Wicks <thesuperzapper@users.noreply.github.com>
thesuperzapper committed Nov 23, 2020
1 parent da6b6de commit ced3f2b
Showing 2 changed files with 15 additions and 36 deletions.
2 changes: 1 addition & 1 deletion charts/airflow/Chart.yaml
@@ -1,7 +1,7 @@
 apiVersion: v1
 description: airflow is a platform to programmatically author, schedule, and monitor workflows
 name: airflow
-version: 7.14.1
+version: 7.14.2
 appVersion: 1.10.12
 icon: https://airflow.apache.org/_images/pin_large.png
 home: https://airflow.apache.org/
49 changes: 14 additions & 35 deletions charts/airflow/README.md
@@ -1,8 +1,6 @@
 # Airflow Helm Chart
 
-> ⚠️ NOTE
->
-> this chart is the continuation of [stable/airflow](https://github.com/helm/charts/tree/master/stable/airflow), by the same maintainers
+> ⚠️ this chart is the continuation of [stable/airflow](https://github.com/helm/charts/tree/master/stable/airflow), see [issue #6](https://github.com/airflow-helm/charts/issues/6) for upgrade guide
 
 [Airflow](https://airflow.apache.org/) is a platform to programmatically author, schedule, and monitor workflows.

@@ -37,26 +35,23 @@ helm install \

 ### 3 - Run commands in Webserver Pod
 
-> ⚠️ NOTE
->
-> you might want to run commands like: `airflow create_user`
 ```sh
 kubectl exec \
   -it \
   --namespace [NAMESPACE] \
   --container airflow-web \
   Deployment/[RELEASE_NAME]-web \
   /bin/bash
 
 # then run commands like
 airflow create_user ...
 ```

 ---
 
 ## Upgrade Steps
 
-> ⚠️ NOTE
->
-> you can find chart version numbers under [GitHub Releases](https://github.com/airflow-helm/charts/releases)
+Find chart version numbers under [GitHub Releases](https://github.com/airflow-helm/charts/releases):
 
 - [v7.13.X → v7.14.0](UPGRADE.md#v713x--v7140)
 - [v7.12.X → v7.13.0](UPGRADE.md#v712x--v7130)
@@ -451,9 +446,7 @@ redis:

 ## Docs (Database) - External Database
 
-> ⚠️ WARNING
->
-> the embedded PostgreSQL is NOT SUITABLE for production, you should configure one of the following external databases
+> 🛑️️ the embedded PostgreSQL is NOT SUITABLE for production, you should configure one of the following external databases
 
 ### Option 1 - Postgres

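The chart's full Postgres example is truncated in this diff view; the hunk context shows only `externalDatabase:`. As a rough sketch, everything below that key is an assumption to verify against the chart's `values.yaml`:

```yaml
# sketch only - key names are assumptions, verify against the chart's values.yaml
postgresql:
  enabled: false                 # assumed flag to disable the embedded PostgreSQL
externalDatabase:
  type: postgres
  host: postgres.example.org     # hypothetical host
  port: 5432
  database: airflow
  user: airflow
  passwordSecret: airflow-postgres   # assumed name of a pre-created Secret
```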
@@ -471,9 +464,7 @@ externalDatabase:

 ### Option 2 - MySQL
 
-> ⚠️ WARNING
->
-> you must set `explicit_defaults_for_timestamp=1` in your MySQL instance, [see here](https://airflow.apache.org/docs/stable/howto/initialize-database.html)
+> ⚠️ you must set `explicit_defaults_for_timestamp=1` in your MySQL instance, [see here](https://airflow.apache.org/docs/stable/howto/initialize-database.html)
 
 Example values for an external MySQL database, with an existing `airflow_cluster1` database:
 ```yaml
@@ -491,9 +482,7 @@ externalDatabase:

 ## Docs (Other) - Log Persistence
 
-> ⚠️ NOTE
->
-> you will likely want to persist logs in a production deployment
+> 🛑️️ you should persist logs in a production deployment using one of the following methods
 
 By default, logs from the airflow-web/scheduler/worker are written within the Docker container's filesystem, therefore any restart of the pod will wipe the logs.

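One common approach is mounting a PersistentVolume for logs. A hedged `values.yaml` sketch; the `logs.persistence.*` keys and the claim name are assumptions to confirm against the chart:

```yaml
# sketch only - assumed keys
logs:
  persistence:
    enabled: true
    existingClaim: airflow-logs   # hypothetical pre-created PVC; must support ReadWriteMany
```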
@@ -580,6 +569,8 @@ For more information, see the `serviceMonitor` section of `values.yaml`.

 ### Option 1 - Git Sidecar (Recommended)
 
+> ⚠️ specifying `known_hosts` inside `dags.git.secret` reduces the possibility of a man-in-the-middle attack, however, if you want to implicitly trust all repo host signatures set `dags.git.sshKeyscan` to `true`
+
 This method places a git sidecar in each worker/scheduler/web Pod, that syncs your git repo into the dag folder every `dags.git.gitSync.refreshTime` seconds.
 
 ```yaml
@@ -605,15 +596,9 @@ kubectl create secret generic \
   --namespace airflow
 ```
 
-> ⚠️ NOTE
->
-> specifying `known_hosts` inside `dags.git.secret` reduces the possibility of a man-in-the-middle attack, however, if you want to implicitly trust all repo host signatures set `dags.git.sshKeyscan` to `true`
-
 ### Option 2a - PersistentVolume
 
-> ⚠️ WARNING
->
-> this method requires a PersistentVolumeClaim which supports `accessMode`: `ReadOnlyMany` or `ReadWriteMany`
+> ⚠️️ this method requires a PersistentVolumeClaim which supports `accessMode = ReadOnlyMany or ReadWriteMany`
 
 This method stores your DAGs in a PersistentVolume.

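The chart's own example is truncated in this diff view; the hunk context shows only `dags:`, so the nested keys in this sketch are assumptions to verify against the chart's `values.yaml`:

```yaml
# sketch only - assumed keys
dags:
  persistence:
    enabled: true
    existingClaim: airflow-dags   # hypothetical pre-created PVC
    accessMode: ReadOnlyMany      # or ReadWriteMany, per the warning above
```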
@@ -640,9 +625,7 @@ dags:

 ### Option 2b - Shared PersistentVolume
 
-> ⚠️ WARNING
->
-> this method requires a PersistentVolumeClaim which supports `accessMode`: `ReadWriteMany`
+> ⚠️ this method requires a PersistentVolumeClaim which supports `accessMode = ReadWriteMany`
 
 This method stores both DAGs and logs on the same PersistentVolume.

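A hedged sketch of how one claim might back both DAGs and logs; all key names and the claim name are assumptions to check against the chart's `values.yaml`:

```yaml
# sketch only - assumed keys
dags:
  persistence:
    enabled: true
    existingClaim: airflow-shared   # hypothetical PVC; must support ReadWriteMany
logs:
  persistence:
    enabled: true
    existingClaim: airflow-shared   # same claim, so DAGs and logs share one volume
```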
@@ -674,13 +657,9 @@ airflow:

 ## Docs (Other) - requirements.txt
 
-We expose the `dags.installRequirements` value to pip install any `requirements.txt` found at the root of your `dags.path` folder as airflow-worker Pods start.
-
-> ⚠️ NOTE
->
-> if you update the `requirements.txt`, you will have to restart each worker Pod for changes to take effect
->
-> you might consider using `airflow.extraPipPackages` instead
+> ⚠️ if you update the `requirements.txt`, you will have to restart each worker Pod for changes to take effect, you might consider using `airflow.extraPipPackages` instead
+
+We expose the `dags.installRequirements` value to pip install any `requirements.txt` found at the root of your `dags.path` folder as airflow-worker Pods start.

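The two values named in this section can be sketched together; `dags.installRequirements` and `airflow.extraPipPackages` come from the text above, while the example package pin is hypothetical:

```yaml
dags:
  installRequirements: true     # pip installs dags.path/requirements.txt as worker Pods start
airflow:
  extraPipPackages:             # alternative: changes the Pod spec, so an upgrade restarts workers
    - "airflow-exporter==1.3.1" # hypothetical package pin
```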
---

