Description
Add a configuration option to use S3 as a DAGs storage/provider.
Use case / motivation
Currently Airflow assumes that all DAGs are stored in a /some/path/dags folder present on the file system of the web UI, scheduler, and worker components. As a consequence, DAGs are tightly coupled with Airflow itself, which makes independent deployment of Airflow and DAGs quite complicated. It would be great to have a configuration option that enables Airflow to look for DAGs in a specified S3 bucket.
Such a feature is not planned. It is recommended to set up a separate process (e.g. a sidecar) that is responsible for file synchronization. However, I would be happy if there were a guide for this in the documentation.
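For reference, a sidecar of the kind mentioned above could be sketched as a container that periodically syncs the bucket into the shared DAGs folder. This is a minimal illustration, not an official chart feature: the bucket name, image tag, volume name, and sync interval are all placeholders, and credentials (e.g. an IAM role or mounted secrets) are assumed to be configured separately.

```yaml
# Hypothetical sidecar for the scheduler/worker pods: syncs DAGs from S3
# into a shared emptyDir volume mounted at Airflow's DAGs folder.
# All names and paths below are illustrative placeholders.
containers:
  - name: dag-s3-sync
    image: amazon/aws-cli:2.15.0
    command: ["/bin/sh", "-c"]
    args:
      - |
        while true; do
          aws s3 sync s3://my-dags-bucket/dags /opt/airflow/dags --delete
          sleep 60
        done
    volumeMounts:
      - name: dags
        mountPath: /opt/airflow/dags
```

The same volume would also be mounted into the Airflow containers so the scheduler and workers pick up the synced files; `--delete` keeps removed DAGs from lingering locally.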
@DmitryRusakovKodiak@ismailsimsek - since you are interested in it - please feel free to open an issue for Helm Chart extension (or even donate one) or re-open a discussion in the devlist, but for now I am closing this one.