Add section about local settings configuration. #37829

Merged
@@ -51,6 +51,7 @@ Any extra attributes set by a cluster policy take priority over those defined in your DAG file; for example,
if you set an ``sla`` on your Task in the DAG file, and then your cluster policy also sets an ``sla``, the
cluster policy's value will take precedence.

.. _administration-and-deployment:cluster-policies-define:

How to define a policy function
-------------------------------
@@ -61,6 +62,10 @@ There are two ways to configure cluster policies:
under your $AIRFLOW_HOME is a good "default" location) and then add callables to the file matching one or more
of the cluster policy names above (e.g. ``dag_policy``).

See :ref:`Configuring local settings <set-config:configuring-local-settings>` for details on how to
configure local settings.


2. By using a
`setuptools entrypoint <https://packaging.python.org/guides/creating-and-discovering-plugins/#using-package-metadata>`_
in a custom module using the `Pluggy <https://pluggy.readthedocs.io/en/stable/>`_ interface.
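Whichever mechanism you use, a policy is just a callable named after the hook it implements. A minimal sketch of a ``dag_policy``, assuming a tag-requirement rule (a plain ``ValueError`` is raised here so the sketch has no Airflow dependency; a real deployment would raise ``airflow.exceptions.AirflowClusterPolicyViolation``):

```python
# A minimal sketch of a dag_policy callable for airflow_local_settings.py.
# In a real deployment you would raise
# airflow.exceptions.AirflowClusterPolicyViolation; a plain ValueError is
# used here so the sketch stays self-contained.
def dag_policy(dag):
    """Reject any DAG that does not declare at least one tag."""
    if not getattr(dag, "tags", None):
        raise ValueError(f"DAG {dag.dag_id} must declare at least one tag.")
```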
@@ -163,6 +168,9 @@ For example, your ``airflow_local_settings.py`` might follow this pattern:
:start-after: [START example_list_of_cluster_policy_rules]
:end-before: [END example_list_of_cluster_policy_rules]

See :ref:`Configuring local settings <set-config:configuring-local-settings>` for details on how to
configure local settings.


Task instance mutation
~~~~~~~~~~~~~~~~~~~~~~
@@ -116,3 +116,6 @@ define a ``json`` variable in local Airflow settings (``airflow_local_settings.py``):
import ujson

json = ujson

See :ref:`Configuring local settings <set-config:configuring-local-settings>` for details on how to
configure local settings.
@@ -41,6 +41,8 @@ KubernetesPodOperator
The :ref:`KubernetesPodOperator <howto/operator:kubernetespodoperator>` allows you to create
Pods on Kubernetes.

.. _kubernetes:pod_mutation_hook:

Pod Mutation Hook
^^^^^^^^^^^^^^^^^

@@ -52,6 +54,9 @@ are expected to alter its attributes.
This could be used, for instance, to add sidecar or init containers
to every worker pod launched by KubernetesExecutor or KubernetesPodOperator.

See :ref:`Configuring local settings <set-config:configuring-local-settings>` for details on how to
configure local settings.


.. code-block:: python

@@ -31,10 +31,16 @@ be configured by providing a custom logging configuration object. You can also create custom logging configuration
for specific operators and tasks.

Some configuration options require that the logging config class be overwritten. You can do it by copying the default
configuration of Airflow and modifying it to suit your needs.

The default configuration can be seen in the
`airflow_local_settings.py template <https://github.com/apache/airflow/blob/|airflow-version|/airflow/config_templates/airflow_local_settings.py>`_
and you can see the loggers and handlers used there.

See :ref:`Configuring local settings <set-config:configuring-local-settings>` for details on how to
configure local settings.

Apart from the custom loggers and handlers configurable there via ``airflow.cfg``, the logging methods in Airflow
follow the usual Python logging convention: Python objects log to loggers that follow the naming convention of
``<package>.<module_name>``.

You can read more about the standard Python logging classes (Loggers, Handlers, Formatters) in the
7 changes: 7 additions & 0 deletions docs/apache-airflow/faq.rst
@@ -119,6 +119,8 @@ How do I trigger tasks based on another task's failure?

You can achieve this with :ref:`concepts:trigger-rules`.

.. _faq:how-to-control-dag-file-parsing-timeout:

How to control DAG file parsing timeout for different DAG files?
----------------------------------------------------------------

@@ -148,6 +150,11 @@ When the return value is less than or equal to 0, it means no timeout during the DAG parsing.
return conf.getfloat("core", "DAGBAG_IMPORT_TIMEOUT")


See :ref:`Configuring local settings <set-config:configuring-local-settings>` for details on how to
configure local settings.
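A complete hedged sketch of the hook described above (the hook name and single ``dag_file_path`` argument follow this FAQ entry; the per-path timeout values are illustrative, and the real implementation would typically fall back to ``conf.getfloat("core", "DAGBAG_IMPORT_TIMEOUT")``):

```python
# Sketch of get_dagbag_import_timeout for airflow_local_settings.py.
# Timeout values here are illustrative; choose numbers that suit your DAGs.
def get_dagbag_import_timeout(dag_file_path: str) -> float:
    """Return the parsing timeout (in seconds) for a given DAG file."""
    if "slow_dags" in dag_file_path:
        return 90.0
    # A value <= 0 would mean no timeout for this file.
    return 30.0
```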



When there are a lot (>1000) of DAG files, how to speed up parsing of new files?
---------------------------------------------------------------------------------

6 changes: 6 additions & 0 deletions docs/apache-airflow/howto/customize-ui.rst
@@ -18,6 +18,8 @@
Customizing the UI
==================

.. _customizing-the-ui:

Customizing state colours
-------------------------

@@ -30,6 +32,10 @@ following steps:
to the ``$AIRFLOW_HOME/config`` folder. (Airflow adds ``$AIRFLOW_HOME/config`` to ``PYTHONPATH`` when
Airflow is initialized)

See :ref:`Configuring local settings <set-config:configuring-local-settings>` for details on how to
configure local settings.


2. Add the following contents to ``airflow_local_settings.py`` file. Change the colors to whatever you
would like.

6 changes: 5 additions & 1 deletion docs/apache-airflow/howto/export-more-env-vars.rst
@@ -16,7 +16,7 @@
under the License.



.. _export_dynamic_environment_variables:

Export dynamic environment variables available for operators to use
===================================================================
@@ -50,3 +50,7 @@ In your ``airflow_local_settings.py`` file.
"""
# more env vars
return {"airflow_cluster": "main"}


See :ref:`Configuring local settings <set-config:configuring-local-settings>` for details on how to
configure local settings.
28 changes: 28 additions & 0 deletions docs/apache-airflow/howto/set-config.rst
@@ -167,6 +167,34 @@ the example below.
that you run airflow components on is synchronized (for example using ntpd) otherwise you might get
"forbidden" errors when the logs are accessed.

.. _set-config:configuring-local-settings:

Configuring local settings
==========================

Some Airflow settings are configured via local settings, because they require changes in the code
that is executed when Airflow is initialized. The detailed documentation of each feature that can be
configured this way usually mentions it; the configuration is done in the ``airflow_local_settings.py`` file.

You should create an ``airflow_local_settings.py`` file and put it in a directory on ``sys.path`` or
in the ``$AIRFLOW_HOME/config`` folder. (Airflow adds ``$AIRFLOW_HOME/config`` to ``sys.path`` when
Airflow is initialized.)
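A minimal sketch of what such a file can look like (the ``STATE_COLORS`` and ``dag_policy`` names correspond to the features listed below; the values themselves are illustrative):

```python
# $AIRFLOW_HOME/config/airflow_local_settings.py
# Any module-level name defined here is picked up when Airflow initializes.
# The values below are illustrative.

# Used by "Customizing the UI" to override state colours.
STATE_COLORS = {
    "queued": "darkgray",
    "running": "#01FF70",
}


# A cluster policy hook: called for every DAG at parse time.
def dag_policy(dag):
    """Mutate or validate ``dag`` here; raise to reject it."""
```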

You can see an example of such local settings here:

.. py:module:: airflow.config_templates.airflow_local_settings

Example settings you can configure this way:

* :ref:`Cluster Policies <administration-and-deployment:cluster-policies-define>`
* :ref:`Advanced logging configuration <write-logs-advanced>`
* :ref:`DAG serialization <dag-serialization>`
* :ref:`Pod mutation hook in Kubernetes Executor <kubernetes:pod_mutation_hook>`
* :ref:`Control DAG parsing time <faq:how-to-control-dag-file-parsing-timeout>`
* :ref:`Customize your UI <customizing-the-ui>`
* :ref:`Configure more variables to export <export_dynamic_environment_variables>`
* :ref:`Customize your DB configuration <set-up-database-backend>`


Configuring Flask Application for Airflow Webserver
===================================================
6 changes: 5 additions & 1 deletion docs/apache-airflow/howto/set-up-database.rst
@@ -15,7 +15,7 @@
specific language governing permissions and limitations
under the License.


.. _set-up-database-backend:

Set up a Database Backend
=========================
@@ -261,6 +261,10 @@ For more information regarding setup of the PostgreSQL connection, see `PostgreS
sql_alchemy_connect_args = airflow_local_settings.keepalive_kwargs


See :ref:`Configuring local settings <set-config:configuring-local-settings>` for details on how to
configure local settings.
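The ``keepalive_kwargs`` object referenced above is simply a module-level dict defined in ``airflow_local_settings.py``. A sketch, assuming psycopg2/libpq-style TCP keepalive parameter names (the specific values are illustrative):

```python
# airflow_local_settings.py
# TCP keepalive arguments handed to SQLAlchemy via sql_alchemy_connect_args.
# Key names follow psycopg2/libpq; tune the values for your network.
keepalive_kwargs = {
    "keepalives": 1,
    "keepalives_idle": 30,
    "keepalives_interval": 5,
    "keepalives_count": 5,
}
```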



.. spelling:word-list::
