[AIRFLOW-XXXX] Adjust celery defaults to work with breeze
turbaszek committed Jan 18, 2020
1 parent 417feda commit e91ecbe
Showing 3 changed files with 7 additions and 7 deletions.
6 changes: 3 additions & 3 deletions airflow/config_templates/config.yml
@@ -990,7 +990,7 @@
version_added: ~
type: string
example: ~
default: "16"
default: "8"
- name: worker_autoscale
description: |
The maximum and minimum concurrency that will be used when starting workers with the
@@ -1023,7 +1023,7 @@
version_added: ~
type: string
example: ~
default: "sqla+mysql://airflow:airflow@localhost:3306/airflow"
default: "redis://redis:6379/0"
- name: result_backend
description: |
The Celery result_backend. When a job finishes, it needs to update the
@@ -1035,7 +1035,7 @@
version_added: ~
type: string
example: ~
default: "db+mysql://airflow:airflow@localhost:3306/airflow"
default: "db+postgresql://postgres:airflow@postgres/airflow"
- name: flower_host
description: |
Celery Flower is a sweet UI for Celery. Airflow has a shortcut to start
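
The new config.yml defaults above mirror the values set in default_airflow.cfg below, so the generated documentation and the shipped config stay in sync. As a minimal sketch (not part of this commit), the same keys can be read back at runtime through Airflow's standard configuration API, and each one can still be overridden per environment with an AIRFLOW__CELERY__<KEY> environment variable, which is typically how a containerised setup such as Breeze points workers at its own services:

# Sketch only: `conf.get` and AIRFLOW__SECTION__KEY overrides are standard
# Airflow configuration mechanisms; the values shown assume the new defaults.
import os
from airflow.configuration import conf

# Environment variables take precedence over default_airflow.cfg.
os.environ.setdefault("AIRFLOW__CELERY__BROKER_URL", "redis://redis:6379/0")

print(conf.get("celery", "worker_concurrency"))  # "8" unless overridden
print(conf.get("celery", "broker_url"))          # "redis://redis:6379/0"
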
6 changes: 3 additions & 3 deletions airflow/config_templates/default_airflow.cfg
@@ -468,7 +468,7 @@ celery_app_name = airflow.executors.celery_executor
# ``airflow celery worker`` command. This defines the number of task instances that
# a worker will take, so size up your workers based on the resources on
# your worker box and the nature of your tasks
-worker_concurrency = 16
+worker_concurrency = 8

# The maximum and minimum concurrency that will be used when starting workers with the
# ``airflow celery worker`` command (always keep minimum processes, but grow
@@ -490,15 +490,15 @@ worker_log_server_port = 8793
# a sqlalchemy database. Refer to the Celery documentation for more
# information.
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#broker-settings
-broker_url = sqla+mysql://airflow:airflow@localhost:3306/airflow
+broker_url = redis://redis:6379/0

# The Celery result_backend. When a job finishes, it needs to update the
# metadata of the job. Therefore it will post a message on a message bus,
# or insert it into a database (depending of the backend)
# This status is used by the scheduler to update the state of the task
# The use of a database is highly recommended
# http://docs.celeryproject.org/en/latest/userguide/configuration.html#task-result-backend-settings
-result_backend = db+mysql://airflow:airflow@localhost:3306/airflow
+result_backend = db+postgresql://postgres:airflow@postgres/airflow

# Celery Flower is a sweet UI for Celery. Airflow has a shortcut to start
# it ``airflow celery flower``. This defines the IP that Celery Flower runs on
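
The broker_url and result_backend values above are standard Celery connection URLs: redis:// for the message broker and db+postgresql:// for a SQLAlchemy result backend, with the redis and postgres hostnames matching the service containers the Breeze environment starts. A minimal sketch, assuming those defaults, of how Celery consumes them (standard Celery API, not code from this commit):

# Sketch only: Celery accepts the broker and result-backend URLs directly.
from celery import Celery

app = Celery(
    "airflow.executors.celery_executor",
    broker="redis://redis:6379/0",                                # broker_url
    backend="db+postgresql://postgres:airflow@postgres/airflow",  # result_backend
)
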
2 changes: 1 addition & 1 deletion airflow/config_templates/default_celery.py
@@ -40,7 +40,7 @@ def _broker_supports_visibility_timeout(url):
broker_transport_options['visibility_timeout'] = 21600

DEFAULT_CELERY_CONFIG = {
-'accept_content': ['json', 'pickle'],
+'accept_content': ['json'],
'event_serializer': 'json',
'worker_prefetch_multiplier': 1,
'task_acks_late': True,
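
Restricting accept_content to json means Celery workers will refuse pickled payloads and only deserialize JSON messages, which is the safer default. A minimal sketch (assumed surrounding code and values, not this commit's) of how a dict such as DEFAULT_CELERY_CONFIG is typically applied to a Celery app:

# Sketch only: app.conf.update is the standard Celery way to apply a dict of settings.
from celery import Celery

DEFAULT_CELERY_CONFIG = {
    'accept_content': ['json'],   # pickle removed by this commit
    'event_serializer': 'json',
    'worker_prefetch_multiplier': 1,
    'task_acks_late': True,
    'broker_url': 'redis://redis:6379/0',
    'result_backend': 'db+postgresql://postgres:airflow@postgres/airflow',
}

app = Celery('airflow.executors.celery_executor')
app.conf.update(DEFAULT_CELERY_CONFIG)
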
