Commit: add kafka broker docs (#8935)
* add kafka broker docs

* modify config options to be more accurate

* add additional documentation on findings

* update config and add limitations

* sasl
thuibr committed May 1, 2024
1 parent 90ff2e1 commit 5386e3e
Showing 2 changed files with 85 additions and 0 deletions.
3 changes: 3 additions & 0 deletions docs/getting-started/backends-and-brokers/index.rst
@@ -20,6 +20,7 @@ Broker Instructions
rabbitmq
redis
sqs
kafka

.. _broker-overview:

@@ -41,6 +42,8 @@ individual transport (see :ref:`broker_toc`).
+---------------+--------------+----------------+--------------------+
| *Zookeeper* | Experimental | No | No |
+---------------+--------------+----------------+--------------------+
| *Kafka* | Experimental | No | No |
+---------------+--------------+----------------+--------------------+

Experimental brokers may be functional but they don't have
dedicated maintainers.
82 changes: 82 additions & 0 deletions docs/getting-started/backends-and-brokers/kafka.rst
@@ -0,0 +1,82 @@
.. _broker-kafka:

=============
Using Kafka
=============

.. _broker-Kafka-installation:

Configuration
=============

For celeryconfig.py:

.. code-block:: python

    import os

    task_serializer = 'json'

    broker_transport_options = {
        # "allow_create_topics": True,
    }

    broker_connection_retry_on_startup = True

    # For using SQLAlchemy as the backend
    # result_backend = 'db+postgresql://postgres:example@localhost/postgres'

    broker_transport_options.update({
        "security_protocol": "SASL_SSL",
        "sasl_mechanism": "SCRAM-SHA-512",
    })

    sasl_username = os.environ["SASL_USERNAME"]
    sasl_password = os.environ["SASL_PASSWORD"]
    broker_url = f"confluentkafka://{sasl_username}:{sasl_password}@broker:9094"

    kafka_admin_config = {
        "sasl.username": sasl_username,
        "sasl.password": sasl_password,
    }

    kafka_common_config = {
        "sasl.username": sasl_username,
        "sasl.password": sasl_password,
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "SCRAM-SHA-512",
        "bootstrap_servers": "broker:9094",
    }

Note that ``"allow_create_topics"`` is required if the topic does not exist
yet, and can be omitted otherwise.
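If the topics do not exist yet (for example, on a first run against a fresh
broker), the option can be enabled; a minimal sketch, keeping the rest of the
configuration above unchanged:

.. code-block:: python

    # Let the transport create missing topics on first use; this line can
    # be removed once the topics exist.
    broker_transport_options = {
        "allow_create_topics": True,
    }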

For tasks.py:

.. code-block:: python

    from celery import Celery

    app = Celery('tasks')
    app.config_from_object('celeryconfig')

    @app.task
    def add(x, y):
        return x + y

Auth
====

The SASL username and password are read from the ``SASL_USERNAME`` and
``SASL_PASSWORD`` environment variables, as shown in the configuration above.
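A small standalone sketch of that pattern, with a hypothetical ``require_env``
helper (not part of Celery) that fails fast when a credential is missing; the
literal values below are placeholders:

.. code-block:: python

    import os

    def require_env(*names):
        """Return the given environment variables, raising a clear error
        if any are missing (hypothetical helper for illustration)."""
        missing = [n for n in names if n not in os.environ]
        if missing:
            raise RuntimeError(f"missing environment variables: {missing}")
        return [os.environ[n] for n in names]

    # Placeholder values so the sketch is self-contained; in a real
    # deployment these are set by the environment or a secret manager.
    os.environ.setdefault("SASL_USERNAME", "myuser")
    os.environ.setdefault("SASL_PASSWORD", "mypassword")

    sasl_username, sasl_password = require_env("SASL_USERNAME", "SASL_PASSWORD")

Failing at startup gives a clearer error than letting the broker connection
fail later with a less obvious authentication message.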

Further Info
============

Celery queues map to Kafka topics. For example, if a queue is named
``add_queue``, then a topic named ``add_queue`` will be created and used in
Kafka.
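Routing a task to a specific queue (and therefore a specific Kafka topic) uses
Celery's standard routing settings; a sketch using the ``add_queue`` name from
the example above:

.. code-block:: python

    # In celeryconfig.py: messages for tasks.add go to the "add_queue"
    # queue, which the Kafka transport backs with a topic of the same name.
    task_routes = {
        'tasks.add': {'queue': 'add_queue'},
    }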

For canvas workflows, when using a result backend that supports them, the
typical primitives such as chain, group, and chord appear to work.
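A sketch of building such workflows with an ``add`` task like the one above;
this uses a throwaway in-memory broker so the signatures can be constructed
(and run eagerly with ``.apply()``) without a Kafka connection:

.. code-block:: python

    from celery import Celery, chain, group

    # Throwaway app so the sketch is self-contained; in practice the
    # Kafka-configured app from celeryconfig.py is used instead.
    app = Celery('canvas_example', broker='memory://')

    @app.task
    def add(x, y):
        return x + y

    # chain: add(1, 2) runs first, and its result is fed to add(result, 3).
    workflow = chain(add.s(1, 2), add.s(3))

    # group: independent invocations that can run in parallel.
    parallel = group(add.s(i, i) for i in range(3))

    # .apply() executes eagerly in-process, useful for smoke-testing a
    # workflow before sending it through the broker.
    result = workflow.apply()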


Limitations
===========

Currently, using Kafka as a broker limits you to a single worker; see
https://github.com/celery/kombu/issues/1785.
