Celery task does not get send to broker #6661
Comments
Hey @Krogsager 👋. We also offer priority support for our sponsors.
It seems like we can't connect to the broker.
Hi @thedrow. I need some help debugging this; I cannot decipher the source code. I debugged this line in the traceback:

```
File "/usr/local/lib/python3.7/socket.py", line 752, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
```

The host and port are '127.0.0.1' and 5672. Why, then, should the code throw?
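Since `getaddrinfo` resolves the host fine, it can help to separate name resolution from the actual TCP connect when diagnosing "can't reach the broker" symptoms. A minimal stdlib sketch (the helper name is mine, not from Celery or Kombu):

```python
import socket

def broker_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds.

    Distinguishes DNS failures (socket.gaierror) from connection
    failures such as 'connection refused' or a timeout.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except socket.gaierror:
        print(f"DNS resolution failed for {host!r}")
        return False
    except OSError as exc:
        print(f"TCP connect to {host}:{port} failed: {exc}")
        return False

if __name__ == "__main__":
    # True only if something is actually listening on the broker port.
    print(broker_reachable("127.0.0.1", 5672))
```

If this returns `True` but Celery still hangs, the problem is above the socket layer (credentials, vhost, or configuration), not connectivity.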
Kombu does not seem to be the exact problem. I'm running this simple script and the broker receives the message, albeit as "Unroutable":

```python
from os import getenv

from kombu import Connection

x = getenv('CELERY_BROKER')
conn = Connection(x)
conn.connect()
producer = conn.Producer()
y = producer.publish({'hello': 'world'})
print(y.failed)
```
This is strange.
@thedrow I ran the sample code here and it ran perfectly. Created the exchange and queue:

```python
from os import getenv

from kombu import Connection, Exchange, Queue

media_exchange = Exchange('media', 'direct', durable=True)
video_queue = Queue('video', exchange=media_exchange, routing_key='video')

def process_media(body, message):
    print(body)
    message.ack()

# connections
env_broker = getenv('CELERY_BROKER')
print(f"connect to {env_broker}")

with Connection(env_broker) as conn:
    # produce
    producer = conn.Producer(serializer='json')
    producer.publish({'name': '/tmp/lolcat1.avi', 'size': 1301013},
                     exchange=media_exchange, routing_key='video',
                     declare=[video_queue])

print("done.")
```
In the configuration you supplied in the original post you're not pointing to localhost. Can you try running that code example in a clean VM? Maybe your system is misconfigured?
The system configuration is as described in my original post. I have noticed that the different components interpret the Celery broker string differently.
I am sure that the broker is listening, because my Kombu tests from yesterday (here and here) run in container A, not to mention the Celery worker process.
I will get back to you ASAP.
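When components seem to interpret the broker string differently, printing exactly what each side parses out of the URL can settle it. A small stdlib sketch (the helper name and the example URL are mine):

```python
from urllib.parse import urlparse

def describe_broker_url(url: str) -> dict:
    """Break an AMQP-style URL into the parts a client would use."""
    parts = urlparse(url)
    return {
        "scheme": parts.scheme,
        "host": parts.hostname,
        "port": parts.port,
        "userinfo": parts.username,
        # a trailing "//" means the default vhost "/"
        "vhost": parts.path.lstrip("/") or "/",
    }

print(describe_broker_url("amqp://guest:guest@rabbitmq:5672//"))
```

Running this in each container against the same `CELERY_BROKER` value shows whether the host/port every component resolves really match.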
That's strange and it may definitely be a bug.
Yes, I can call a task from container A:

```
~$ celery call -a "[3]" foo_task
74c69e81-7903-4efe-be02-d864c68756bd
```
And from outside the container?
Can you give an example?
The correct environment variable is not the one you're using. If there's a mistake in the documentation, please let us know.
@thedrow I disagree that this issue is invalid: I do not rely on the default variable.

```python
# celery_tasks.py
# ...imports
app = Celery('celery_statst_api')
app.config_from_object(celeryconfig)  # import config file
```

and my config file looks like this:

```python
# celeryconfig.py
from os import environ

broker = environ.get('CELERY_BROKER', 'default')
result_backend = 'db+postgresql://docker:************@pg_db:5432'
task_serializer = 'json'
result_serializer = 'json'
celery_accept_content = ['json']
# ...
```
Why do you need a special environment variable?
Also, it's 'broker_url'.
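For reference, a sketch of the corrected `celeryconfig.py` under the same environment-variable scheme. The fallback URL is an assumption of mine, and note that the new-style lowercase setting for accepted content is `accept_content`, not `celery_accept_content`:

```python
# celeryconfig.py (sketch): the setting Celery reads is broker_url;
# a key named just `broker` is silently ignored, so Celery would
# fall back to localhost.
from os import environ

broker_url = environ.get('CELERY_BROKER',
                         'amqp://guest:guest@localhost:5672//')
result_backend = 'db+postgresql://docker:************@pg_db:5432'
task_serializer = 'json'
result_serializer = 'json'
accept_content = ['json']
```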
That was the issue! A simple misconfiguration. I wrote
Sometimes I test several systems in the same environment, and therefore I need to separate the run settings.
Based on my headaches with the silent revert to `localhost`, I am submitting this PR. The developer should be notified if their host settings are not found. Details on the issue are here: celery/celery#6661
Yes, that would be nice.
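The fail-loud behaviour the PR argues for can also be approximated in user code today. A minimal sketch, assuming the same `CELERY_BROKER` variable used in this thread (the helper function is hypothetical, not a Celery API):

```python
from os import environ

def require_broker_url(var: str = "CELERY_BROKER") -> str:
    """Fail loudly instead of silently falling back to localhost."""
    url = environ.get(var)
    if not url:
        raise RuntimeError(
            f"{var} is not set; refusing to fall back to amqp://localhost"
        )
    return url
```

Wiring `broker_url = require_broker_url()` into the config file turns a misconfigured environment into an immediate, visible error at startup.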
Checklist

- I have verified that the issue exists against the `master` branch of Celery.
- I have read the relevant section in the contribution guide on reporting bugs.
- I have checked the issues list for similar or identical bug reports.
- I have checked the pull requests list for existing proposed fixes.
- I have checked the commit log to find out if the bug was already fixed in the `master` branch.
- I have included all related issues and possible duplicate issues in this issue (If there are none, check this box anyway).
Mandatory Debugging Information

- I have included the output of `celery -A proj report` in the issue (if you are not able to do this, then at least specify the Celery version affected).
- I have verified that the issue exists against the `master` branch of Celery.
- I have included the contents of `pip freeze` in the issue.
- I have included all the versions of all the external dependencies required to reproduce this bug.
Optional Debugging Information

- I have tried reproducing the issue after downgrading and/or upgrading Celery and its dependencies.
Related Issues and Possible Duplicates

Related Issues

- #5969

Possible Duplicates

- My post: https://stackoverflow.com/questions/66462079/celery-task-does-not-get-send-to-broker
Environment & Settings

Celery version: 5.0.5

`celery report` Output:

Steps to Reproduce
Required Dependencies

Python Packages

`pip freeze` Output:

Other Dependencies

N/A
Broker details
Expected Behavior
.delay and .apply_async tasks are sent to the RabbitMQ broker.
Actual Behavior
When I try to send my task to the broker (RabbitMQ) it hangs.
If I run the task synchronously it works as expected.
If I interrupt `.apply_async()` with Ctrl+C I get a traceback with some clues.
The broker connection string looks like this in the system:
The broker connection string in Python:
Before you suggest that RabbitMQ is not running or that the connection string is bad: my Celery worker (consumer) process is able to connect with the same connection string.
This is how I connect the app/producer to the broker.
The file celeryconfig.py contains the setup for the broker URL, result backend, concurrency, etc.
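As the thread's resolution showed, the root cause was a mistyped setting name. A toy stand-in (not Celery's real config loader) illustrating the failure mode: a loader only consults the names it knows, so an attribute named `broker` is invisible and the built-in localhost default silently wins.

```python
from types import SimpleNamespace

DEFAULT_BROKER = "amqp://guest:guest@localhost:5672//"

def effective_broker(config) -> str:
    # Only `broker_url` is consulted; any other attribute name is ignored.
    return getattr(config, "broker_url", DEFAULT_BROKER)

typoed = SimpleNamespace(broker="amqp://guest:guest@rabbitmq:5672//")
correct = SimpleNamespace(broker_url="amqp://guest:guest@rabbitmq:5672//")

print(effective_broker(typoed))   # silently falls back to localhost
print(effective_broker(correct))  # uses rabbitmq
```

This is exactly why the worker (configured correctly elsewhere) connected while the producer hung on localhost, and why the linked PR asks for a warning instead of a silent fallback.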