
Bug relating to kombu 4.6.0 with determining active tasks #1051

Closed
castrapel opened this issue May 30, 2019 · 5 comments

castrapel commented May 30, 2019

The code below works with kombu 4.5.0 but not 4.6.0. The problem shows up through Celery's inspect object: active_tasks = i.active() always seems to return None with 4.6.0.

In the past, we would receive a list of the active tasks.

def is_task_active(fun, task_id, args):
    from celery.task.control import inspect
    i = inspect()
    active_tasks = i.active()
    for _, tasks in active_tasks.items():
        for task in tasks:
            if task.get("id") == task_id:
                continue
            if task.get("name") == fun and task.get("args") == str(args):
                return True
    return False

FWIW, we are seeing this issue on Python 3.7 with a service called Lemur (you can see the dependencies in the requirements files here: https://github.com/Netflix/lemur).

In 4.5.0, i.active() returns a dict with celery worker hosts as the keys, and a list of their tasks as values.

<class 'dict'>: {'celery@worker-host': [{'id': '1e8743d6-8375-453a-83e2-9cc4af1415ae', 'name': 'lemur.common.celery.fetch_all_pending_acme_certs', 'args': '()', 'kwargs': '{}', 'type': 'lemur.common.celery.fetch_all_pending_acme_certs', 'hostname': 'celery@worker-host', 'time_start': 1559248134.1262424, 'acknowledged': True, 'delivery_info': {'exchange': '', 'routing_key': 'celery', 'priority': 0, 'redelivered': None}, 'worker_pid': 87281}, {'id': '52149d6e-6a81-4368-8ddb-dcad01585fd3', 'name': 'lemur.common.celery.fetch_acme_cert', 'args': '(669,)', 'kwargs': '{}', 'type': 'lemur.common.celery.fetch_acme_cert', 'hostname': 'celery@worker-host', 'time_start': 1559248137.8084652, 'acknowledged': True, 'delivery_info': {'exchange': '', 'routing_key': 'celery', 'priority': 0, 'redelivered': None}, 'worker_pid': 87284}]}
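
For anyone hitting this in the meantime, a minimal defensive sketch of the helper above (the None fallback is just a guard, not a fix: it avoids the crash on .items() but will still see no tasks while the regression is present):

def is_task_active(fun, task_id, args):
    from celery.task.control import inspect

    i = inspect()
    # With kombu 4.6.0, active() can come back as None instead of a dict,
    # so fall back to an empty dict rather than crash on .items()
    active_tasks = i.active() or {}
    for _, tasks in active_tasks.items():
        for task in tasks:
            if task.get("id") == task_id:
                continue  # skip the task doing the asking
            if task.get("name") == fun and task.get("args") == str(args):
                return True
    return False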


auvipy commented May 31, 2019

So Celery is facing a bug with this, right? Could you try Celery master with kombu 4.6? And if possible, try to track down the root of the regression?
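
(For reference, something along these lines should get that combination installed for testing; the exact commands are just a suggestion, not an official procedure:)

$ pip install git+https://github.com/celery/celery.git
$ pip install kombu==4.6.0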

auvipy added this to the 4.6.0 milestone May 31, 2019
@dsalisbury

@auvipy I'm pretty sure I've got celery master and kombu 4.6.0 installed, and I can recreate this in my application too. I haven't tried to distill it to a small reproducible example, but I can reproduce it all the same.

@dsalisbury

I've now recreated it in isolation.

Sample script:

$ cat tasks.py
from celery import Celery
from celery.signals import celeryd_after_setup


@celeryd_after_setup.connect
def setuplog(*a, **k):
    import logging

    logging.getLogger("kombu").setLevel(logging.DEBUG)
    logging.getLogger("celery").setLevel(logging.DEBUG)


app = Celery(broker="redis://localhost/2")


@app.task
def hi():
    return "hi"

Environment setup:

$ docker run --rm -d -p 6379:6379 redis
... container id ...
$ python3 -m venv e
$ git clone https://github.com/celery/celery.git src/celery
Cloning into 'src/celery'...
remote: Enumerating objects: 85556, done.
remote: Total 85556 (delta 0), reused 0 (delta 0), pack-reused 85556
Receiving objects: 100% (85556/85556), 28.03 MiB | 1.52 MiB/s, done.
Resolving deltas: 100% (64182/64182), done.
$ e/bin/pip3 install -e src/celery
Obtaining file:///Users/dsalisbury/src/scratch/kombu-docker-thing/src/celery
  Installing build dependencies ... done
Collecting pytz>dev (from celery==4.3.0)
  Using cached https://files.pythonhosted.org/packages/3d/73/fe30c2daaaa0713420d0382b16fbb761409f532c56bdcc514bf7b6262bb6/pytz-2019.1-py2.py3-none-any.whl
Collecting billiard<4.0,>=3.6.0 (from celery==4.3.0)
Collecting kombu<5.0,>=4.6.0 (from celery==4.3.0)
  Using cached https://files.pythonhosted.org/packages/e7/b1/19cc103250276cf4319f8e4246afb307886ce676d5c06dab0680c6d3e3f9/kombu-4.6.0-py2.py3-none-any.whl
Collecting vine==1.3.0 (from celery==4.3.0)
  Using cached https://files.pythonhosted.org/packages/7f/60/82c03047396126c8331ceb64da1dc52d4f1317209f32e8fe286d0c07365a/vine-1.3.0-py2.py3-none-any.whl
Collecting amqp<3.0,>=2.5.0 (from kombu<5.0,>=4.6.0->celery==4.3.0)
  Using cached https://files.pythonhosted.org/packages/45/91/877ea2d1c3234dbb32e818d281615232468ff8a4a783a8fc264862b9f77b/amqp-2.5.0-py2.py3-none-any.whl
Installing collected packages: pytz, billiard, vine, amqp, kombu, celery
  Running setup.py develop for celery
Successfully installed amqp-2.5.0 billiard-3.6.0.0 celery kombu-4.6.0 pytz-2019.1 vine-1.3.0
You are using pip version 18.1, however version 19.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
$ e/bin/pip3 install redis
Collecting redis
  Using cached https://files.pythonhosted.org/packages/ac/a7/cff10cc5f1180834a3ed564d148fb4329c989cbb1f2e196fc9a10fa07072/redis-3.2.1-py2.py3-none-any.whl
Installing collected packages: redis
Successfully installed redis-3.2.1
You are using pip version 18.1, however version 19.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
$ e/bin/pip3 install kombu==4.6.0
Requirement already satisfied: kombu==4.6.0 in ./e/lib/python3.7/site-packages (4.6.0)
Requirement already satisfied: amqp<3.0,>=2.5.0 in ./e/lib/python3.7/site-packages (from kombu==4.6.0) (2.5.0)
Requirement already satisfied: vine<5.0.0a1,>=1.1.3 in ./e/lib/python3.7/site-packages (from amqp<3.0,>=2.5.0->kombu==4.6.0) (1.3.0)
You are using pip version 18.1, however version 19.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.

Running the worker:

$ e/bin/celery -A tasks worker --loglevel=DEBUG
[2019-06-05 13:21:58,024: DEBUG/MainProcess] | Worker: Preparing bootsteps.
[2019-06-05 13:21:58,029: DEBUG/MainProcess] | Worker: Building graph...
[2019-06-05 13:21:58,029: DEBUG/MainProcess] | Worker: New boot order: {Timer, Hub, Pool, Autoscaler, Beat, StateDB, Consumer}
[2019-06-05 13:21:58,076: DEBUG/MainProcess] | Consumer: Preparing bootsteps.
[2019-06-05 13:21:58,076: DEBUG/MainProcess] | Consumer: Building graph...
[2019-06-05 13:21:58,110: DEBUG/MainProcess] | Consumer: New boot order: {Connection, Events, Heart, Mingle, Tasks, Control, Agent, Gossip, event loop}

 -------------- celery@C02WW2AAHV2T v4.3.0 (rhubarb)
---- **** -----
--- * ***  * -- Darwin-18.6.0-x86_64-i386-64bit 2019-06-05 13:21:58
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         __main__:0x1111ce550
- ** ---------- .> transport:   redis://localhost:6379/2
- ** ---------- .> results:     disabled://
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . celery.accumulate
  . celery.backend_cleanup
  . celery.chain
  . celery.chord
  . celery.chord_unlock
  . celery.chunks
  . celery.group
  . celery.map
  . celery.starmap
  . tasks.hi

[2019-06-05 13:21:58,138: DEBUG/MainProcess] | Worker: Starting Hub
[2019-06-05 13:21:58,138: DEBUG/MainProcess] ^-- substep ok
[2019-06-05 13:21:58,138: DEBUG/MainProcess] | Worker: Starting Pool
[2019-06-05 13:21:58,213: DEBUG/MainProcess] ^-- substep ok
[2019-06-05 13:21:58,213: DEBUG/MainProcess] | Worker: Starting Consumer
[2019-06-05 13:21:58,213: DEBUG/MainProcess] | Consumer: Starting Connection
[2019-06-05 13:21:58,239: INFO/MainProcess] Connected to redis://localhost:6379/2
[2019-06-05 13:21:58,239: DEBUG/MainProcess] ^-- substep ok
[2019-06-05 13:21:58,240: DEBUG/MainProcess] | Consumer: Starting Events
[2019-06-05 13:21:58,249: DEBUG/MainProcess] ^-- substep ok
[2019-06-05 13:21:58,249: DEBUG/MainProcess] | Consumer: Starting Heart
[2019-06-05 13:21:58,253: DEBUG/MainProcess] ^-- substep ok
[2019-06-05 13:21:58,253: DEBUG/MainProcess] | Consumer: Starting Mingle
[2019-06-05 13:21:58,253: INFO/MainProcess] mingle: searching for neighbors
[2019-06-05 13:21:59,282: INFO/MainProcess] mingle: all alone
[2019-06-05 13:21:59,282: DEBUG/MainProcess] ^-- substep ok
[2019-06-05 13:21:59,283: DEBUG/MainProcess] | Consumer: Starting Tasks
[2019-06-05 13:21:59,294: DEBUG/MainProcess] ^-- substep ok
[2019-06-05 13:21:59,294: DEBUG/MainProcess] | Consumer: Starting Control
[2019-06-05 13:21:59,306: DEBUG/MainProcess] ^-- substep ok
[2019-06-05 13:21:59,306: DEBUG/MainProcess] | Consumer: Starting Gossip
[2019-06-05 13:21:59,318: DEBUG/MainProcess] ^-- substep ok
[2019-06-05 13:21:59,318: DEBUG/MainProcess] | Consumer: Starting event loop
[2019-06-05 13:21:59,318: DEBUG/MainProcess] | Worker: Hub.register Pool...
[2019-06-05 13:21:59,319: INFO/MainProcess] celery@C02WW2AAHV2T ready.
[2019-06-05 13:21:59,319: DEBUG/MainProcess] basic.qos: prefetch_count->16


### AT THIS POINT I RUN THE FOLLOWING AS SHOWN BELOW
### $ e/bin/celery -A tasks inspect ping

[2019-06-05 13:22:18,713: DEBUG/MainProcess] pidbox received method ping() [reply_to:{'exchange': 'reply.celery.pidbox', 'routing_key': '24f87c4e-c4ef-3f95-b4ca-40eef76fad77'} ticket:8665bfc7-db0d-466a-a4bf-c5479ddc1649]

### NOTHING HAPPENS

### AT THIS POINT I RUN THE FOLLOWING AS SHOWN BELOW
### $ e/bin/celery -A tasks inspect report

[2019-06-05 13:22:22,176: DEBUG/MainProcess] pidbox received method registered(taskinfoitems=[]) [reply_to:{'exchange': 'reply.celery.pidbox', 'routing_key': 'ba42179e-90ce-35b6-97b8-a332433720ab'} ticket:b0dafa8f-3734-4fa4-af07-f9611f677235]

### NOTHING HAPPENS


### AT THIS POINT I RUN THE FOLLOWING AS SHOWN BELOW
### $ e/bin/celery -A tasks call tasks.hi

[2019-06-05 13:22:54,277: INFO/MainProcess] Received task: tasks.hi[9c2f5ec8-659e-49c0-b827-1f51c712499b]
[2019-06-05 13:22:54,278: DEBUG/MainProcess] TaskPool: Apply <function _fast_trace_task at 0x1114d4d90> (args:('tasks.hi', '9c2f5ec8-659e-49c0-b827-1f51c712499b', {'lang': 'py', 'task': 'tasks.hi', 'id': '9c2f5ec8-659e-49c0-b827-1f51c712499b', 'shadow': None, 'eta': None, 'expires': None, 'group': None, 'retries': 0, 'timelimit': [None, None], 'root_id': '9c2f5ec8-659e-49c0-b827-1f51c712499b', 'parent_id': None, 'argsrepr': '()', 'kwargsrepr': '{}', 'origin': 'gen27544@C02WW2AAHV2T', 'reply_to': '721b6fae-d736-3cdd-8373-0b21b48bf143', 'correlation_id': '9c2f5ec8-659e-49c0-b827-1f51c712499b', 'hostname': 'celery@C02WW2AAHV2T', 'delivery_info': {'exchange': '', 'routing_key': 'celery', 'priority': 0, 'redelivered': None}, 'args': [],'kwargs': {}}, b'[[], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord": null}]', 'application/json', 'utf-8') kwargs:{})
[2019-06-05 13:22:54,280: DEBUG/MainProcess] Task accepted: tasks.hi[9c2f5ec8-659e-49c0-b827-1f51c712499b] pid:26963
[2019-06-05 13:22:54,281: INFO/ForkPoolWorker-2] Task tasks.hi[9c2f5ec8-659e-49c0-b827-1f51c712499b] succeeded in 0.0005218829999975583s: 'hi'

Me trying to inspect unsuccessfully, then successfully running a task:

$ e/bin/celery -A tasks inspect ping
Error: No nodes replied within time constraint.
$ e/bin/celery -A tasks inspect registered
Error: No nodes replied within time constraint.
$ e/bin/celery -A tasks call tasks.hi
9c2f5ec8-659e-49c0-b827-1f51c712499b
$ e/bin/celery -A tasks report
software -> celery:4.3.0 (rhubarb) kombu:4.6.0 py:3.7.2
            billiard:3.6.0.0 redis:3.2.1
platform -> system:Darwin arch:64bit
            kernel version:18.6.0 imp:CPython
loader   -> celery.loaders.app.AppLoader
settings -> transport:redis results:disabled

broker_url: 'redis://localhost:6379/2'


FauxFaux commented Jun 6, 2019

(This is just a +1!) We have seen this in our product, too; pretty much the same as above, with ping failing to find anything. Also Python 3.7, on Linux and macOS, with the Redis backend. Pinning kombu <4.6 has resolved it.
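
For anyone needing the same stopgap, a pin along these lines in the requirements file should do it until a fixed kombu release is out:

kombu<4.6  # workaround for celery/kombu#1051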


This is probably resolved on the master branch; the change that caused #1052 was reverted.
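
(If anyone wants to confirm before the next release, installing kombu straight from master should be enough to check; just a suggestion, not something verified here:)

$ pip install git+https://github.com/celery/kombu.git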

auvipy closed this as completed Jun 12, 2019
hosseinsh added a commit to hosseinsh/lemur that referenced this issue Jan 3, 2020
removing this pin, since the issue is now resolved.
kombu<4.6.0 # Bug with inspecting active tasks: celery/kombu#1051