
soft_time_limit not working on certain scenarios #3878

Closed

alilucioboz opened this issue Mar 1, 2017 · 2 comments

Comments

alilucioboz commented Mar 1, 2017

Checklist

  • I have included the output of celery -A proj report in the issue.
    (if you are not able to do this, then at least specify the Celery
    version affected).
software -> celery:4.0.2 (latentcall) kombu:4.0.2 py:2.7.12
            billiard:3.5.0.2 memory:N/A
platform -> system:Darwin arch:64bit imp:CPython
loader   -> celery.loaders.app.AppLoader
settings -> transport:memory results:disabled

broker_url: u'memory://localhost//'
  • I have verified that the issue exists against the master branch of Celery.
> pip freeze
amqp==2.1.4
anyjson==0.3.3
appdirs==1.4.2
billiard==3.5.0.2
boto==2.46.1
celery==4.0.2
kombu==4.0.2
packaging==16.8
psycopg2==2.6.2
pycurl==7.43.0
pyparsing==2.1.10
pytz==2016.10
six==1.10.0
vine==1.1.3

Steps to reproduce

Add the following to tasks.py:

from celery import Celery
from celery.exceptions import SoftTimeLimitExceeded

import psycopg2
import time

app = Celery('tasks', broker='memory://')

@app.task(soft_time_limit=5, time_limit=8)
def check_table():
    print "Starting"
    c = psycopg2.connect(
            host='localhost',
            user='db_user',
            password='db_password',
            dbname='db_name'
        ).cursor()
    
    try:
        c.execute("select * from celery_table")
        print "Done"
    except SoftTimeLimitExceeded:
        print "Caught soft time limit"
        time.sleep(10)

check_table.delay()

There are two problems with soft_time_limit:
Problem 1: in certain scenarios where the worker is frozen, soft_time_limit is unable to properly stop the worker.
Problem 2: the hard time_limit doesn't take effect once the soft_time_limit has already been exceeded (e.g. the table is locked, the soft_time_limit expires, but nothing happens, and the hard time_limit is never triggered).

  1. Lock the table: begin; lock table celery_table IN ACCESS EXCLUSIVE MODE; (a programmatic equivalent is sketched after this list)
  2. Start celery: celery -A tasks worker --loglevel=info
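
If you prefer to hold the lock from Python rather than a psql session, a sketch along these lines (my own, not from the report; connection parameters match the task above) does the same thing:

import psycopg2

# psycopg2 opens a transaction implicitly on the first execute(), so the
# ACCESS EXCLUSIVE lock is held until this connection commits, rolls back
# or disconnects.
conn = psycopg2.connect(host='localhost', user='db_user',
                        password='db_password', dbname='db_name')
conn.cursor().execute("LOCK TABLE celery_table IN ACCESS EXCLUSIVE MODE")

# Keep the transaction open; start the worker now and check_table's SELECT
# will block behind the lock.
raw_input("Press Enter to release the lock...")  # input() on Python 3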

Expected behavior

The soft_time_limit should stop the frozen worker; when time_limit is used on its own, it works fine. The hard time_limit should also stop the worker, but it doesn't.

Actual behavior

The worker remains frozen.

chenfengyuan (Contributor) commented Jan 4, 2018

Celery uses a signal (SIGUSR1) to raise the exception in the worker, and psycopg2's libpq C code does not react to Python signals.
According to the article Cancelling PostgreSQL statements from Python, simply calling the patch function below from your Celery worker initialization code will make the soft limit exception be raised inside a long-running PostgreSQL query.
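
As background, here is a minimal illustrative sketch (my own, not Celery's actual internals) of why a SIGUSR1-based soft limit cannot fire while the worker is stuck inside a C call: CPython only runs Python-level signal handlers between bytecode instructions, so the exception is deferred until the blocking libpq call returns.

import signal

class SoftTimeLimitExceeded(Exception):
    """Stand-in for celery.exceptions.SoftTimeLimitExceeded."""

def on_soft_timeout(signum, frame):
    # Runs only when the interpreter is executing Python bytecode again;
    # while the process is blocked inside libpq waiting on the locked
    # table, this handler (and hence the exception) is postponed.
    raise SoftTimeLimitExceeded()

signal.signal(signal.SIGUSR1, on_soft_timeout)

With that in mind, the patch from the article is: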

from select import select
from psycopg2.extensions import POLL_OK, POLL_READ, POLL_WRITE
import psycopg2


def wait_select_inter(conn):
    # Wait callback: drive the connection with poll()/select() so the
    # worker waits in Python-interruptible select() calls instead of
    # blocking inside libpq.
    while True:
        try:
            state = conn.poll()
            if state == POLL_OK:
                break
            elif state == POLL_READ:
                select([conn.fileno()], [], [])
            elif state == POLL_WRITE:
                select([], [conn.fileno()], [])
            else:
                raise conn.OperationalError(
                    "bad state from poll: %s" % state)
        except Exception as e:
            # An exception raised here (e.g. SoftTimeLimitExceeded from the
            # SIGUSR1 handler) cancels the running statement.
            del e
            conn.cancel()
            # the loop will be broken by a server error
            continue


def patch():
    # Install the interruptible wait callback globally for psycopg2.
    psycopg2.extensions.set_wait_callback(wait_select_inter)

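For example (a sketch, not from this thread), patch() could be installed at worker start-up with Celery's worker_process_init signal, so that every forked worker process gets the interruptible wait callback:

from celery.signals import worker_process_init

@worker_process_init.connect
def install_psycopg2_wait_callback(**kwargs):
    # Assumes patch() from the snippet above is defined or imported in this
    # module; installs the wait callback in each worker process so a soft
    # time limit can cancel an in-progress PostgreSQL query.
    patch()
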
auvipy (Member) commented Jan 15, 2018

@chenfengyuan thanks for your reply. Should we add something to the docs? What's your opinion?
