
Kombu 4.1.0 - Memory usage increase (leak?) on a worker when using kombu queues #844

Closed · sradhakrishna opened this issue Apr 6, 2018 · 13 comments

sradhakrishna (Contributor) commented Apr 6, 2018

Hi,

I have implemented a worker using Kombu's SimpleQueue; the implementation is given below. When I run this worker for a few hours on an Ubuntu 16.04 system with Redis as the backend, I notice a gradual memory build-up in the process. When I run it for over a day, it ends up consuming all memory on the system, and the system becomes unusable until the worker is killed.

The Redis server is configured with timeout set to 5 seconds and tcp-keepalive set to 60 seconds.

Worker Code:

from kombu import Connection

myqueue_name = 'test_queue'
backendURL = 'redis://127.0.0.1:6379/'

def GetConnection():
    # A new connection is created on every dequeue() call.
    return Connection(backendURL)

def dequeue():
    conn = GetConnection()
    with conn:
        myqueue = conn.SimpleQueue(myqueue_name)

        item = None

        try:
            qItem = myqueue.get(block=True, timeout=2)
            item = qItem.payload
            qItem.ack()
        except Exception:
            # Swallows the Empty exception raised when the 2-second timeout expires.
            qItem = None

        myqueue.close()

    # The with-block already releases the connection; these calls are redundant
    # but harmless.
    conn.close()
    conn.release()

    return item

if __name__ == '__main__':
    try:
        i = 1
        while True:
            print('Iteration %s: %s' % (i, dequeue()))
            i += 1
    except (KeyboardInterrupt, SystemExit):
        print('Terminating')

Here's a plot of free memory on the system:

[image: plot of free memory on the system over time]

What is going wrong here? Did I miss anything in the implementation?

Any help here will be greatly appreciated.

auvipy (Member) commented Apr 7, 2018

Could you try the 4.2 version from master and check if anything improves?

sradhakrishna (Contributor, Author) commented

Same behavior with 4.2 from master as well. Actually, it seems worse: memory piles up rather quickly.

Am I missing anything in the code above that would trigger the release of memory from objects, connections, queues, etc.?
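
For comparison, here is a minimal sketch (not from the report above) that reuses a single connection and queue for the lifetime of the worker instead of rebuilding them on every call; if the per-call setup is what accumulates state, this variant should hold memory flat:

from kombu import Connection
from queue import Empty  # SimpleQueue.get raises queue.Empty on timeout

backendURL = 'redis://127.0.0.1:6379/'

with Connection(backendURL) as conn:
    myqueue = conn.SimpleQueue('test_queue')
    try:
        i = 1
        while True:
            try:
                qItem = myqueue.get(block=True, timeout=2)
            except Empty:
                continue  # nothing arrived within 2 seconds; poll again
            print('Iteration %s: %s' % (i, qItem.payload))
            qItem.ack()
            i += 1
    except (KeyboardInterrupt, SystemExit):
        print('Terminating')
    finally:
        myqueue.close()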

auvipy (Member) commented Apr 7, 2018

Ask your question on the mailing list/IRC, referencing this issue.

sradhakrishna (Contributor, Author) commented

I am trying to post the question on the Google Groups mailing list, but somehow the post doesn't show up.

@auvipy Can you help confirm this behavior?
@everyone, any suggestions/pointers, please?

sradhakrishna (Contributor, Author) commented

Addressing this issue is quite important to me; it's pretty much a lines-down scenario for my use case.

Switching away from Kombu to something else would take quite some time, so I don't seem to have many options other than fixing this.

Is there any way I can get the community's attention on this issue?

Any help in addressing this issue will be greatly appreciated.

auvipy (Member) commented Apr 9, 2018

Could you try RabbitMQ? That might be a temporary solution.

sradhakrishna (Contributor, Author) commented

I've tried the same with RabbitMQ as the backend and saw the same behavior in that scenario too. It seems the issue might not be in the backend-specific implementation.

auvipy (Member) commented Apr 9, 2018

Try to find out where the leak is. I would also suggest installing all of Celery's dependencies from the master branch, to verify against master.
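
As one way to hunt for the leak, here is a minimal sketch using the standard library's tracemalloc (an illustration under the assumption that dequeue() is the function from the report above): snapshot allocations before and after a batch of calls and diff them.

import tracemalloc

tracemalloc.start(25)          # keep up to 25 stack frames per allocation
baseline = tracemalloc.take_snapshot()

for _ in range(10000):         # run enough iterations for growth to show
    dequeue()                  # the worker function from the report above

snapshot = tracemalloc.take_snapshot()
for stat in snapshot.compare_to(baseline, 'lineno')[:10]:
    print(stat)                # top allocation sites, sorted by growth

The allocation sites that keep growing across successive snapshots are the prime leak candidates.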

aviperetz34 commented

I am experiencing the same issue. It disappears when you don't use a timeout and just keep blocking.
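
For reference, a minimal sketch of that workaround, assuming the same broker and queue as the report above: call get() with no timeout, so the timeout path is never exercised.

from kombu import Connection

with Connection('redis://127.0.0.1:6379/') as conn:
    myqueue = conn.SimpleQueue('test_queue')
    while True:
        qItem = myqueue.get(block=True)  # no timeout: block until a message arrives
        print(qItem.payload)
        qItem.ack()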

auvipy added this to the 5.3 milestone (Sep 12, 2021)
auvipy (Member) commented Dec 22, 2021

@pawl

pawl (Contributor) commented Dec 22, 2021

@auvipy Nice find. It seems like this could definitely be related to celery/celery#4843 (comment).

pawl (Contributor) commented Dec 24, 2021

This may be fixed by: #1476

auvipy (Member) commented Dec 24, 2021

OK, merged; let's see.

auvipy modified the milestone: 5.3 → 5.2.x (Dec 24, 2021)
auvipy modified the milestone: 5.2.x → 5.3 (Apr 17, 2022)
auvipy closed this as completed (Apr 17, 2022)