

"Dictionary changed size during iteration" in client.PubSub.on_connect() #968

Closed
jyunis opened this issue Apr 17, 2018 · 11 comments
Closed

"Dictionary changed size during iteration" in client.PubSub.on_connect() #968

jyunis opened this issue Apr 17, 2018 · 11 comments
Labels

Comments


@jyunis jyunis commented Apr 17, 2018

Hi!

I'm using:
celery==4.1.0
redis==2.10.6

I'm running this on a fairly busy backend, and I'm seeing occasional 'dictionary changed size during iteration' errors (full trace below) during times of peak load. My guess is that this happens when PubSub.subscribe() and PubSub.on_connect() are called concurrently, thereby changing self.channels while on_connect() iterates over it.

for k, v in iteritems(self.channels):

RuntimeError: dictionary changed size during iteration
File "celery/app/task.py", line 413, in delay
return self.apply_async(args, kwargs)
File "celery/app/task.py", line 536, in apply_async
**options
File "celery/app/base.py", line 736, in send_task
self.backend.on_task_call(P, task_id)
File "celery/backends/redis.py", line 189, in on_task_call
self.result_consumer.consume_from(task_id)
File "celery/backends/redis.py", line 76, in consume_from
self._consume_from(task_id)
File "celery/backends/redis.py", line 82, in _consume_from
self._pubsub.subscribe(key)
File "redis/client.py", line 2482, in subscribe
ret_val = self.execute_command('SUBSCRIBE', *iterkeys(new_channels))
File "redis/client.py", line 2404, in execute_command
self._execute(connection, connection.send_command, *args)
File "redis/client.py", line 2415, in _execute
connection.connect()
File "redis/connection.py", line 502, in connect
callback(self)
File "redis/client.py", line 2374, in on_connect
for k, v in iteritems(self.channels):
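The failure itself can be reproduced without redis or threads at all; the sketch below (plain Python, assuming nothing about redis internals beyond the quoted loop) mutates a dict mid-iteration the same way a concurrent subscribe() would:

```python
# Simulate on_connect() iterating self.channels while a concurrent
# subscribe() adds a new channel key.
channels = {"celery-task-meta-a": None, "celery-task-meta-b": None}

it = iter(channels.items())            # on_connect(): for k, v in iteritems(self.channels)
next(it)                               # first iteration step succeeds
channels["celery-task-meta-c"] = None  # subscribe() adds a channel mid-loop
try:
    next(it)                           # next step detects the size change
except RuntimeError as e:
    print(e)                           # dictionary changed size during iteration
```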


@boblatino boblatino commented Apr 17, 2018

I am having the same issue on redis==2.10.5 and celery==4.0.0.


@amureki amureki commented Apr 24, 2018

django-redis==4.9.0
redis==2.10.6
celery==4.1.0

recently gave me the same exception during the peak load.


@asgoel asgoel commented May 9, 2018

Seeing the same exception:
redis==2.10.5
celery==4.0.2

With high load.


@flyhighplato flyhighplato commented Jun 19, 2018

Also seeing this:
redis==2.10.5
celery==4.1.0


@sposs sposs commented Oct 5, 2018

Same with redis 3.2.6, celery 4.2.0


@Gatsby-Lee Gatsby-Lee commented Nov 2, 2018

redis==2.10.5
celery==4.2.1

Contributor

@marc1n marc1n commented Dec 17, 2018

This is probably a race condition. The PubSub object is not thread-safe (as described here). An instance of the PubSub class should be used in one thread only, or access to it from multiple threads must be synchronized.

I think the same rule applies to the celery.backends.redis.RedisBackend object.
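One way to follow that synchronization rule, sketched below under the assumption that you control every call site: funnel all access to the shared PubSub through a single lock. `SynchronizedPubSub` is a hypothetical wrapper name, not part of redis-py or celery.

```python
import threading

class SynchronizedPubSub:
    """Hypothetical wrapper: serializes every call on a shared PubSub
    through one lock, so subscribe() can never mutate self.channels
    while the reconnect callback is iterating over it."""

    def __init__(self, pubsub):
        self._pubsub = pubsub
        self._lock = threading.Lock()

    def subscribe(self, *channels):
        with self._lock:
            return self._pubsub.subscribe(*channels)

    def unsubscribe(self, *channels):
        with self._lock:
            return self._pubsub.unsubscribe(*channels)

    def get_message(self, timeout=0):
        with self._lock:
            return self._pubsub.get_message(timeout=timeout)
```

Note that holding the lock across a blocking get_message() call with a long timeout would starve subscribe(), so a real fix would need a short timeout or finer-grained locking.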


@glazari glazari commented Jan 22, 2019

I've been having the same issue.

> This is probably a race condition. The PubSub object is not thread-safe (as described here). An instance of the PubSub class should be used in one thread only, or access to it from multiple threads must be synchronized.
>
> I think the same rule applies to the celery.backends.redis.RedisBackend object.

The error occurs when calling task.apply_async(...).

If it is not thread-safe, how should we use it in a Flask web app, or even within a celery worker, both of which use threads?
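One pattern that avoids sharing altogether in a threaded web app (a sketch, not something celery ships): keep one PubSub per thread with threading.local, so the non-thread-safe object is never touched by two threads. `PerThreadPubSub` and `make_pubsub` are hypothetical names; the factory stands in for whatever you use to create a PubSub, e.g. `redis_client.pubsub()`.

```python
import threading

class PerThreadPubSub:
    """Hypothetical helper: lazily creates a separate PubSub for each
    thread, so no instance is ever shared across threads."""

    def __init__(self, make_pubsub):
        self._make = make_pubsub          # zero-arg factory, supplied by the caller
        self._local = threading.local()   # independent attribute storage per thread

    def get(self):
        ps = getattr(self._local, "pubsub", None)
        if ps is None:
            ps = self._local.pubsub = self._make()
        return ps
```

The trade-off is one connection per thread, which may matter on a busy backend with many worker threads.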


@joaodlf joaodlf commented Apr 7, 2019

Any suggestions on how to tackle this?


@tartieret tartieret commented Dec 18, 2019

I am experiencing the same issue with Celery & Redis, very frequently. Has anyone been able to figure out a workaround?


@github-actions github-actions bot commented Dec 18, 2020

This issue is marked stale. It will be closed in 30 days if it is not updated.

@github-actions github-actions bot added the Stale label Dec 18, 2020
@github-actions github-actions bot closed this Jan 17, 2021