
Infinite loop while using queues for sentry (Received unregistered task of type 'sentry.tasks.store.store_event') #611

Closed
abhinavsingh opened this Issue Sep 16, 2012 · 7 comments

2 participants

@abhinavsingh

Hi,

I am configuring sentry for one of my projects. Yesterday I upgraded to sentry 5.0.11.1, and I can see an infinite loop of tasks being received in the background when using SENTRY_USE_QUEUE = True. Everything works as expected when not using queues for sentry logging.

Here are details of my setup:

Inside sentry.conf.py I have enabled queues:

SENTRY_USE_QUEUE = True
CELERY_IGNORE_RESULT = True
CELERY_SEND_EVENTS = False
CELERY_RESULT_BACKEND = 'amqp'
BROKER_URL = "amqp://guest:guest@localhost:5672"

Somewhere around the top of my django project settings.py, I have the following settings:

DEBUG = True
TEMPLATE_DEBUG = DEBUG

import djcelery
djcelery.setup_loader()
BROKER_URL = 'amqp://guest:guest@localhost:5672/'

SENTRY_DSN = "http://0de4f34df4dc4b1a8d081f6bd10872c3:cb12b0c371b34f75b6d941606348684a@logger." + DOMAIN + "/2"
RAVEN_CONFIG = {
    'dsn': SENTRY_DSN,
    'register_signals': True,
}

I also have the following middleware class enabled:
'raven.contrib.django.middleware.Sentry404CatchMiddleware',

and the following in my list of installed apps:
'raven.contrib.django',

I have started sentry:
sentry --config=deps/sentry/sentry.conf.py start

and celeryd for letting sentry use queues:
sentry --config=deps/sentry/sentry.conf.py celeryd --loglevel=debug

Finally, I also have celeryd running for my django application:
./manage.py celeryd worker --loglevel=debug

Now if I hit a 404, both celeryd workers go into an infinite loop of logging events, and the sentry server takes the resulting hit.

Here is a copy/paste from the sentry dashboard:

    celery.worker.consumer.on_task_received

    celery.worker.consumer Received unregistered task of type 'sentry.tasks.store.store_event'. The message has been ...
    3 minutes ago


Received unregistered task of type 'sentry.tasks.store.store_event'.
The message has been ignored and discarded.

Did you remember to import the module containing this task?
Or maybe you are using relative imports?
Please see http://bit.ly/gLye1c for more information.

The full contents of the message body was:
{'retries': 0, 'task': 'sentry.tasks.store.store_event', 'eta': None, 'args': [], 'expires': None, 'callbacks': None, 'errbacks': None, 'kwargs': {'data': {'time_spent': None, 'sentry.interfaces.Exception': {'type': 'KeyError', 'value': "'sentry.tasks.process_buffer.process_incr'", 'module': 'exceptions'}, 'server_name': 'abhinavsingh.local', 'culprit': 'celery.worker.consumer.on_task_received', 'level': 40, 'event_id': '21c74b652f484652b7276a15ee67046d', 'checksum': 'c69e321a865148d614ea58f54827290a', 'extra': {'thread': 140735164033376, 'process': 88061, 'threadName': 'MainThread', 'module': 'consumer', 'filename': 'consumer.py', 'processName': 'MainProcess', 'pathname': '/Library/Python/2.7/site-packages/celery/worker/consumer.py', 'lineno': 540, 'asctime': '2012-09-16 18:34:10,927', 'message': "\x1b[1;31mReceived unregistered task of type 'sentry.tasks.process_buffer.process_incr'.\nThe message has been ignored and discarded.\n\nDid you remember to import the module containing this task?\nOr maybe you... (5501b)

Stacktrace

    celery/worker/consumer.py in on_task_received

        except (KeyError, TypeError):
            return self.handle_unknown_message(body, message)
        try:
            strategies[name](message, body, message.ack_log_error)
        except KeyError, exc:
            self.handle_unknown_task(body, message, exc)

    Local vars

    body

    {'args': (),
     'callbacks': None,
     'errbacks': None,
     'eta': None,
     'expires': None,
     'id': '7edc013f-efd4-4b69-882f-9647fd8dbf4a',
     'kwargs': {'data': {'checksum': 'c69e321a865148d614ea58f54827290a',
                         'culprit': 'celery.worker.consumer.on_task_received',
                         'event_id': 'cf9f1343b4a64a458af54ae46857bb22',
                         'extra': {'asctime': <type 'str'>,
                                   'filename': <type 'str'>,
                                   'funcName': <type 'str'>,
                                   'lineno': <type 'int'>,
                                   'message': <type 'str'>,
                                   'module': <type 'str'>,
                                   'pathname': <type 'str'>,
                                   'process': <type 'int'>,
                                   'processName': <type 'str'>,
                                   'thread': <type 'int'>,
                                   'threadName': <type 'str'>},
                         'level': 40,
                         'logger': 'celery.worker.consumer',
                         'message': "\x1b[1;31mReceived unregistered task of type 'sentry.tasks.store.store_event'.\nThe message has been ign",
                         'modules': {'debug_toolbar': <type 'str'>,
                                     'django': <type 'str'>,
                                     'django_jenkins': <type 'str'>,
                                     'djangoratings': <type 'str'>,
                                     'djcelery': <type 'str'>,
                                     'gunicorn': <type 'str'>,
                                     'messages': <type 'str'>,
                                     'notification': <type 'str'>,
                                     'raven': <type 'str'>,
                                     'registration': <type 'str'>,
                                     'reversion': <type 'str'>,
                                     'social_auth': <type 'str'>},
                         'project': 2,
                         'sentry.interfaces.Exception': {'module': <type 'str'>,
                                                         'type': <type 'str'>,
                                                         'value': <type 'str'>},
                         'sentry.interfaces.Message': {'message': <type 'str'>,
                                                       'params': <type 'list'>},
                         'sentry.interfaces.Stacktrace': {'frames': <type 'list'>},
                         'server_name': 'abhinavsingh.local',
                         'site': None,
                         'tags': None,
                         'time_spent': None,
                         'timestamp': 'datetime.datetime(2012, 9, 16, 18, 34, 13)'}},
     'retries': 0,
     'task': 'sentry.tasks.store.store_event',
     'utc': True}

    callback            '<bound method ResultHandler.handle_event of <ResultHandler(Thread-6, initial daemon)>>'
    exc                 "KeyError('sentry.tasks.store.store_event',)"
    message             '<kombu.transport.amqplib.Message object at 0x107fbccc8>'
    name                'sentry.tasks.store.store_event'
    on_task_callbacks   ('<bound method ResultHandler.handle_event of <ResultHandler(Thread-6, initial daemon)>>',)
    self                '<celery.worker.consumer.Consumer object at 0x107fc9290>'
    strategies

    {'celery.backend_cleanup': '<function task_message_handler at 0x1080161b8>',
     'celery.chain': '<function task_message_handler at 0x107ff7c80>',
     'celery.chord': '<function task_message_handler at 0x10800fb90>',
     'celery.chord_unlock': '<function task_message_handler at 0x10800fd70>',
     'celery.chunks': '<function task_message_handler at 0x10800fe60>',
     'celery.group': '<function task_message_handler at 0x1080160c8>',
     'celery.map': '<function task_message_handler at 0x1080162a8>',
     'celery.starmap': '<function task_message_handler at 0x108016488>',
     'events.tasks.add_event': '<function task_message_handler at 0x10800ff50>',
     'events.tasks.reindex_event': '<function task_message_handler at 0x10800fc80>',
     'raven.contrib.celery.send_raw': '<function task_message_handler at 0x108016398>'}

Exception
Type:       KeyError
Value:      'sentry.tasks.store.store_event'
Location:   celery/worker/consumer.py, on_task_received:411
Additional Data
asctime        '2012-09-16 18:34:14,971'
filename       'consumer.py'
funcName       'handle_unknown_task'
lineno         540
message        "\x1b[1;31mReceived unregistered task of type 'sentry.tasks.store.store_event'.\nThe message has been ignored and discarded.\n\nDid you remember to import the module containing this task?\nOr maybe you are using relative imports?\nPlease see http://bit.ly/gLye1c for more information.\n\nThe full contents of the message body was:\n{'retries': 0, 'task': 'sentry.tasks.store.store_event', 'eta': None, 'args': []..."
module         'consumer'
pathname       '/Library/Python/2.7/site-packages/celery/worker/consumer.py'
process        88061
processName    'MainProcess'
thread         140735164033376
threadName     'MainThread'

Package Versions
debug_toolbar   0.9.4
django  1.4.1
django_jenkins  0.12.1
djangoratings   0.3.6
djcelery    3.0.9
gunicorn    0.14.6
messages    0.4.4
notification    0.2
raven   2.0.6
registration    0.8
reversion   1.6.3
social_auth     0.7.5
@dcramer
Sentry member
@abhinavsingh

@dcramer No. Apart from these 5 lines related to the sentry queue configuration, nothing else has been changed inside sentry.conf.py.

To add more info, here is how celeryd for sentry gets started:

$ sentry --config=deps/sentry/sentry.conf.py celeryd --loglevel=debug

 -------------- celery@abhinavsingh.local v3.0.9 (Chiastic Slide)
---- **** ----- 
--- * ***  * -- [Configuration]
-- * - **** --- . broker:      amqp://guest@localhost:5672//
- ** ---------- . app:         default:0x106819b10 (djcelery.loaders.DjangoLoader)
- ** ---------- . concurrency: 4 (processes)
- ** ---------- . events:      OFF (enable -E to monitor this worker)
- ** ---------- 
- *** --- * --- [Queues]
-- ******* ---- . celery:      exchange:celery(direct) binding:celery
--- ***** ----- 

[Tasks]
  . celery.backend_cleanup
  . celery.chain
  . celery.chord
  . celery.chord_unlock
  . celery.chunks
  . celery.group
  . celery.map
  . celery.starmap
  . raven.contrib.celery.send_raw
  . sentry.tasks.cleanup.cleanup
  . sentry.tasks.index.index_event
  . sentry.tasks.post_process.post_process_group
  . sentry.tasks.process_buffer.process_incr
  . sentry.tasks.store.store_event

Also, not sure if this is related, but here are some warning messages from billiard/forking.py that I see when I start celeryd for my django application tasks:

$ ./manage.py celeryd worker --loglevel=debug
/Library/Python/2.7/site-packages/djcelery/loaders.py:124: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
  warnings.warn("Using settings.DEBUG leads to a memory leak, never "

 -------------- celery@abhinavsingh.local v3.0.9 (Chiastic Slide)
---- **** ----- 
--- * ***  * -- [Configuration]
-- * - **** --- . broker:      amqp://guest@localhost:5672//
- ** ---------- . app:         default:0x106649990 (djcelery.loaders.DjangoLoader)
- ** ---------- . concurrency: 4 (processes)
- ** ---------- . events:      OFF (enable -E to monitor this worker)
- ** ---------- 
- *** --- * --- [Queues]
-- ******* ---- . celery:      exchange:celery(direct) binding:celery
--- ***** ----- 

[Tasks]
  . celery.backend_cleanup
  . celery.chain
  . celery.chord
  . celery.chord_unlock
  . celery.chunks
  . celery.group
  . celery.map
  . celery.starmap
  . ...
  . ...
  . ...
  . raven.contrib.celery.send_raw

[2012-09-16 18:52:28,310: INFO/MainProcess] Configuring Raven for host: http://0de4f34df4dc4b1a58013f63d10875c3:cb12b0c371b34f63b8d943605308684a@logger.local.dev:/2
[2012-09-16 18:52:28,311: DEBUG/MainProcess] [Worker] Loading modules.
[2012-09-16 18:52:28,333: DEBUG/MainProcess] [Worker] Claiming components.
[2012-09-16 18:52:28,333: DEBUG/MainProcess] [Worker] Building boot step graph.
[2012-09-16 18:52:28,334: DEBUG/MainProcess] [Worker] New boot order: {ev, queues, beat, pool, mediator, autoreloader, timers, state-db, autoscaler, consumer}
[2012-09-16 18:52:28,384: DEBUG/MainProcess] Starting celery.worker.hub.Hub...
[2012-09-16 18:52:28,384: DEBUG/MainProcess] celery.worker.hub.Hub OK!
[2012-09-16 18:52:28,385: DEBUG/MainProcess] Starting celery.concurrency.processes.TaskPool...
[2012-09-16 18:52:28,441: WARNING/MainProcess] /Library/Python/2.7/site-packages/billiard/forking.py:455: UserWarning: Will add directory '/Users/abhinavsingh/git/local.dev' to path! This is necessary to accommodate pre-Django 1.4 layouts using setup_environ.
You can skip this warning by adding a DJANGO_SETTINGS_MODULE=settings environment variable.

  W_OLD_DJANGO_LAYOUT % os.path.realpath(project_dir)
@abhinavsingh

@dcramer Can I expect a resolution for this? Is there anything I can do at my end to help you out, or any more info you need from me? I will be happy to help.

@dcramer
Sentry member
dcramer commented Sep 23, 2012

I actually wasn't able to reproduce this. We're using Celery 3.0.9 both on getsentry.com and on disqus and haven't seen any issues like this.

@abhinavsingh

@dcramer I did a fresh setup and now I have a combination of sentry==5.0.13 and celery==3.0.10 running. I am still facing the same issue, Received unregistered task of type 'sentry.tasks.store.store_event', along with No module named sentry, as also raised in issue #612.

Can you kindly let me know the exact versions of sentry and celery that you are using at getsentry.com and disqus? Things are fine with SENTRY_USE_QUEUE = False, but in production I would definitely like to have queues enabled.

@dcramer
Sentry member
dcramer commented Sep 24, 2012

Here are the pinned requirements for getsentry.com's install (keep in mind some might be for plugins).

(We saw issues with celery 3.0.10 and are still pinned at 3.0.9.)


BeautifulSoup==3.2.1
Django==1.4.1
South==0.7.6
amqplib==1.0.2
anyjson==0.3.3
billiard==2.7.3.12
celery==3.0.9
celerymon==0.5.0
cssutils==0.9.10b1
django-bcrypt==0.9.2
django-celery==3.0.9
django-crispy-forms==1.1.4
django-hosts==0.4.1
django-indexer==0.3.0
django-jsonfield==0.6
django-paging==0.2.4
django-picklefield==0.2.0
django-social-auth==0.7.2
django-social-auth-trello==1.0.2
django-templatetag-sugar==0.1
dnspython==1.10.0
eventlet==0.9.16
gevent==0.13.8
greenlet==0.4.0
gunicorn==0.14.6
hiredis==0.1.1
httpagentparser==1.0.5
httplib2==0.7.4
kombu==2.4.6
lockfile==0.9.1
logan==0.5.0
newrelic==1.2.1.265
nexus==0.2.3
nydus==0.8.2
oauth2==1.5.211
path.py==2.2.2
phabricator==0.2.0
psycopg2==2.4.1
py-bcrypt==0.2
pynliner==0.4.0
python-dateutil==1.5
python-memcached==1.48
python-openid==2.2.5
pyvotal==0.2.1
pytz==2012d
raven==2.0.6
redis==2.4.11
requests==0.13.9
sentry==5.0.13
sentry-github==0.1.1
sentry-groveio==0.2.0
sentry-jira==0.6.1
sentry-pivotal==0.1.0
sentry-phabricator==0.6.2
sentry-sprintly==0.1.3
sentry-trello==1.1.4
sentry-webhooks==0.1.0
simplejson==2.3.0
stripe==1.6.1
tornado==2.4
wsgiref==0.1.2
@abhinavsingh

@dcramer To update this issue: it has been solved. Here is what was causing the trouble:

I am using celery queues both for my django virtual env and my sentry virtual env. Both queues were using RabbitMQ. By default, both celery instances were attaching to the same default queue/exchange/routing key name, which sent the whole system into an infinite loop: each worker kept receiving tasks registered only in the other worker, logged them as unregistered tasks, and those error logs were in turn queued as new sentry events. A simple change of the default queue names did the trick (see the sketch below).
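For anyone else who hits this, here is a minimal sketch of the kind of change that resolved it for me; the queue names below are illustrative, not necessarily the exact values I used. The point is simply that the django worker and the sentry worker must not share the same default queue/exchange/routing key on the broker:

# In the django project's settings.py (application worker):
CELERY_DEFAULT_QUEUE = 'myapp'           # illustrative name; anything distinct works
CELERY_DEFAULT_EXCHANGE = 'myapp'
CELERY_DEFAULT_ROUTING_KEY = 'myapp'

# In sentry.conf.py (sentry worker):
CELERY_DEFAULT_QUEUE = 'sentry'          # a different name from the application queue
CELERY_DEFAULT_EXCHANGE = 'sentry'
CELERY_DEFAULT_ROUTING_KEY = 'sentry'

With distinct defaults, each celeryd binds to and consumes from only its own queue, so sentry tasks never reach the application worker and vice versa.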

Thanks for the support. It's fixed, closing this bug.
