Update Celery and Kombu dependencies #4842

Merged
3 commits merged on Mar 20, 2018
Changes from all commits:
22 changes: 11 additions & 11 deletions h/celery.py
@@ -31,9 +31,9 @@
 celery.conf.update(
     # Default to using database number 10 so we don't conflict with the session
     # store.
-    BROKER_URL=os.environ.get('CELERY_BROKER_URL',
+    broker_url=os.environ.get('CELERY_BROKER_URL',
                               os.environ.get('BROKER_URL', 'amqp://guest:guest@localhost:5672//')),
-    CELERYBEAT_SCHEDULE={
+    beat_schedule={
         'purge-deleted-annotations': {
             'task': 'h.tasks.cleanup.purge_deleted_annotations',
             'schedule': timedelta(hours=1)
@@ -55,26 +55,26 @@
             'schedule': timedelta(hours=6)
         },
     },
-    CELERY_ACCEPT_CONTENT=['json'],
+    accept_content=['json'],
     # Enable at-least-once delivery mode. This probably isn't actually what we
     # want for all of our queues, but it makes the failure-mode behaviour of
     # Celery the same as our old NSQ worker:
-    CELERY_ACKS_LATE=True,
-    CELERY_DISABLE_RATE_LIMITS=True,
-    CELERY_IGNORE_RESULT=True,
-    CELERY_IMPORTS=(
+    task_acks_late=True,
+    worker_disable_rate_limits=True,
+    task_ignore_result=True,
+    imports=(
         'h.tasks.admin',
         'h.tasks.cleanup',
         'h.tasks.indexer',
         'h.tasks.mailer',
     ),
-    CELERY_ROUTES={
+    task_routes={
         'h.tasks.indexer.add_annotation': 'indexer',
         'h.tasks.indexer.delete_annotation': 'indexer',
         'h.tasks.indexer.reindex_user_annotations': 'indexer',
     },
-    CELERY_TASK_SERIALIZER='json',
-    CELERY_QUEUES=[
+    task_serializer='json',
+    task_queues=[
         Queue('celery',
               durable=True,
               routing_key='celery',
@@ -87,7 +87,7 @@
     # Only accept one task at a time. This also probably isn't what we want
     # (especially not for, say, a search indexer task) but it makes the
     # behaviour consistent with the previous NSQ-based worker:
-    CELERYD_PREFETCH_MULTIPLIER=1,
+    worker_prefetch_multiplier=1,
 )
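
For reference, here is a minimal, self-contained sketch of what the new-style (lowercase) Celery 4 configuration above amounts to. It is an illustration only, not the full h configuration; the app name and the trimmed-down queue list and beat schedule are placeholders:

import os
from datetime import timedelta

from celery import Celery
from kombu import Queue

celery = Celery('example')  # placeholder app name, not the h application object
celery.conf.update(
    # Broker URL falls back through the same environment variables as in the PR.
    broker_url=os.environ.get(
        'CELERY_BROKER_URL',
        os.environ.get('BROKER_URL', 'amqp://guest:guest@localhost:5672//')),
    accept_content=['json'],
    task_serializer='json',
    task_acks_late=True,              # at-least-once delivery (was CELERY_ACKS_LATE)
    task_ignore_result=True,          # was CELERY_IGNORE_RESULT
    worker_disable_rate_limits=True,  # was CELERY_DISABLE_RATE_LIMITS
    worker_prefetch_multiplier=1,     # one task at a time (was CELERYD_PREFETCH_MULTIPLIER)
    task_queues=[
        Queue('celery', durable=True, routing_key='celery'),
    ],
    beat_schedule={
        'purge-deleted-annotations': {
            'task': 'h.tasks.cleanup.purge_deleted_annotations',
            'schedule': timedelta(hours=1),
        },
    },
)

Celery 4 still accepts the old uppercase setting names, but it refuses to mix old-style and new-style keys in a single configuration, which is one reason the PR renames every key in one go.
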
2 changes: 1 addition & 1 deletion h/cli/commands/devserver.py
@@ -80,7 +80,7 @@ def devserver(https, web, ws, worker, assets, beat):
     m.add_process('ws', 'gunicorn --name websocket --reload --paste conf/development-websocket.ini %s' % gunicorn_args)

     if worker:
-        m.add_process('worker', 'hypothesis --dev celery worker --autoreload -l INFO')
+        m.add_process('worker', 'hypothesis --dev celery worker -l INFO')

Contributor commented:
It looks like we don't need this, as we have other ways of reloading files automagically when running in dev mode, if I'm understanding what this feature was intended for. (I know it was deprecated, but I'm just being thorough about whether we need to replace it with something else or not, and I think the answer is no, we don't.)

robertknight (Member, Author) commented on Mar 20, 2018:
We have autoreloading of the webpage/API-serving code in h via gunicorn's --reload flag. I don't think we have any replacement mechanism for --autoreload in place here, and I'm not sure it was actually working anyway. I don't know what commonly used alternatives are but I found one suggestion to use watchdog.

I don't think it has been/will be much of an issue in practice because worker code just doesn't get changed that often.


     if beat:
         m.add_process('beat', 'hypothesis --dev celery beat')
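
Not part of this change, but following up on the watchdog suggestion in the comment above: one possible way to restore reload-on-change for the worker in development would be to wrap the worker command with watchdog's watchmedo auto-restart helper. This is an untested sketch against the same devserver() code shown in the diff (m and worker come from that function), not something the PR adds, and it assumes the watchdog package is installed in the dev environment:

    if worker:
        # Restart the worker whenever a .py file under h/ changes.
        m.add_process(
            'worker',
            "watchmedo auto-restart --directory=h --pattern='*.py' --recursive -- "
            "hypothesis --dev celery worker -l INFO")
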
4 changes: 2 additions & 2 deletions requirements.in
@@ -4,7 +4,7 @@ alembic
 backports.functools_lru_cache
 bcrypt
 bleach >= 2.0
-celery == 3.1.25  # Pin to latest 3.1.x to ensure forwards-compat with 4.x
+celery >= 4.1
 certifi
 cffi
 click
@@ -19,7 +19,7 @@ gunicorn
 itsdangerous
 jsonpointer == 1.0
 jsonschema
-kombu <3.1  # Pinned because of celery pin
+kombu
 mistune
 newrelic
 oauthlib
10 changes: 5 additions & 5 deletions requirements.txt
@@ -5,14 +5,13 @@
 # pip-compile --output-file requirements.txt requirements.in
 #
 alembic==0.9.7
-amqp==1.4.9  # via kombu
-anyjson==0.3.3  # via kombu
+amqp==2.2.2  # via kombu
 asn1crypto==0.22.0  # via cryptography
 backports.functools-lru-cache==1.2.1
 bcrypt==3.1.3
-billiard==3.3.0.23  # via celery
+billiard==3.5.0.3  # via celery
 bleach==2.1.3
-celery==3.1.25
+celery==4.1.0
 certifi==2016.2.28
 cffi==1.7.0
 chameleon==2.24  # via deform
@@ -36,7 +35,7 @@ itsdangerous==0.24
 jinja2==2.8  # via deform-jinja2, pyramid-jinja2
 jsonpointer==1.0
 jsonschema==2.5.1
-kombu==3.0.37
+kombu==4.1.0
 mako==1.0.4  # via alembic
 markupsafe==0.23  # via jinja2, mako, pyramid-jinja2
 mistune==0.7.3
@@ -74,6 +73,7 @@ translationstring==1.3  # via colander, deform, pyramid
 unidecode==0.4.19  # via python-slugify
 urllib3==1.16  # via elasticsearch
 venusian==1.0
+vine==1.1.4  # via amqp
 webencodings==0.5.1  # via html5lib
 webob==1.6.1  # via pyramid
 ws4py==0.4.2
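
As a quick sanity check after installing the updated requirements, the resolved versions can be printed at runtime. This is just a convenience snippet, not part of the PR, and it assumes it runs inside the project's virtualenv:

import amqp
import billiard
import celery
import kombu
import vine

# Expect celery 4.1.x, kombu 4.1.x, amqp 2.2.x, billiard 3.5.x and vine 1.1.x,
# matching the pins in requirements.txt above.
for pkg in (celery, kombu, amqp, billiard, vine):
    print(pkg.__name__, pkg.__version__)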