
Worker and Scheduler constantly restarting #48

Closed
ariksidney opened this issue Nov 9, 2016 · 4 comments

Comments

@ariksidney

Hi there,

We have been running Airflow inside Docker for about 1.5 months without any major issues. However, for the past few days the worker and the scheduler have kept restarting without executing any of the DAGs.
The worker log shows an unrecoverable TypeError:

[2016-11-09 14:41:04,432: CRITICAL/MainProcess] Unrecoverable error: TypeError('unorderable types: NoneType() <= int()',)
Traceback (most recent call last):
  File "/usr/local/lib/python3.4/dist-packages/celery/worker/worker.py", line 203, in start
    self.blueprint.start(self)
  File "/usr/local/lib/python3.4/dist-packages/celery/bootsteps.py", line 115, in start
    self.on_start()
  File "/usr/local/lib/python3.4/dist-packages/celery/apps/worker.py", line 143, in on_start
    self.emit_banner()
  File "/usr/local/lib/python3.4/dist-packages/celery/apps/worker.py", line 159, in emit_banner
    string(self.colored.reset(self.extra_info() or '')),
  File "/usr/local/lib/python3.4/dist-packages/celery/apps/worker.py", line 188, in extra_info
    if self.loglevel <= logging.INFO:
TypeError: unorderable types: NoneType() <= int()

I have no idea what caused this issue. Do you have an idea or a clue what could go wrong?
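For context, the failing comparison at the bottom of the traceback can be reproduced in isolation. This is an illustration of the Python 3 behavior involved, not Celery's actual code: as the traceback shows, the worker's `loglevel` ended up as `None`, and under Python 3 an ordering comparison between `None` and an `int` raises `TypeError` (Python 2 silently allowed it, which is why this only surfaced after a dependency change). The exact error message wording varies by Python version:

```python
import logging

# Stand-in for the worker's loglevel attribute, which the traceback
# shows was left unset (None).
loglevel = None

try:
    # Mirrors the comparison in celery/apps/worker.py's extra_info():
    # on Python 3 this raises TypeError when loglevel is None.
    if loglevel <= logging.INFO:
        print("verbose logging enabled")
except TypeError as exc:
    print(type(exc).__name__)  # TypeError
```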

@puckel
Owner

puckel commented Nov 10, 2016

The scheduler restarts every 5 DAG runs (i.e. the scheduler command).

Did you deploy a new DAG?

@ariksidney
Author

No, I didn't deploy a new DAG. I tried removing all the DAGs, but the error still occurs.
I also noticed that in the Graph view of a DAG the task is marked as 'no status', but in the DAG runs view the status is set to running (although it does not appear to actually be executed).

Thanks in advance

@ariksidney
Author

OK, I think I found the issue. After the last build, Airflow pulled in Celery 4.0.0, which caused the above error. I switched back to Celery 3.1.23 by changing the airflow_base Dockerfile like this:

RUN    pip install airflow[postgresql,hive,password]==$AIRFLOW_VERSION \
    && pip install celery==3.1.23
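As a general guard against this kind of silent upgrade, a small build-time check could fail fast when an incompatible Celery major version is installed. This is a hypothetical sketch (the helper name and the check are not part of this image), assuming only the 3.x series is known to work with this Airflow setup:

```python
def celery_major_is_supported(version: str) -> bool:
    """Return True if the given Celery version string is in the 3.x
    series, which is the series this setup is known to work with."""
    major = int(version.split(".")[0])
    return major == 3


# The pinned version passes; the version that broke the worker does not.
print(celery_major_is_supported("3.1.23"))  # True
print(celery_major_is_supported("4.0.0"))   # False
```

Running such a check in the Dockerfile (e.g. against `celery.__version__`) would make the build fail loudly instead of producing a worker that crash-loops at runtime.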

puckel added a commit that referenced this issue Nov 14, 2016
@puckel
Copy link
Owner

puckel commented Nov 14, 2016

Thanks for reporting this issue. I pinned the Celery version in release 1.7.1.3-4.
