
ValueError: not enough values to unpack (expected 3, got 0) #4178

Closed
LoginQin opened this issue Aug 2, 2017 · 11 comments

Comments

LoginQin commented Aug 2, 2017

Checklist

  • I have included the output of celery -A proj report in the issue. (If you are not able to do this, then at least specify the Celery version affected.)
  • I have verified that the issue exists against the master branch of Celery.

System info

software -> celery:4.1.0 (latentcall) kombu:4.1.0 py:3.6.2
            billiard:3.5.0.3 py-amqp:2.2.1
platform -> system:Windows arch:32bit, WindowsPE imp:CPython
loader   -> celery.loaders.app.AppLoader
settings -> transport:pyamqp results:redis

broker_url: 'amqp://guest:@..*.:5672//'
result_backend: 'redis://:@.../0'

I also tried SQLite as the backend and removing the backend entirely, but neither helped.

Steps to reproduce

I followed the tutorial on the official website to run tasks.py on Windows 10, but I get an error.
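
Roughly, the task module (renamed to mytask.py, see below) follows the minimal example from the tutorial. This is a sketch; the broker/backend URLs here are placeholders, not my real ones:

# mytask.py -- minimal module along the lines of the tutorial example
from celery import Celery

# placeholder URLs; the real broker/backend are redacted in the report above
app = Celery('mytask',
             broker='amqp://guest@localhost//',
             backend='redis://localhost/0')

@app.task
def add(x, y):
    return x + y

@app.task
def hello():
    return 'hello world'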

Expected behavior

The task should run and the worker should report success.

Actual behavior

I renamed the file tasks.py to mytask.py, so I run:

celery -A mytask worker -l info

It connects OK.

But when I try to run the add.delay(2, 2) test example, I get the error below:

[2017-08-02 19:59:04,777: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)',)
Traceback (most recent call last):
File "d:\python\python36-32\lib\site-packages\billiard\pool.py", line 358, in workloop
result = (True, prepare_result(fun(*args, **kwargs)))
File "d:\python\python36-32\lib\site-packages\celery\app\trace.py", line 525, in _fast_trace_task
tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
[2017-08-02 20:04:30,870: INFO/MainProcess] Received task: mytask.hello[ec84d3ba-98ac-44bc-be5e-09190c2712e0]
[2017-08-02 20:04:30,873: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)',)
Traceback (most recent call last):
File "d:\python\python36-32\lib\site-packages\billiard\pool.py", line 358, in workloop
result = (True, prepare_result(fun(*args, **kwargs)))
File "d:\python\python36-32\lib\site-packages\celery\app\trace.py", line 525, in _fast_trace_task
tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)

My Solution

Uninstall Celery 4.1.0 and replace it with 3.1.24:

pip uninstall celery
pip install celery==3.1.24

Then it works fine for me! Everything is OK. I think this information is useful to you.

This version report:

software -> celery:3.1.24 (Cipater) kombu:3.0.37 py:3.6.2
billiard:3.3.0.23 py-amqp:1.4.9
platform -> system:Windows (Win10) arch:32bit, WindowsPE imp:CPython
loader -> celery.loaders.app.AppLoader
settings -> transport:pyamqp results:redis://:**@****/0

Why 3.1.24?

It was just a guess; I simply looked for a version lower than 4.

davidt99 commented Aug 7, 2017

Not sure if related, but Windows is no longer supported in Celery 4.

twschiller commented Aug 8, 2017

Also seeing this on Windows in the following environment (it works fine on my Mac):

  • Celery 4.1.0
  • Azure App Service 64bit
  • Python 3.6.1 (via Python extension)
  • Azure Redis for broker & backend

thedrow (Member) commented Aug 8, 2017

Windows support is provided on a best-effort basis. We do not have an active maintainer who is interested in providing support for Windows.
The unit tests pass, so there must be a more complex issue here.
I don't have access to a Windows machine. If any of you can debug this and submit a PR, that would be lovely.

thedrow (Member) commented Aug 8, 2017

Actually, this is a duplicate of #4081. There's a fix for it in #4078 that is pending test coverage.
If any of you wants to help resolve this issue, we need an integration test that proves the fix works as expected.

tpaljor commented Nov 6, 2017

I found a workaround:

celery -A your_app_name worker --pool=solo -l info

@fohrloop

While solo-pooling works, it is a single-threaded execution pool, which means that there is no concurrency at all.

Another working solution is to use eventlet (pip install eventlet, then celery -A your_app_name worker --pool=eventlet). This way it is possible to have parallel-running tasks on Windows.
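
For example, a minimal sketch (assuming the add task from the original report) that fans out several tasks; with --pool=eventlet they can run concurrently on Windows:

from celery import group
from mytask import add  # the task module from the original report

# dispatch ten add() calls at once; the eventlet pool runs them concurrently
job = group(add.s(i, i) for i in range(10))
result = job.apply_async()
print(result.get(timeout=30))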

@sergei-maertens

Confirming the solution by @np-8 on Windows Server 2012 R2, with Python 3.5. We also had to bump billiard to the latest patch version to fix a pickle error, though.

Vichoko commented Mar 1, 2018

@np-8's solution worked for me too.
Specifications:

  • Windows 10
  • Celery 4.1
  • RabbitMQ 3.7.4
  • Python 3.1
  • Django 2.0

It doesn't have parallelism, but it at least works for testing, as I'm developing on Windows (the app will run on a Linux platform in production).

@sergei-maertens

Eventually we ditched this solution and are not running Celery on Windows anymore. Sometimes, when saving files in Django (models.FileField), something in eventlet would raise an error (sorry, I don't have the exact error at hand), but it seems to be in the os.py monkey patching.

Another thing is that starting up Celery takes ages (10-15 minutes), and during this startup time no tasks seem to be picked up.

rajivfx commented Mar 10, 2018

pip install eventlet
celery -A your_app_name worker -l info -P eventlet

This works on Windows 10 + Celery 4.1 + Python 3.

CoCo269 commented Apr 2, 2018

#4078 may help.

On the worker side, set the environment variable (note: no spaces around the equals sign):
set FORKED_BY_MULTIPROCESSING=1
then
celery -A myworker worker --loglevel=info
done!
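
A sketch of an alternative: the same variable has been reported to work when set from Python in the module that creates the app, before the Celery instance is defined (the module name below is just the one used earlier in this thread):

# mytask.py: set the flag before creating the Celery app
import os
os.environ.setdefault('FORKED_BY_MULTIPROCESSING', '1')

from celery import Celery
app = Celery('mytask', broker='amqp://guest@localhost//')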

celery locked as resolved and limited conversation to collaborators Apr 3, 2018