Celery multi 4.4.6 not working due to /var/run/celery mkdir #6199
Comments
Please check the changelog and commits.
@auvipy Still a problem on 4.4.6 and 4.4.5, even though it was reportedly fixed in 4.4.4: https://travis-ci.com/github/Andrew-Chen-Wang/test-cookiecutter-django-cache/jobs/355630502
It was a buggy feature initially; it has since been fixed.
The changelog says it was fixed in 4.4.4, right? The linked Travis job used 4.4.6, unless you're asking me to look into the master branch. Edit: the hyperlink is over "4.4.6".
@auvipy It's still a problem on 4.4.6. Another user commented on the changelog's linked PR 6 days ago saying it still didn't work on 4.4.5. In the Travis config, you can see that it doesn't work when setting the PID in line 490. I checked the PR, and it included a test asserting that the PID file defaults to /var/run/celery, which is exactly where the error occurred in the first place. Going down the rabbit hole, we land in an issue reporting that some files were being created in the wrong place, in that they shouldn't be in the working directory. Note that the user there was on 4.4.2, but 4.4.2 works correctly and has for a long time for me. I also linked a Travis job for 4.4.2 with the same configuration and repo, and it succeeded.
Could you actually run
I'm confused. You're saying #6136 didn't fix the original issue?
Hi @thedrow, thanks for commenting. No, it didn't. I managed to at least get a checkmark in Travis, but Celery won't run (this job includes `celery report`, but there's nothing special in there, unless that's all there is). The Travis builds in that link are from me testing as many configurations as possible. The one that worked was making the directories for /var/run/celery, but I've never had to do that before with 4.4.2.
It seems that when I tried to use the chown command myself, it didn't work, but using your Travis configuration made it work, so thank you for that! I'm still not sure why 4.4.3 through 4.4.6 forced us to perform these commands. The last version in which I didn't need to create new directories and chown them was 4.4.2. Perhaps it's a bug in Travis? Some kind of permission setup that Celery used to circumvent on its own?
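For reference, the workaround described above can be sketched roughly like this (the `celery` user/group is an assumption, and the `PREFIX` variable is added here purely so the sketch can be dry-run without root; on a real host it would be empty):

```shell
# Create the directories celery multi expects before starting it, and give
# the worker user ownership so it can write its pid and log files there.
PREFIX="${PREFIX:-./sandbox}"   # set PREFIX="" on a real host
mkdir -p "$PREFIX/var/run/celery" "$PREFIX/var/log/celery"
# chown only if a "celery" service user actually exists on this machine:
if id celery >/dev/null 2>&1; then
    chown -R celery:celery "$PREFIX/var/run/celery" "$PREFIX/var/log/celery"
fi
```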
We have a similar issue that emerged yesterday when upgrading to 4.4.6. The crux of the problem is that Celery was ignoring our config that specified where to put the logs. See the traceback below.
Basically, we have a configuration file that specifies a location other than the default. Our solution in the near term was to downgrade to Celery 4.4.2, as that version has the expected behavior.
@auvipy Please investigate and provide a fix.
@raiderrobert Does the problem persist? I'm not using a configuration like yours, and my Travis works fine when I include the chown command. However, I haven't tried specifying a new location, which seems to be the issue in two other issues and a PR that tried to resolve it. I also haven't tried using Celery without the chown command. I believe a new version of Celery came out, so you can try your configuration again (version 4.4.7 is the latest as of today).
I am trying to switch from worker to multi. Worker is fine, but I'm getting this same permissions issue with multi in Docker.
Can't you change the path of the PID files?
Thanks @thedrow, yes, using `--pidfile=./my.pid --logfile=./my.log` works. I'm using celery beat and structured logging to stdout/stderr for a Kubernetes target. Is it the case that you still need the pid/log files regardless?
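A full invocation with working-directory files might look like the following sketch (the node name `w1` and app module `proj` are placeholders, not from the report; the command is printed rather than executed, since running it assumes a host with Celery and that app installed):

```shell
# celery multi substitutes %n with the node name and %I with the log-file
# index, so per-node pid/log files land in the current directory
# instead of /var/run and /var/log.
PIDFILE="./%n.pid"
LOGFILE="./%n%I.log"
echo "celery multi start w1 -A proj --pidfile=$PIDFILE --logfile=$LOGFILE"
```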
The docs https://docs.celeryproject.org/en/stable/reference/celery.bin.multi.html say
So why is it trying to create them in /var/run and /var/log? |
@auvipy Which PR did you merge that caused this? |
Probably the 1561cad one.
No. It's this. |
This was changed as per #6017, as far as I can recall.
Again, I'm asking why the change was made.
Apparently, the original issue was a false positive, and the following 2-3 PRs were merged based on that false-positive report, which I failed to verify and reason about properly! @kwist-sgr @hanchau @mchataigner, can you recheck, please?
Can you please check https://docs.celeryproject.org/en/latest/userguide/daemonizing.html and let us know whether we should undo the change or update the docs?
I am running into issues with daemonizing this and specifying the log file location as well as the run PID. Any and all help is welcome here. Thanks in advance. Systemctl output of service status:
Service File
VARIABLE File
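For comparison, the overall shape of such a setup, following Celery's daemonizing guide, looks roughly like this (all paths and values are illustrative sketches, not the reporter's actual files):

```
# /etc/systemd/system/celery.service (sketch)
[Service]
Type=forking
User=celery
Group=celery
EnvironmentFile=/etc/conf.d/celery
WorkingDirectory=/opt/app
ExecStart=/bin/sh -c '${CELERY_BIN} multi start ${CELERYD_NODES} \
    -A ${CELERY_APP} --pidfile=${CELERYD_PID_FILE} \
    --logfile=${CELERYD_LOG_FILE} --loglevel=${CELERYD_LOG_LEVEL}'

# /etc/conf.d/celery (sketch) -- the pid/log locations at issue:
CELERYD_PID_FILE="/var/run/celery/%n.pid"
CELERYD_LOG_FILE="/var/log/celery/%n%I.log"
```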
I can't reproduce this issue with celery 5.1.2 (sun-harmonics) |
celery/celery#6199 Signed-off-by: David Galloway <dgallowa@redhat.com>
The problem is still present. We launch Celery via systemd. Our log path is defined this way:
This works and we see logs in this directory. However, during worker startup we see this exception in Sentry:
This is the place where
I'm having the same issue as @maxmalysh. Is there a fix for this?
So 1561cad didn't fix the issue, right?
Checklist

- I have verified that the issue exists against the `master` branch of Celery.
- I have read the relevant section in the contribution guide on reporting bugs.
- I have checked the issues list for similar or identical bug reports.
- I have checked the pull requests list for existing proposed fixes.
- I have checked the commit log to find out if the bug was already fixed in the master branch.
- I have included all related issues and possible duplicate issues in this issue (If there are none, check this box anyway).

Mandatory Debugging Information

- I have included the output of `celery -A proj report` in the issue (if you are not able to do this, then at least specify the Celery version affected).
- I have verified that the issue exists against the `master` branch of Celery.
- I have included the contents of `pip freeze` in the issue.
- I have included all the versions of all the external dependencies required to reproduce this bug.

Optional Debugging Information

- I have tried reproducing the issue on more than one Python version and/or implementation.
- I have tried reproducing the issue on more than one message broker and/or result backend.
- I have tried reproducing the issue on more than one version of the message broker and/or result backend.
- I have tried reproducing the issue with retries, ETA/Countdown & rate limits disabled.
- I have tried reproducing the issue after downgrading and/or upgrading Celery and its dependencies.
Related Issues and Possible Duplicates
Related Issues
Possible Duplicates
Environment & Settings

Celery version:
`celery report` Output:

Steps to Reproduce

Required Dependencies

Python Packages

`pip freeze` Output:

Other Dependencies

You can find them in the linked repository below.

Minimally Reproducible Test Case
Expected Behavior
The Celery task should work.
Actual Behavior
It fails. Please see my test repository's build here: https://travis-ci.com/github/Andrew-Chen-Wang/test-cookiecutter-django-cache/jobs/355630502
The job that succeeds uses Celery v4.4.2. In a package, we upgraded to 4.4.5, but it stopped working with:
PermissionError: [Errno 13] Permission denied: '/var/run/celery'
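The underlying failure is ordinary filesystem permissions, independent of Celery: an unprivileged process cannot create a directory under `/var/run`. A quick probe of a given host (a hypothetical check, not Celery's actual code):

```shell
# Emulate what celery multi's 4.4.3+ defaults attempt; TARGET can be
# pointed at another directory to compare the outcome.
TARGET="${TARGET:-/var/run/celery}"
if mkdir -p "$TARGET" 2>/dev/null; then
    STATUS="writable"
else
    STATUS="permission denied"   # surfaces as Errno 13 in Python
fi
echo "$TARGET: $STATUS"
```

Running this as a non-root user on a typical host prints `permission denied`, matching the traceback above.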