[stable/airflow] Errno 13 - Permission denied: '/opt/airflow/logs/scheduler' #23589
We are facing the same issue with Permission denied for /opt/airflow/logs/scheduler.
We are facing the same problem too.
Facing the same issue. Things were running perfectly with no EFS; the issue occurred when we started using EFS as a PV.
I have solved the issue by calling `chmod -R 777 logs/` from the root folder (outside the container). Maybe it isn't the best solution, but it works.
How did you manage to do it? I am using the helm chart, in which I added a volume at /opt/airflow/logs.
@gj-9lt you can mount a volume to the pod and create a directory at the root directory of the container.
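One common pattern for this (a sketch only; the container and names here are illustrative, and 50000 is the default uid of the airflow user in the official image) is an initContainer that fixes ownership of the mounted logs directory before the main container starts:

```yaml
# Illustrative pod spec fragment, not the chart's actual template.
initContainers:
  - name: fix-logs-permissions
    image: busybox
    # chown the mount to the airflow uid before the scheduler starts
    command: ["sh", "-c", "chown -R 50000:0 /opt/airflow/logs"]
    volumeMounts:
      - name: logs
        mountPath: /opt/airflow/logs
```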
How did you manage to solve it? I hit the same error: "PermissionError: [Errno 13] Permission denied: '/opt/airflow/logs/dag_processor_manager/dag_processor_manager.log'"
I specified the following in the
With this you need to have a PVC created. I don't think this issue should be closed yet; I still think it's a bug, because the documentation says that logs can be written to this path. Quote from the documentation:
I feel either the documentation needs to be updated to reflect that writing to this path won't work (and to use a different one), or the permissions issue should be solved.
When I tried working with a normal pod (which is created as the root user), everything works fine. But with Airflow, things are not good. I even tried giving full permissions to the directory I was working with, but no luck.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Any further update will cause the issue/pull request to no longer be considered stale. Thank you for your contributions. |
Not stale. |
If this is still an issue, please raise it on the new repo: https://github.com/airflow-helm/charts/tree/main/charts/airflow
I am running containers in Docker. A workaround for this issue is to add a named volume for the logs:

```yaml
volumes:
  airflow_logs:

services:
  webserver:
    image: ...
    volumes:
      # ...
      - airflow_logs:/opt/airflow/logs
```
To solve this issue, you can change the volume to a directory under /home/user.
I had the same error message. I fixed it by setting the SELinux label on the volume in docker-compose.yaml. I don't know if this will fix the issue of the original poster.
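For reference, Docker Compose supports the `:z` (shared) and `:Z` (private) volume options, which relabel the host directory so SELinux allows the container to write to it. A sketch (the host path and service name are illustrative):

```yaml
services:
  webserver:
    volumes:
      # :z applies a shared SELinux label so the container can write here
      - ./logs:/opt/airflow/logs:z
```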
@ralsouza is this still an issue on the new chart? (I believe I have set the uid/gid correctly.)
Hi, I had the same issue and I got it resolved by changing permissions inside the container:

```shell
docker exec -u root -ti my_airflow_container_id bash
```

In this case it's the scheduler container. After this, cd out to the outer base dir, then run:

```shell
chmod -R 777 /opt
```
I also have the same problem. I created a non-root user to run the airflow image, and here is the error:
`chmod -R 777` on the folder does not work, and the gid, uid and AIRFLOW_HOME path are set as follows:
I wonder why Airflow does not have permission to the logs folder? Does anyone have a similar issue?
@ntoxlut @nicnguyen3103 if you are having this issue on the newer versions of this chart, which are developed/available here, please raise a new issue on that repo. |
Running on my own PC for testing:

```shell
mkdir -p dags logs plugins
chown 50000:50000 dags logs plugins
```
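A related step from the official docker-compose quick start is to record your host uid in an `.env` file so the containers can match it (a sketch; `AIRFLOW_UID` is the variable the official compose file reads):

```shell
# Create the host directories Airflow will mount, then record the host uid
# so the official docker-compose file can align file ownership with it.
mkdir -p ./dags ./logs ./plugins
echo -e "AIRFLOW_UID=$(id -u)" > .env
```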
For those watching, please migrate to the newer versions of the chart on the new airflow-helm/charts repo. If you follow the instructions for "Option 1" under "How to persist logs?", you won't end up with permission errors.
Confirmed, this works for me (docker-compose).
Just wanted to ping everyone watching this. NOTE: the user-community chart is a maintained descendant of stable/airflow.
This works for me: `chmod 777 /var/run/docker.sock`
Worked for me, thanks!
Why does that work? Our permission problem is with /opt/airflow/logs.
Hi, folks - as @thesuperzapper explained, this issue is solved in the community-maintained chart available at https://github.com/airflow-helm/charts/.
Describe the bug
When mounting a persistent volume for logs, I encounter

Permission denied: '/opt/airflow/logs/scheduler'

due to a mismatch in uid:gid permissions on the logs directory. The airflow user runs as uid 50000, while /opt/airflow/logs is owned by airflow:root (uid:gid).

Version of Helm and Kubernetes:
Helm: v3.0.2
Kubernetes: v1.17.0
Which chart:
stable/airflow
What happened:
After installing and configuring Persistent Volume (AWS EFS) storage for logs, Airflow is unable to write to /opt/airflow/logs.
This is visible during the initial startup script.
What you expected to happen:
Logs should be stored in persistent volume
How to reproduce it (as minimally and precisely as possible):
In this post I share my values.yaml as well as the yaml to create the PV and PVC. Again, I'm using EFS storage on AWS.

```shell
helm pull stable/airflow --untar=true
```

Replace the values.yaml with the one in the above linked post (after creating the PV/PVC from the provided yaml), then:

```shell
helm install airflow airflow/ --values airflow/values.yaml --namespace airflow
```
Anything else we need to know:
I've tried debugging by editing the scheduler-deployment.yaml file and adding some commands in the section where airflow initdb is called. When I create a new directory it works; the uid:gid for this directory is airflow:airflow. But /opt/airflow/logs is owned by airflow:root. Also, when I mount the external volume it gets mounted as 1000:1000. I've tried changing the RUN_AS_USER config, but apparently it doesn't work if it's not 50000. So what I don't understand is how come the startup script, running as user 'airflow', is not allowed to create files in a directory whose owner is airflow? This seems like a bug.
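When debugging this kind of mismatch, comparing numeric ids rather than names helps, since a name like airflow can map to different uids on the host and inside the container. A quick sketch (the `logs` directory stands in for /opt/airflow/logs):

```shell
# Print the numeric uid:gid owning the logs directory, and the current
# user's numeric ids, so a mismatch (e.g. 1000:1000 vs 50000) is obvious.
mkdir -p logs
stat -c '%u:%g' logs
id -u
id -g
```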
I've tried to set the securityContext for the scheduler, but this didn't work.
All I got was the error:
bash: line 2: /home/airflow/airflow_env.sh: Permission denied
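For what it's worth, the usual Kubernetes approach to this class of problem is a pod-level securityContext whose `fsGroup` tells the kubelet to apply that group to mounted volumes on attach. A sketch only (the values are assumptions based on the official image's uid 50000, not this chart's actual keys):

```yaml
securityContext:
  runAsUser: 50000   # uid of the airflow user in the official image
  fsGroup: 0         # group ownership applied to mounted volumes
```

Note that some volume plugins (NFS/EFS in particular) ignore `fsGroup`, which is why chown-based workarounds appear elsewhere in this thread.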
I've also checked the airflow-scheduler deployment and I see that there is a mount for logs:
And there is a corresponding volume:
In the logs section of the helm chart I set enabled to true, and set the existing claim to
efs-claim
. I've also set some configs in the airflow.config section; there are more, but these are the ones I saw as relevant to the issue. When I exec into the airflow-scheduler I do not see any log folders created; /opt is empty.