Misplaced or mismatched log files #38590
garywhiteford asked this question in Q&A
Edit: If you don't have any ideas for an answer to this one, check out my other question over at #38728.
I am running Airflow in Kubernetes with a worker (Celery executor) on a host outside Kubernetes. I hit an issue when running a particular task: the task was queued but never ran, and the worker log showed that Redis had disconnected.
To resolve this, I restarted the worker on the remote host and restarted the broker (Redis) running in a Kubernetes pod. The task was then able to run.
However, when I now try to view the task in the UI, it presents a log file name and contents that do not exist on my worker host.
The UI presents this for the task log:
However, no such file ("attempt=1.log.SchedulerJob.log") exists on the worker host.
Interestingly enough, there is a file with that name in the logs folder of the webserver pod.
There is one file ("attempt=1.log") in the logs folder on the worker host, located at…
[basepath]/logs/dag_id=[my_dag]/run_id=scheduled__2024-03-27T11:00:00+00:00/task_id=[task2]/attempt=1.log
… and containing …
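For context, the worker-side path above has the shape produced by Airflow 2.x's default log_filename_template. A minimal sketch of how such a path is composed, using plain string formatting rather than Airflow's actual Jinja rendering, with all values as illustrative placeholders (not taken from my setup):

```python
# Hedged sketch: approximate how Airflow 2.x lays out per-attempt task logs.
# All values below are illustrative placeholders, not real configuration.
base_log_folder = "/opt/airflow/logs"  # stand-in for [basepath]/logs
dag_id = "my_dag"                      # stand-in for [my_dag]
run_id = "scheduled__2024-03-27T11:00:00+00:00"
task_id = "task2"                      # stand-in for [task2]
try_number = 1

log_path = (
    f"{base_log_folder}/dag_id={dag_id}/run_id={run_id}/"
    f"task_id={task_id}/attempt={try_number}.log"
)
print(log_path)
```

If the worker and webserver render this template against different base folders (or different hosts' filesystems), each component ends up looking for the log in its own local path, which may be part of what I am seeing here.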
Did I do something wrong in how I solved the original issue?
How do I go about troubleshooting and fixing the current issue of "the missing (duplicate) log file"?
Side note: The timestamps say "UTC," but the times shown are actually US Central time (the local time on the servers).
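To illustrate the size of that discrepancy: if the servers run on US Central time but the log timestamps are labeled UTC, the displayed times are off by five or six hours depending on daylight saving. A quick check with Python's zoneinfo, assuming America/Chicago is the relevant Central zone:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# The run_id's logical time, interpreted as genuine UTC.
utc = datetime(2024, 3, 27, 11, 0, tzinfo=ZoneInfo("UTC"))

# Converted to US Central (CDT on this date, i.e. UTC-5).
central = utc.astimezone(ZoneInfo("America/Chicago"))
print(central.isoformat())  # 2024-03-27T06:00:00-05:00
```

So a timestamp written in Central time but labeled "UTC" would appear five hours earlier than the true UTC instant on this date.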