
kubelet does not create symlinks to /var/log/containers #39225

Closed
mootezbessifi opened this issue Dec 25, 2016 · 7 comments

Comments

@mootezbessifi

I am trying to set up an EFK stack on my k8s cluster using the ansible repo.

When I browse the Kibana dashboard, it shows me the following output:
[Kibana screenshot]

After some research, I found out that Fluentd is not detecting any logs.
I am running k8s 1.2.4 on the minions and 1.2.0 on the master.
What I understand so far is that kubelet creates the /var/log/containers directory and makes symlinks into it for every container running in the cluster. Fluentd then mounts the shared /var/log volume from the minion and so has access to all container logs, which it can forward to Elasticsearch.
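Under that assumption, the symlinks in /var/log/containers should each resolve to a json log file under /var/lib/docker/containers. A minimal sketch of a sanity check (the helper name `check_container_logs` is illustrative, not part of kubelet):

```shell
# check_container_logs: report any *.log symlink in the given directory
# whose target is missing. On a healthy node, every symlink kubelet
# creates under /var/log/containers should resolve to a file in
# /var/lib/docker/containers/<id>/.
check_container_logs() {
  dir="${1:-/var/log/containers}"
  for link in "$dir"/*.log; do
    [ -L "$link" ] || continue                 # skip non-symlinks / empty glob
    readlink -e "$link" >/dev/null || echo "broken: $link"
  done
}

# On a node:
#   check_container_logs /var/log/containers
```

If this prints nothing but the directory is simply empty, the problem is upstream of Fluentd: the container runtime is not producing json log files for kubelet to link to.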

In my case /var/log/containers was created, but it is empty; even /var/lib/docker/containers does not contain any log files.
I used the following controllers and services for the EFK stack setup:

es-controller.txt
es-service.txt
fluentd-es-ds.txt
kibana-controller.txt
kibana-service.txt

What am I missing or doing wrong?

@mootezbessifi
Author

@MrHohn

@mootezbessifi
Author

I changed fluentd-es.yaml as follows:
fluentd-es.txt

But when I run a pod named "gateway", I get the following error in the fluentd log:
/var/log/containers/gateway-c3cuu_default_gateway-d5966a86e7cb1519329272a0b900182be81f55524227db2f524e6e23cd75ba04.log unreadable. It is excluded and would be examined next time

@mootezbessifi
Author

Finally I found out what was causing the issue. When installing Docker from the CentOS 7 repo, there is an option (--log-driver=journald) which forces Docker to send its log output to journald. The default behavior is to write these logs to json log files instead. So the only thing I had to do was delete that option from /etc/sysconfig/docker.
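The fix described above can be sketched as a one-line sed edit; the file path and OPTIONS variable follow CentOS 7's docker package conventions, and the helper name `strip_journald` is just for illustration:

```shell
# strip_journald: remove the --log-driver=journald flag from a Docker
# sysconfig file, so Docker falls back to the default json-file driver
# and kubelet's /var/log/containers symlinks have targets again.
strip_journald() {
  sed -i 's/ *--log-driver=journald//' "$1"
}

# On the node, followed by daemon restarts:
#   strip_journald /etc/sysconfig/docker
#   systemctl restart docker && systemctl restart kubelet
```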

@nelsonfassis

Isn't there a way to make it work with journald?
@mootezbessifi thank you for the clarification btw. It's not the way I wanted, but at least I can make some of it work :)

@helletheone

I have the same problem in my OpenShift cluster. I guess the journald way is the only one supported:

"Aggregated logging is only supported using the journald driver in Docker. See Updating Fluentd’s Log Source After a Docker Log Driver Update for more information."

@nelsonfassis

@helletheone More recent versions of Docker write logs directly to journald. I changed the Docker daemon to write logs as json files, and now I'm able to use Filebeat (or Fluentd if you prefer) just fine.
If your only problem is with Docker, you should try this solution.
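The change @nelsonfassis describes can be made via /etc/docker/daemon.json on Docker versions that read that file; a hedged sketch, where the helper name and the rotation values are illustrative:

```shell
# write_json_log_config: write a daemon.json that selects the json-file
# log driver (Docker reads /etc/docker/daemon.json on startup). The
# max-size / max-file rotation options are example values.
write_json_log_config() {
  cat > "$1" <<'EOF'
{
  "log-driver": "json-file",
  "log-opts": { "max-size": "10m", "max-file": "3" }
}
EOF
}

# On the node:
#   write_json_log_config /etc/docker/daemon.json
#   systemctl restart docker
```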

@bjlvfei

bjlvfei commented Feb 18, 2020

How do I resolve this issue, or set OPTIONS='--log-driver=journald' in k8s?

Levi
Thanks!
