This repository has been archived by the owner on Feb 5, 2021. It is now read-only.

Fluentd only runs on worker nodes #42

Closed
AlexB138 opened this issue Sep 16, 2017 · 6 comments

@AlexB138
Contributor

Per the README, the intention is to "run an instance of this container on each physical underlying host in the cluster". As of Kubernetes 1.6, the pods will only run on the worker nodes, due to the built-in taint on masters.

I see two ways to solve this. Which is best depends on the maintainers' intentions:

  1. Add a toleration for the built-in master taint, `node-role.kubernetes.io/master:NoSchedule`. This will cause fluentd to also schedule on the master and ship its logs.
  2. Add the global toleration `operator: "Exists"`. This would cause the fluentd pod to schedule on any node in the cluster, regardless of taints. This is a good bit more aggressive, but would match the stated intention.
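For reference, a minimal sketch of what the two options might look like in the DaemonSet's pod spec (under `spec.template.spec`), assuming the standard Kubernetes toleration syntax; field placement in the actual fluentd manifest may differ:

```yaml
# Option 1: tolerate only the built-in master taint, so fluentd also
# schedules on master nodes but still respects other taints.
tolerations:
- key: node-role.kubernetes.io/master
  operator: Exists
  effect: NoSchedule

# Option 2: tolerate every taint, so fluentd schedules on all nodes
# regardless of taints (more aggressive).
# tolerations:
# - operator: Exists
```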

Happy to do a PR to fix this if desired.

@AlexB138
Contributor Author

Alternatively, a section could just be added to the README explaining how to add the toleration to schedule onto masters or other tainted nodes.

@frankreno
Contributor

@AlexB138: considering that the plugin (prior to 1.6) ran on the nodes and master without extra config, I am inclined to modify the settings to preserve that behavior. I prefer option 1 as it is less aggressive. I still think we need to update the docs to cover scenario 2 for other tainted nodes, but by default we should collect from the master, as that is where some of the k8s system pods run. Do you agree?

@AlexB138
Contributor Author

@frankreno Yep, I agree on both counts.

I'll put together a PR ASAP.

@AlexB138
Contributor Author

AlexB138 commented Sep 20, 2017

PR is up. Sorry for the delay, @frankreno !

@frankreno
Contributor

Fixed by #43, thanks again!
