
local-path-provisioner: Error starting daemon: invalid empty flag helper-pod-file #7321

Closed
ledroide opened this issue Feb 24, 2021 · 1 comment · Fixed by #7323
Labels
kind/bug Categorizes issue or PR as related to a bug.

Comments

@ledroide (Contributor) commented Feb 24, 2021

After running cluster.yml, the local path storage (Rancher implementation) is in an error state.
I get exactly the same behavior on all 4 of my clusters, so it should be easy to reproduce.

$ kubectl get pod -n local-path-storage --context dev
NAME                                      READY   STATUS             RESTARTS   AGE
local-path-provisioner-66df45bfdd-m2v5v   0/1     CrashLoopBackOff   272        22h
$ kubectl logs deployment.apps/local-path-provisioner -n local-path-storage --context dev
time="2021-02-24T10:39:15Z" level=fatal msg="Error starting daemon: invalid empty flag helper-pod-file and it also does not exist at ConfigMap local-path-storage/local-path-config with err: configmaps \"local-path-config\" is forbidden: User \"system:serviceaccount:local-path-storage:local-path-provisioner-service-account\" cannot get resource \"configmaps\" in API group \"\" in the namespace \"local-path-storage\""
$ kubectl get cm,role,rolebinding,sa -n local-path-storage --context dev
NAME                          DATA   AGE
configmap/kube-root-ca.crt    1      21d
configmap/local-path-config   4      309d
NAME                                                               ROLE                                     AGE
rolebinding.rbac.authorization.k8s.io/psp:local-path-provisioner   ClusterRole/psp:local-path-provisioner   309d
NAME                                                    SECRETS   AGE
serviceaccount/default                                  1         309d
serviceaccount/local-path-provisioner-service-account   1         309d
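
The log above shows that the provisioner's ServiceAccount lacks the RBAC permission to read its ConfigMap. As a minimal sketch of a direct check, kubectl impersonation can confirm this (it should answer "no" as long as the Role/RoleBinding from the workaround below is missing):

$ kubectl auth can-i get configmaps \
    --as=system:serviceaccount:local-path-storage:local-path-provisioner-service-account \
    -n local-path-storage --context dev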

My variable settings in the inventory:

local_path_provisioner_enabled: true
local_path_provisioner_is_default_storageclass: "false"
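
For reference, these are Kubespray addon variables; in a typical inventory they sit in the cluster group_vars, for example (the path below is an assumption and may differ between Kubespray versions):

# assumed path, adjust to your inventory layout
# inventory/mycluster/group_vars/k8s_cluster/addons.yml
local_path_provisioner_enabled: true
local_path_provisioner_is_default_storageclass: "false"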

Workaround

  • create a file role-local-path.yaml
---
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: local-path-provisioner-workaround
  namespace: local-path-storage
rules:
- apiGroups:
    - ''
  resources:
    - configmaps
  verbs:
    - get
    - list
    - watch
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: local-path-provisioner-workaround
  namespace: local-path-storage
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: Role
  name: local-path-provisioner-workaround
subjects:
- kind: ServiceAccount
  name: local-path-provisioner-service-account
  • apply to all clusters
$ kubectl apply -f role-local-path.yaml -n local-path-storage --context dev
  • restart the deployment
$ kubectl rollout restart deployment.apps/local-path-provisioner -n local-path-storage --context dev
$ kubectl get pod -n local-path-storage --context dev
NAME                                          READY   STATUS        RESTARTS   AGE
pod/local-path-provisioner-7fdcf54b7d-6vwbd   0/1     Terminating   38         172m
pod/local-path-provisioner-855db6667b-lnqlz   1/1     Running       0          3s
  • repeat apply+restart on all clusters
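
Since the same fix has to be repeated on every cluster, here is a minimal sketch of the apply + restart loop (the context names other than dev are hypothetical placeholders):

$ for ctx in dev staging prod; do
    kubectl apply -f role-local-path.yaml -n local-path-storage --context "$ctx"
    kubectl rollout restart deployment.apps/local-path-provisioner -n local-path-storage --context "$ctx"
  done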

Additional info

$ git rev-parse HEAD
125148e7a5d75d7f61f8f2f93210ee0a2cddd1b4

$ kubectl get deploy -o wide
NAME                     READY   UP-TO-DATE   AVAILABLE   AGE    CONTAINERS               IMAGES                                             SELECTOR
local-path-provisioner   1/1     1            1           309d   local-path-provisioner   docker.io/rancher/local-path-provisioner:v0.0.19   app=local-path-provisioner

$ kubectl get node --context dev
NAME               STATUS   ROLES                  AGE    VERSION
frd3kq-k8s01g      Ready    worker                 308d   v1.20.4
frd3kq-k8s02g      Ready    worker                 294d   v1.20.4
frd3kq-k8s03       Ready    worker                 309d   v1.20.4
frd3kq-k8s04       Ready    worker                 309d   v1.20.4
kube-dev-master1   Ready    control-plane,master   309d   v1.20.4
kube-dev-master2   Ready    control-plane,master   309d   v1.20.4
kube-dev-master3   Ready    control-plane,master   309d   v1.20.4
@ledroide added the kind/bug label Feb 24, 2021
@floryut (Member) commented Feb 25, 2021

@ledroide I've submitted a PR to fix this, thank you
