
Operator pod doesn't have all RBAC perms after installation #3645

Closed
idanl21 opened this issue May 9, 2023 · 7 comments

idanl21 commented May 9, 2023

After following your installation guide using Kustomize, I noticed that the operator pod has a lot of RBAC errors, for example: `W0509 13:36:30.140221 1 reflector.go:324] k8s.io/client-go@v0.24.2/tools/cache/reflector.go:167: failed to list *v1.ConfigMap: configmaps is forbidden: User "system:serviceaccount:postgres-operator:pgo" cannot list resource "configmaps" in API group "" at the cluster scope`
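
As a quick way to confirm the missing permission, the denied verb can be checked directly against the `pgo` ServiceAccount named in the error above; a minimal sketch, assuming you have sufficient rights to run impersonated access checks:

```sh
# Check whether the operator's ServiceAccount may list ConfigMaps
# cluster-wide, matching the "forbidden" error quoted above.
kubectl auth can-i list configmaps \
  --as=system:serviceaccount:postgres-operator:pgo \
  --all-namespaces
```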

The logs from the installation:

```
customresourcedefinition.apiextensions.k8s.io/pgupgrades.postgres-operator.crunchydata.com serverside-applied
customresourcedefinition.apiextensions.k8s.io/postgresclusters.postgres-operator.crunchydata.com serverside-applied
serviceaccount/pgo serverside-applied
serviceaccount/postgres-operator-upgrade serverside-applied
clusterrole.rbac.authorization.k8s.io/postgres-operator-upgrade serverside-applied
clusterrolebinding.rbac.authorization.k8s.io/postgres-operator-upgrade serverside-applied
deployment.apps/pgo serverside-applied
deployment.apps/pgo-upgrade serverside-applied
Apply failed with 2 conflicts: conflicts with "helm" using rbac.authorization.k8s.io/v1:
- .rules
- .metadata.labels.app.kubernetes.io/name
Please review the fields above--they currently have other managers. Here
are the ways you can resolve this warning:
* If you intend to manage all of these fields, please re-run the apply
  command with the --force-conflicts flag.
* If you do not intend to manage all of the fields, please edit your
  manifest to remove references to the fields that should keep their
  current managers.
* You may co-own fields by updating your manifest to match the existing
  value; in this case, you'll become the manager if the other manager(s)
  stop managing the field (remove it from their configuration).
See https://kubernetes.io/docs/reference/using-api/server-side-apply/#conflicts
Apply failed with 1 conflict: conflict with "helm" using rbac.authorization.k8s.io/v1: .metadata.labels.app.kubernetes.io/name
Please review the fields above--they currently have other managers. Here
are the ways you can resolve this warning:
* If you intend to manage all of these fields, please re-run the apply
  command with the --force-conflicts flag.
* If you do not intend to manage all of the fields, please edit your
  manifest to remove references to the fields that should keep their
  current managers.
* You may co-own fields by updating your manifest to match the existing
  value; in this case, you'll become the manager if the other manager(s)
  stop managing the field (remove it from their configuration).
See https://kubernetes.io/docs/reference/using-api/server-side-apply/#conflicts
```

Are you familiar with this issue?
Thanks!
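
For reference, the apply output above describes a server-side apply field-ownership conflict: the `helm` field manager currently owns `.rules` and the `app.kubernetes.io/name` label. If the intent is to let the Kustomize install take over those fields, the re-run suggested in the message looks roughly like this; a sketch only, and the `-k` path is a placeholder for whichever kustomize directory the installation guide uses:

```sh
# Re-run the server-side apply and take ownership of the conflicting
# fields, as suggested by the conflict message above.
# The -k path is a placeholder; use the directory from your install guide.
kubectl apply --server-side --force-conflicts -k kustomize/install/default
```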

idanl21 (Author) commented May 10, 2023

Hey @tony-landreth :)
Any new information on this?

ValClarkson (Contributor) commented

Hi @idanl21,
Thanks for reporting this issue. I've created a ticket.

idanl21 (Author) commented May 16, 2023

Hey @ValClarkson,
Is this a known issue?

andrewlecuyer (Collaborator) commented

@idanl21 it appears that the Kustomize install is conflicting with another, prior install, specifically a Helm install.

Did you previously install the operator using Helm?
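
One way to check this is to look for a leftover Helm release and to inspect which field managers currently own the conflicting RBAC object; a minimal sketch (the ClusterRole name below is an assumption, so substitute whatever object your conflict message actually refers to):

```sh
# List Helm releases in all namespaces; a prior operator install via
# Helm would show up here.
helm list --all-namespaces

# Show which field managers own the fields of the ClusterRole that the
# conflict message complains about ("postgres-operator" is assumed).
kubectl get clusterrole postgres-operator \
  --show-managed-fields -o yaml
```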

benjaminjb (Contributor) commented

Hello @idanl21, I hope the above comments have been helpful. I'm going to close this issue, but if you're still experiencing this problem, please reopen so we can continue working on it.

ksgnextuple commented

Hi, we are facing this issue. We installed using Kustomize and it was a fresh installation.

dsessler7 (Contributor) commented

Hey @ksgnextuple, since you are not the OP (and your setup might be different from theirs) and this issue has already been closed, please open a new issue with all pertinent details (your spec, k8s environment details, error logs, etc.).
