This repository has been archived by the owner on Jul 26, 2022. It is now read-only.

RFE: Restrict watching to a single namespace. #106

Closed
derrickburns opened this issue Jun 30, 2019 · 16 comments
Labels: help wanted, Stale

Comments

@derrickburns

No description provided.

@jeffpearce
Contributor

jeffpearce commented Jun 30, 2019

Or maybe from an array of namespaces

@jeffpearce
Contributor

I was just suggesting it might be useful to be able to restrict watching to more than one namespace.

@derrickburns
Author

derrickburns commented Jul 1, 2019 via email

@jeffpearce
Contributor

Hi @derrickburns - thinking some more about this, and I don't quite understand the proposal. The daemon currently only watches secrets that are explicitly added to Kubernetes. Can you help me understand which scenarios this would enable?

@derrickburns
Author

derrickburns commented Jul 1, 2019 via email

@jeffpearce
Contributor

Sorry, I should have asked that first. That sounds really useful.

@jeffpearce
Contributor

One straw person proposal would be to add an env variable to configure this. @silasbw, @satish-ravi, thoughts?
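For concreteness, a minimal sketch of how such a variable could be wired into the daemon's Deployment, assuming a hypothetical WATCH_NAMESPACE name (not an agreed-on API; an empty value could mean "watch all namespaces", and a comma-separated list could cover the multi-namespace idea above):

```yaml
# Hypothetical sketch: exposing a namespace-scoping env variable on the
# kubernetes-external-secrets Deployment. WATCH_NAMESPACE is an assumed
# name, not something the project defines today.
spec:
  template:
    spec:
      containers:
        - name: kubernetes-external-secrets
          env:
            - name: WATCH_NAMESPACE              # unset/empty = watch all namespaces
              valueFrom:
                fieldRef:
                  fieldPath: metadata.namespace  # scope to the pod's own namespace
```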

@silasbw
Contributor

silasbw commented Jul 2, 2019

We'd want to adjust the RBAC settings as well, which means changing the Helm template files too.
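For illustration, the namespace-scoped variant could be a Role roughly like the one below instead of the current ClusterRole; the rule list here is a sketch of what the daemon plausibly needs, not copied from the chart:

```yaml
# Sketch of reduced, namespace-scoped RBAC for a single watched namespace.
# Group/resource names assume the ExternalSecret CRD lives under
# kubernetes-client.io; the verbs are illustrative.
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: kubernetes-external-secrets
  namespace: team-a                  # the single namespace being watched
rules:
  - apiGroups: ["kubernetes-client.io"]
    resources: ["externalsecrets"]
    verbs: ["get", "list", "watch"]
  - apiGroups: [""]
    resources: ["secrets"]
    verbs: ["get", "create", "update"]
```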

@derrickburns
Author

ping

@silasbw added the help wanted label on Aug 5, 2019
@muenchhausen
Contributor

muenchhausen commented Aug 8, 2019

I like this idea! It could solve this use case:

Given: two teams, A and B, working in dedicated namespaces, each with access to its own set of AWS secrets.

When: Team A creates an external secret in its namespace referring to Team B's AWS secrets.

Then: the Kubernetes secret must not be created, and kubernetes-external-secrets must raise an error.

A solution idea would be:

  • run multiple kubernetes-external-secrets instances, one per team namespace
  • let each instance listen on its own namespace only
  • assign each instance dedicated AWS credentials allowing access only to that team's AWS secrets (see the values sketch after this list)
  • the necessary RBAC rights for kubernetes-external-secrets could then be reduced
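As a sketch of the per-team wiring, the values for one team's instance might look like this (envVarsFromSecret is the chart's existing mechanism for injecting credentials from a Kubernetes Secret; the secret name and key names here are assumptions):

```yaml
# Hypothetical values.yaml for Team A's kubernetes-external-secrets release.
# The secret 'team-a-aws-creds' and its keys are illustrative.
env:
  AWS_REGION: eu-central-1
envVarsFromSecret:
  AWS_ACCESS_KEY_ID:
    secretKeyRef: team-a-aws-creds   # Secret holding Team A's AWS credentials
    key: access-key-id
  AWS_SECRET_ACCESS_KEY:
    secretKeyRef: team-a-aws-creds
    key: secret-access-key
```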

@red8888
Contributor

red8888 commented Sep 13, 2019

Is this going to be implemented? It would be so amazing.

This is what I'm doing now:

  • My applications' charts have the external-secrets chart added as a requirement
  • The applications are deployed to their own namespaces
  • They use envVarsFromSecret to set a different set of AWS creds that has only the access the app needs
  • When deployed with Helm, a separate external-secrets deployment runs alongside each app, creating secrets for it with only the access the application needs

This is almost seamless, but the external secrets manager deployments have access to each other's secrets. Also, I have to prefix all my secrets with the name of the app to give them a unique name, otherwise the multiple external secrets manager deployments try to update each other's secrets!

If you're deploying your apps with Helm, being able to reference this as just another Helm dependency is a really elegant way to give your apps/deployments granular access to external secrets.
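A rough sketch of that dependency wiring, in Helm 2 style (the version and repository URL are placeholders, not pinned recommendations):

```yaml
# requirements.yaml of an application chart, pulling in
# kubernetes-external-secrets as a subchart.
dependencies:
  - name: kubernetes-external-secrets
    version: "x.y.z"                          # placeholder version
    repository: "https://example.com/charts"  # placeholder chart repo
    alias: external-secrets                   # optional, shortens value keys
```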

@JeroenRijks

When restriction of scope to a single namespace is implemented, would it make sense to conditionally use Roles instead of ClusterRoles if SCOPE_NAMESPACE is set?
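Something like the following template fragment could do it; the scopedNamespace value name is hypothetical, mirroring the SCOPE_NAMESPACE idea:

```yaml
# Hypothetical Helm template fragment: render a namespaced Role when a
# scope is configured, otherwise keep the existing ClusterRole.
apiVersion: rbac.authorization.k8s.io/v1
{{- if .Values.scopedNamespace }}
kind: Role
{{- else }}
kind: ClusterRole
{{- end }}
metadata:
  name: kubernetes-external-secrets
  {{- if .Values.scopedNamespace }}
  namespace: {{ .Values.scopedNamespace }}
  {{- end }}
rules: []  # rule list elided for brevity
```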

@JeroenRijks

Could someone with write access review/merge Flydiverny's PR (#193) to close this issue?

@github-actions

This issue is stale because it has been open 90 days with no activity. Remove the stale label or comment, or this will be closed in 30 days.

github-actions bot added the Stale label on Jan 29, 2021
@github-actions

github-actions bot commented Mar 1, 2021

This issue was closed because it has been stalled for 30 days with no activity.

github-actions bot closed this as completed on Mar 1, 2021