
Concern - ArgoCD application name conflicts #63

Closed · tumido opened this issue Mar 25, 2021 · 6 comments · Labels: lifecycle/rotten

tumido commented Mar 25, 2021

Aggregating many applications across multiple repositories and deploying them to multiple clusters increases the risk of two applications ending up with the same name. At the same time, the nature of the OCP platform itself limits us to unique application names, because all the Application resources land in the same namespace.

The situation is even more unfortunate when different app-of-apps try to sync different application specs under the same name: ArgoCD ends up with two competing apps syncing "the same" resource.

The likelihood of this happening grows with the number of clusters and teams onboarded.

Proposed solutions

Use namePrefix or nameSuffix in kustomization.yaml for different sections of argocd-apps

Similar to this PR: operate-first/argocd-apps#101 (see the sketch after this list)

- Works only for well-behaved apps, since the prefix is part of the manifests themselves
- Doesn't work for conflicts between app-of-apps: two different app-of-apps can still apply a resource with the same name
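
A minimal sketch of this approach; the `team-a-` prefix and the resource file names are illustrative, not taken from the repo:

```yaml
# kustomization.yaml for one section of argocd-apps
# (illustrative: the prefix and resource files are assumptions)
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization

# Prepend a section identifier to every rendered resource name, so two
# sections can each define an Application named "dashboard" without
# colliding in the shared namespace.
namePrefix: team-a-

resources:
  - dashboard-app.yaml
  - metrics-app.yaml
```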

Use Application resource parameters (see the sketch after the docs link below)

- Works exceptionally well for app-of-apps, since it operates at the app-of-apps resource spec level: it always makes all applications deployed via an app-of-apps follow the naming scheme
- Independent of the manifests deployed by the app-of-apps

https://argoproj.github.io/argo-cd/user-guide/kustomize/#kustomize
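
A minimal sketch of an app-of-apps Application using the kustomize parameters from the guide above; the path and prefix value are illustrative assumptions:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: team-a-apps
  namespace: argocd
spec:
  project: default
  destination:
    server: https://kubernetes.default.svc
    namespace: argocd
  source:
    # Illustrative path within the repo
    repoURL: https://github.com/operate-first/argocd-apps.git
    path: team-a
    targetRevision: HEAD
    kustomize:
      # ArgoCD applies this prefix when rendering, regardless of what
      # the manifests in the repo declare.
      namePrefix: team-a-
```

Because the prefix lives in the Application spec rather than in the repo, even a misbehaving child manifest cannot opt out of the naming scheme.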

The result of the discussion on this issue will be submitted as an ADR.

sesheta commented Oct 12, 2021

Issues go stale after 90d of inactivity.
Mark the issue as fresh with /remove-lifecycle stale.
Stale issues rot after an additional 30d of inactivity and eventually close.

If this issue is safe to close now please do so with /close.

/lifecycle stale

sesheta added the lifecycle/stale label Oct 12, 2021
tumido commented Nov 9, 2021

/remove-lifecycle stale

sesheta removed the lifecycle/stale label Nov 9, 2021
sesheta commented Feb 7, 2022

Issues go stale after 90d of inactivity.
Mark the issue as fresh with /remove-lifecycle stale.
Stale issues rot after an additional 30d of inactivity and eventually close.

If this issue is safe to close now please do so with /close.

/lifecycle stale

sesheta added the lifecycle/stale label Feb 7, 2022
sesheta commented Mar 9, 2022

Stale issues rot after 30d of inactivity.
Mark the issue as fresh with /remove-lifecycle rotten.
Rotten issues close after an additional 30d of inactivity.

If this issue is safe to close now please do so with /close.

/lifecycle rotten

sesheta added the lifecycle/rotten label and removed the lifecycle/stale label Mar 9, 2022
sesheta commented Apr 8, 2022

Rotten issues close after 30d of inactivity.
Reopen the issue with /reopen.
Mark the issue as fresh with /remove-lifecycle rotten.

/close

sesheta closed this as completed Apr 8, 2022
sesheta commented Apr 8, 2022

@sesheta: Closing this issue.

In response to this:

> Rotten issues close after 30d of inactivity.
> Reopen the issue with /reopen.
> Mark the issue as fresh with /remove-lifecycle rotten.
>
> /close

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
