
Setting spec.syncPolicy hits segfault for ApplicationSets when no spec.template.spec.syncPolicy is set #12424

Closed
neurodrone opened this issue Feb 12, 2023 · 0 comments · Fixed by #12425
Labels
bug Something isn't working component:applications-set Bulk application management related

Comments

@neurodrone (Contributor)

Checklist:

  • I've searched in the docs and FAQ for my answer: https://bit.ly/argocd-faq.
  • I've included steps to reproduce the bug.
  • I've pasted the output of argocd version.

Describe the bug

Running argocd appset get <name> hits a segmentation fault (nil pointer dereference) when spec.syncPolicy is set but no spec.template.spec.syncPolicy is set.

This is only a problem on the wide output format and doesn't occur when output is printed in JSON or YAML format.

To Reproduce

  1. Create an ApplicationSet with spec.syncPolicy set. It could be set as follows:
    spec:
      syncPolicy:
        preserveResourcesOnDeletion: true
    
  2. Do NOT set any syncPolicy within spec.template.spec.
  3. Create this ApplicationSet, for example via argocd appset create <appset>.
  4. Run argocd appset get <appset>.
  5. The command prints partial output and then panics.
    Name:               external-secrets-appset
    Project:            default
    Server:             https://kubernetes.default.svc
    Namespace:          external-secrets
    Repo:               https://charts.external-secrets.io
    Target:             v0.5.7
    Path:               
    panic: runtime error: invalid memory address or nil pointer dereference
    [signal SIGSEGV: segmentation violation code=0x1 addr=0x0 pc=0x389e2cb]
    

An example ApplicationSet I used here is as follows:

apiVersion: argoproj.io/v1alpha1
kind: ApplicationSet
metadata:
  name: external-secrets-appset
  namespace: argocd
spec:
  syncPolicy:
    preserveResourcesOnDeletion: true
  generators:
  - list:
      elements:
      - name: test
  template:
    metadata:
      name: '{{name}}-eso'
      labels:
        stork8s.do.co/app-name: external-secrets-operator
    spec:
      project: default
      destination:
        server: https://kubernetes.default.svc
        namespace: external-secrets
      source:
        repoURL: https://charts.external-secrets.io
        targetRevision: v0.5.7
        chart: external-secrets
        helm:
          values: |
            installCRDs: true
        syncOptions:
          - CreateNamespace=true

Note that spec.template.spec.syncOptions were set but spec.template.spec.syncPolicy was not set. This issue does not occur if spec.template.spec.syncPolicy is set.
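The panic pattern can be illustrated with a small Go sketch. This is a hypothetical reconstruction, not the actual Argo CD source: the type names `SyncPolicy` and `AppSpec` and the helper `syncPolicyString` are stand-ins, assuming that `printAppSetSummaryTable` dereferences the template's `SyncPolicy` pointer without a nil check.

```go
package main

import "fmt"

// Minimal stand-ins for the relevant Argo CD types (assumed shapes).
type SyncPolicy struct {
	Automated *struct{} // non-nil when automated sync is configured
}

type AppSpec struct {
	SyncPolicy *SyncPolicy // nil when spec.template.spec.syncPolicy is unset
}

// syncPolicyString renders the SyncPolicy column defensively: a nil policy
// yields "<none>" instead of a nil pointer dereference.
func syncPolicyString(p *SyncPolicy) string {
	if p == nil || p.Automated == nil {
		return "<none>"
	}
	return "Automated"
}

func main() {
	spec := AppSpec{} // SyncPolicy left nil, as in the reproduction above
	fmt.Printf("SyncPolicy:         %s\n", syncPolicyString(spec.SyncPolicy))
}
```

Without the nil guard, formatting `spec.SyncPolicy.Automated` directly would panic exactly as shown in the stack trace below, since the unset field is a nil pointer.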

Expected behavior

The sample output should look as follows:

Name:               external-secrets-appset
Project:            default
Server:             https://kubernetes.default.svc
Namespace:          external-secrets
Repo:               https://charts.external-secrets.io
Target:             v0.5.7
Path:               
SyncPolicy:         <none>

Screenshots

The complete error is as follows:

Name:               external-secrets-appset
Project:            default
Server:             https://kubernetes.default.svc
Namespace:          external-secrets
Repo:               https://charts.external-secrets.io
Target:             v0.5.7
Path:               
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x0 pc=0x389e2cb]

goroutine 1 [running]:
github.com/argoproj/argo-cd/v2/cmd/argocd/commands.printAppSetSummaryTable(0xc000520000)
	/Users/vaibhav/code/argo-cd/cmd/argocd/commands/applicationset.go:347 +0x56b
github.com/argoproj/argo-cd/v2/cmd/argocd/commands.NewApplicationSetGetCommand.func1(0xc000ca5500?, {0xc000427d30?, 0x1, 0x1?})
	/Users/vaibhav/code/argo-cd/cmd/argocd/commands/applicationset.go:88 +0x297
github.com/spf13/cobra.(*Command).execute(0xc000ca5500, {0xc000427c30, 0x1, 0x1})
	/Users/vaibhav/code/argo-cd/vendor/github.com/spf13/cobra/command.go:920 +0x847
github.com/spf13/cobra.(*Command).ExecuteC(0xc000c60000)
	/Users/vaibhav/code/argo-cd/vendor/github.com/spf13/cobra/command.go:1044 +0x3bd
github.com/spf13/cobra.(*Command).Execute(...)
	/Users/vaibhav/code/argo-cd/vendor/github.com/spf13/cobra/command.go:968
main.main()
	/Users/vaibhav/code/argo-cd/cmd/main.go:57 +0x27a

Version

argocd: v2.5.7+e0ee345
  BuildDate: 2023-01-18T02:44:18Z
  GitCommit: e0ee3458d0921ad636c5977d96873d18590ecf1a
  GitTreeState: clean
  GoVersion: go1.18.9
  Compiler: gc
  Platform: linux/amd64
argocd-server: v2.5.7+e0ee345

I was running the above version on my server, but this affects the latest version as well.

Logs

Not applicable.

@neurodrone neurodrone added the bug Something isn't working label Feb 12, 2023
@keithchong keithchong added the component:applications-set Bulk application management related label Feb 13, 2023
neurodrone added a commit to neurodrone/argo-cd that referenced this issue Feb 16, 2023
…proj#12424)

Signed-off-by: Vaibhav Bhembre <vaibhav@digitalocean.com>
crenshaw-dev pushed a commit that referenced this issue Feb 17, 2023
…) (#12425)

Signed-off-by: Vaibhav Bhembre <vaibhav@digitalocean.com>