
[Bug] Kyverno Image policy and validation of policy on resource crashes policyreporter #107

Closed
se77enn opened this issue Feb 15, 2022 · 6 comments

Comments

@se77enn

se77enn commented Feb 15, 2022

Environment

  1. Kubectl version: Client: 1.23.3, Server: 1.23.1
  2. Minikube: 1.25.1
  3. Kyverno: 1.6.0
  4. PolicyReporter image: ghcr.io/kyverno/policy-reporter:2.0.0

Bug Description

When creating an image verification policy and then creating a resource that triggers it (e.g. a pod with an unsigned image), Policy Reporter crashes.

Steps to reproduce

  1. Add the Helm repository:
helm repo add policy-reporter https://kyverno.github.io/policy-reporter
helm repo update
  2. Install only the core application:
helm upgrade --install policy-reporter policy-reporter/policy-reporter --create-namespace -n policy-reporter --set metrics.enabled=true --set api.enabled=true
  3. Install Kyverno.
  4. Apply the attached test-image-policy.txt (converted to YAML): kubectl apply -f test-image-policy.yaml
  5. Create a pod with an unsigned image: kubectl run unsigned --image=ghcr.io/kyverno/test-verify-image:unsigned
  6. Repeat with a signed image: kubectl run signed --image=ghcr.io/kyverno/test-verify-image:signed

Policy Reporter then logs an error and restarts continuously.
The expected result is that Policy Reporter does not crash.

Error

The following errors were found in the logs:

2022/02/15 18:38:44 [WARNING] - Healthz Check: No policyreport.wgpolicyk8s.io and clusterpolicyreport.wgpolicyk8s.io crds are found
2022/02/15 18:38:46 [WARNING] - Healthz Check: No policyreport.wgpolicyk8s.io and clusterpolicyreport.wgpolicyk8s.io crds are found
2022/02/15 18:38:49 [WARNING] - Healthz Check: No policyreport.wgpolicyk8s.io and clusterpolicyreport.wgpolicyk8s.io crds are found
2022/02/15 18:38:49 [INFO] Resource registered: wgpolicyk8s.io/v1alpha2, Resource=clusterpolicyreports
2022/02/15 18:38:49 [INFO] Resource registered: wgpolicyk8s.io/v1alpha2, Resource=policyreports
E0215 18:38:49.225534       1 runtime.go:78] Observed a panic: &runtime.TypeAssertionError{_interface:(*runtime._type)(0x1741cc0), concrete:(*runtime._type)(nil), asserted:(*runtime._type)(0x16fbc20), missingMethod:""} (interface conversion: interface {} is nil, not string)
goroutine 52 [running]:
k8s.io/apimachinery/pkg/util/runtime.logPanic({0x177cde0, 0xc000211d10})
	/go/pkg/mod/k8s.io/apimachinery@v0.22.4/pkg/util/runtime/runtime.go:74 +0x7d
k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0x40ed74})
	/go/pkg/mod/k8s.io/apimachinery@v0.22.4/pkg/util/runtime/runtime.go:48 +0x75
panic({0x177cde0, 0xc000211d10})
	/usr/local/go/src/runtime/panic.go:1038 +0x215
github.com/kyverno/policy-reporter/pkg/kubernetes.(*mapper).mapResult(0xc0000109f8, 0xc0003cdf80)
	/app/pkg/kubernetes/mapper.go:90 +0x734
github.com/kyverno/policy-reporter/pkg/kubernetes.(*mapper).MapPolicyReport(0xc000121cc0, 0xc0005cc230)
	/app/pkg/kubernetes/mapper.go:55 +0x485
github.com/kyverno/policy-reporter/pkg/kubernetes.(*k8sPolicyReportClient).watchCRD.func2({0x191f320, 0xc00037a7e8})
	/app/pkg/kubernetes/policy_report_client.go:108 +0x44
k8s.io/client-go/tools/cache.ResourceEventHandlerFuncs.OnAdd(...)
	/go/pkg/mod/k8s.io/client-go@v0.22.4/tools/cache/controller.go:231
k8s.io/client-go/tools/cache.(*processorListener).run.func1()
	/go/pkg/mod/k8s.io/client-go@v0.22.4/tools/cache/shared_informer.go:777 +0x9f
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x7fcf7c455e60)
	/go/pkg/mod/k8s.io/apimachinery@v0.22.4/pkg/util/wait/wait.go:155 +0x67
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000056f38, {0x1c78f20, 0xc0005d6000}, 0x1, 0xc0005d4000)
	/go/pkg/mod/k8s.io/apimachinery@v0.22.4/pkg/util/wait/wait.go:156 +0xb6
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x0, 0x3b9aca00, 0x0, 0x0, 0xc000056f88)
	/go/pkg/mod/k8s.io/apimachinery@v0.22.4/pkg/util/wait/wait.go:133 +0x89
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/go/pkg/mod/k8s.io/apimachinery@v0.22.4/pkg/util/wait/wait.go:90
k8s.io/client-go/tools/cache.(*processorListener).run(0xc00010df00)
	/go/pkg/mod/k8s.io/client-go@v0.22.4/tools/cache/shared_informer.go:771 +0x6b
k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
	/go/pkg/mod/k8s.io/apimachinery@v0.22.4/pkg/util/wait/wait.go:73 +0x5a
created by k8s.io/apimachinery/pkg/util/wait.(*Group).Start
	/go/pkg/mod/k8s.io/apimachinery@v0.22.4/pkg/util/wait/wait.go:71 +0x88
panic: interface conversion: interface {} is nil, not string [recovered]
	panic: interface conversion: interface {} is nil, not string

goroutine 52 [running]:
k8s.io/apimachinery/pkg/util/runtime.HandleCrash({0x0, 0x0, 0x40ed74})
	/go/pkg/mod/k8s.io/apimachinery@v0.22.4/pkg/util/runtime/runtime.go:55 +0xd8
panic({0x177cde0, 0xc000211d10})
	/usr/local/go/src/runtime/panic.go:1038 +0x215
github.com/kyverno/policy-reporter/pkg/kubernetes.(*mapper).mapResult(0xc0000109f8, 0xc0003cdf80)
	/app/pkg/kubernetes/mapper.go:90 +0x734
github.com/kyverno/policy-reporter/pkg/kubernetes.(*mapper).MapPolicyReport(0xc000121cc0, 0xc0005cc230)
	/app/pkg/kubernetes/mapper.go:55 +0x485
github.com/kyverno/policy-reporter/pkg/kubernetes.(*k8sPolicyReportClient).watchCRD.func2({0x191f320, 0xc00037a7e8})
	/app/pkg/kubernetes/policy_report_client.go:108 +0x44
k8s.io/client-go/tools/cache.ResourceEventHandlerFuncs.OnAdd(...)
	/go/pkg/mod/k8s.io/client-go@v0.22.4/tools/cache/controller.go:231
k8s.io/client-go/tools/cache.(*processorListener).run.func1()
	/go/pkg/mod/k8s.io/client-go@v0.22.4/tools/cache/shared_informer.go:777 +0x9f
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x7fcf7c455e60)
	/go/pkg/mod/k8s.io/apimachinery@v0.22.4/pkg/util/wait/wait.go:155 +0x67
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000056f38, {0x1c78f20, 0xc0005d6000}, 0x1, 0xc0005d4000)
	/go/pkg/mod/k8s.io/apimachinery@v0.22.4/pkg/util/wait/wait.go:156 +0xb6
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x0, 0x3b9aca00, 0x0, 0x0, 0xc000056f88)
	/go/pkg/mod/k8s.io/apimachinery@v0.22.4/pkg/util/wait/wait.go:133 +0x89
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/go/pkg/mod/k8s.io/apimachinery@v0.22.4/pkg/util/wait/wait.go:90
k8s.io/client-go/tools/cache.(*processorListener).run(0xc00010df00)
	/go/pkg/mod/k8s.io/client-go@v0.22.4/tools/cache/shared_informer.go:771 +0x6b
k8s.io/apimachinery/pkg/util/wait.(*Group).Start.func1()
	/go/pkg/mod/k8s.io/apimachinery@v0.22.4/pkg/util/wait/wait.go:73 +0x5a
created by k8s.io/apimachinery/pkg/util/wait.(*Group).Start
	/go/pkg/mod/k8s.io/apimachinery@v0.22.4/pkg/util/wait/wait.go:71 +0x88
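The trace points at an unchecked type assertion in `mapper.go:90`: a field that is `nil` for image-verification results is asserted to be a `string`, which panics with exactly this message. A minimal Go sketch of the comma-ok pattern that avoids such a panic (all names here are illustrative, not the actual policy-reporter mapper code):

```go
package main

import "fmt"

// toStatus mimics mapping a field out of an unstructured policy report
// result (map[string]interface{}). A bare assertion like
// res["status"].(string) panics with "interface conversion: interface {}
// is nil, not string" when the key is absent or nil; the comma-ok form
// falls back safely instead.
func toStatus(res map[string]interface{}) string {
	status, ok := res["status"].(string)
	if !ok {
		// Missing or non-string field: return a zero value
		// instead of crashing the informer goroutine.
		return ""
	}
	return status
}

func main() {
	// A result produced by an image verification rule may lack the
	// asserted field entirely.
	fmt.Println(toStatus(map[string]interface{}{"policy": "check-image"}))
	fmt.Println(toStatus(map[string]interface{}{"status": "fail"}))
}
```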
@fjogeleit
Member

Thanks for reporting, I will have a look at it.

@se77enn
Author

se77enn commented Feb 15, 2022

test-image-policy-2.txt

@fjogeleit
Member

I created a new version of the app. Can you try it out? If it works, I will release a new chart version ASAP.

helm upgrade --install policy-reporter policy-reporter/policy-reporter --create-namespace -n policy-reporter --set metrics.enabled=true --set api.enabled=true --set image.tag=2.0.1

@se77enn
Author

se77enn commented Feb 15, 2022

@fjogeleit It seems to be working.

Tried with the Policy Reporter UI + Kyverno plugin and with Grafana.

[screenshots: Policy Reporter UI and Grafana dashboards working]

@fjogeleit
Member

Great, thanks for your response. I will release a new chart version.

@fjogeleit
Member

I released Helm chart version 2.2.4 with the new app version.
