
Add server metadata, soft-anti-affinity and local ssd flavors for gx-scs #742

Merged: 7 commits merged into main from feat/server_metadata on Jun 26, 2024

Conversation

@chess-knight (Member) commented Jun 14, 2024

Fixes #741
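
For context, the changes named in the title land in the gx-scs environment's Terraform configuration. Below is a hedged sketch of what the three additions might look like in an environment tfvars file; the variable names (controller_flavor, worker_flavor, controller_metadata) are illustrative assumptions and not copied from the diff, while the flavor name and the anti_affinity toggle both appear verbatim later in this thread.

```hcl
# Hypothetical environment tfvars sketch; variable names are assumed.
controller_flavor   = "SCS-2V-4-20s" # "-20s" suffix: 20 GB local SSD root disk
worker_flavor       = "SCS-2V-4-20s"
anti_affinity       = true # server-group spreading toggle, revisited below
controller_metadata = { "managed-by" = "k8s-cluster-api-provider" } # example server metadata
```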

Signed-off-by: Roman Hros <roman.hros@dnation.cloud>
Signed-off-by: Roman Hros <roman.hros@dnation.cloud>
Signed-off-by: Roman Hros <roman.hros@dnation.cloud>
@chess-knight linked an issue Jun 14, 2024 that may be closed by this pull request
Signed-off-by: Roman Hros <roman.hros@dnation.cloud>
@scszuulapp (bot) commented Jun 17, 2024

Build failed (e2e-quick-test pipeline).
https://zuul.scs.community/t/SCS/buildset/cc45695c8a78434084a51aed9bd41a00

k8s-cluster-api-provider-e2e-quick TIMED_OUT in 1h 04m 00s

Warning:

Sonobouy results
=== Collecting results ===
time="2024-06-17T05:36:51Z" level=info msg="delete request issued" dry-run=false kind=namespace namespace=sonobuoy
time="2024-06-17T05:36:51Z" level=info msg="delete request issued" dry-run=false kind=clusterrolebindings names="[sonobuoy-serviceaccount-sonobuoy]"
time="2024-06-17T05:36:52Z" level=info msg="delete request issued" dry-run=false kind=clusterroles names="[sonobuoy-serviceaccount-sonobuoy]"

Plugin: e2e
Status: passed
Total: 7394
Passed: 5
Failed: 0
Skipped: 7389

Plugin: systemd-logs
Status: passed
Total: 5
Passed: 5
Failed: 0
Skipped: 0

Run Details:
API Server version: v1.28.11
Node health: 5/5 (100%)
Pods health: 42/42 (100%)
Errors detected in files:
Errors:
36386 podlogs/kube-system/kube-apiserver-pr742-2d12ac-lqg8q-s9bgj/logs/kube-apiserver.txt
129 podlogs/kube-system/snapshot-controller-7c5dccb849-hnp4t/logs/snapshot-controller.txt
88 podlogs/kube-system/etcd-pr742-2d12ac-lqg8q-s9bgj/logs/etcd.txt
31 podlogs/kube-system/kube-scheduler-pr742-2d12ac-lqg8q-9jgn6/logs/kube-scheduler.txt
11 podlogs/kube-system/kube-controller-manager-pr742-2d12ac-lqg8q-s9bgj/logs/kube-controller-manager.txt
8 podlogs/kube-system/etcd-pr742-2d12ac-lqg8q-9jgn6/logs/etcd.txt
8 podlogs/kube-system/kube-apiserver-pr742-2d12ac-lqg8q-9jgn6/logs/kube-apiserver.txt
5 podlogs/kube-system/cilium-zw2rg/logs/cilium-agent.txt
5 podlogs/kube-system/cilium-xdv85/logs/cilium-agent.txt
5 podlogs/kube-system/cilium-zltcq/logs/cilium-agent.txt
5 podlogs/kube-system/cilium-2625p/logs/cilium-agent.txt
4 podlogs/kube-system/kube-controller-manager-pr742-2d12ac-lqg8q-9jgn6/logs/kube-controller-manager.txt
4 podlogs/kube-system/cilium-vflsg/logs/cilium-agent.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-5qtqm/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-qtd7p/logs/node-driver-registrar.txt
1 podlogs/kube-system/cilium-operator-dd95cc587-nx8v8/logs/cilium-operator.txt
1 podlogs/kube-system/openstack-cloud-controller-manager-nnwp9/logs/openstack-cloud-controller-manager.txt
1 podlogs/sonobuoy/sonobuoy-e2e-job-92098a2a3025488f/logs/e2e.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-ltd26/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-4fdvj/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-pjsgj/logs/node-driver-registrar.txt
1 podlogs/kube-system/snapshot-controller-7c5dccb849-2s49w/logs/snapshot-controller.txt
1 podlogs/kube-system/kube-scheduler-pr742-2d12ac-lqg8q-s9bgj/logs/kube-scheduler.txt
Warnings:
228 podlogs/kube-system/etcd-pr742-2d12ac-lqg8q-s9bgj/logs/etcd.txt
41 podlogs/kube-system/kube-apiserver-pr742-2d12ac-lqg8q-s9bgj/logs/kube-apiserver.txt
35 podlogs/kube-system/kube-apiserver-pr742-2d12ac-lqg8q-9jgn6/logs/kube-apiserver.txt
18 podlogs/kube-system/kube-scheduler-pr742-2d12ac-lqg8q-9jgn6/logs/kube-scheduler.txt
11 podlogs/kube-system/etcd-pr742-2d12ac-lqg8q-9jgn6/logs/etcd.txt
9 podlogs/kube-system/csi-cinder-nodeplugin-qtd7p/logs/node-driver-registrar.txt
6 podlogs/kube-system/cilium-zltcq/logs/cilium-agent.txt
4 podlogs/kube-system/openstack-cloud-controller-manager-nnwp9/logs/openstack-cloud-controller-manager.txt
3 podlogs/kube-system/kube-scheduler-pr742-2d12ac-lqg8q-s9bgj/logs/kube-scheduler.txt
3 podlogs/kube-system/kube-proxy-g6klx/logs/kube-proxy.txt
2 podlogs/kube-system/csi-cinder-controllerplugin-57489947fb-44fkr/logs/csi-attacher.txt
2 podlogs/kube-system/csi-cinder-controllerplugin-57489947fb-44fkr/logs/csi-provisioner.txt
2 podlogs/kube-system/csi-cinder-nodeplugin-pjsgj/logs/node-driver-registrar.txt
2 podlogs/kube-system/csi-cinder-nodeplugin-4fdvj/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-pjsgj/logs/liveness-probe.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-qtd7p/logs/liveness-probe.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-ltd26/logs/node-driver-registrar.txt
1 podlogs/kube-system/cilium-operator-dd95cc587-nx8v8/logs/cilium-operator.txt
1 podlogs/kube-system/openstack-cloud-controller-manager-zkqdb/logs/openstack-cloud-controller-manager.txt
1 podlogs/kube-system/cilium-zw2rg/logs/cilium-agent.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-4fdvj/logs/liveness-probe.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-ltd26/logs/liveness-probe.txt
1 podlogs/kube-system/cilium-2625p/logs/cilium-agent.txt
1 podlogs/kube-system/cilium-xdv85/logs/cilium-agent.txt
1 podlogs/sonobuoy/sonobuoy-e2e-job-92098a2a3025488f/logs/e2e.txt
1 podlogs/sonobuoy/sonobuoy/logs/kube-sonobuoy.txt
time="2024-06-17T05:36:53Z" level=info msg="delete request issued" dry-run=false kind=namespace namespace=sonobuoy
time="2024-06-17T05:36:53Z" level=info msg="delete request issued" dry-run=false kind=clusterrolebindings names="[]"
time="2024-06-17T05:36:53Z" level=info msg="delete request issued" dry-run=false kind=clusterroles names="[]"

Namespace "sonobuoy" has status {Phase:Terminating Conditions:[{Type:NamespaceDeletionDiscoveryFailure Status:False LastTransitionTime:2024-06-17 05:36:57 +0000 UTC Reason:ResourcesDiscovered Message:All resources successfully discovered} {Type:NamespaceDeletionGroupVersionParsingFailure Status:False LastTransitionTime:2024-06-17 05:36:57 +0000 UTC Reason:ParsedGroupVersions Message:All legacy kube types successfully parsed} {Type:NamespaceDeletionContentFailure Status:False LastTransitionTime:2024-06-17 05:36:57 +0000 UTC Reason:ContentDeleted Message:All content successfully deleted, may be waiting on finalization} {Type:NamespaceContentRemaining Status:True LastTransitionTime:2024-06-17 05:36:57 +0000 UTC Reason:SomeResourcesRemain Message:Some resources are remaining: pods. has 6 resource instances} {Type:NamespaceFinalizersRemaining Status:False LastTransitionTime:2024-06-17 05:36:57 +0000 UTC Reason:ContentHasNoFinalizers Message:All content-preserving finalizers finished}]}

Namespace "sonobuoy" has status {Phase:Terminating Conditions:[{Type:NamespaceDeletionDiscoveryFailure Status:False LastTransitionTime:2024-06-17 05:36:57 +0000 UTC Reason:ResourcesDiscovered Message:All resources successfully discovered} {Type:NamespaceDeletionGroupVersionParsingFailure Status:False LastTransitionTime:2024-06-17 05:36:57 +0000 UTC Reason:ParsedGroupVersions Message:All legacy kube types successfully parsed} {Type:NamespaceDeletionContentFailure Status:False LastTransitionTime:2024-06-17 05:36:57 +0000 UTC Reason:ContentDeleted Message:All content successfully deleted, may be waiting on finalization} {Type:NamespaceContentRemaining Status:True LastTransitionTime:2024-06-17 05:36:57 +0000 UTC Reason:SomeResourcesRemain Message:Some resources are remaining: pods. has 5 resource instances} {Type:NamespaceFinalizersRemaining Status:False LastTransitionTime:2024-06-17 05:36:57 +0000 UTC Reason:ContentHasNoFinalizers Message:All content-preserving finalizers finished}]}
...
...
...
...

Namespace "sonobuoy" has been deleted

Deleted all ClusterRoles and ClusterRoleBindings.
=== Sonobuoy conformance tests passed in 113s ===
make[1]: Leaving directory '/home/ubuntu/src/github.com/SovereignCloudStack/k8s-cluster-api-provider/terraform'

@chess-knight (Member, Author) commented Jun 17, 2024

Build failed (e2e-quick-test pipeline). https://zuul.scs.community/t/SCS/buildset/cc45695c8a78434084a51aed9bd41a00

k8s-cluster-api-provider-e2e-quick TIMED_OUT in 1h 04m 00s

Warning:

Sonobouy results

Fault:
  Message: No valid host was found. There are not enough hosts available.
  Code: 500
  Details:
  Created: June 17, 2024, 5:20 a.m.

It seems there are only two hosts available for the "SCS-2V-4-20s" flavor right now, but we have 3 control-plane nodes with an anti-affinity policy here.
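
For reference, hard anti-affinity in OpenStack is enforced through a server group, and when there are fewer distinct hosts than group members the scheduler fails with exactly this "No valid host was found" fault. A minimal Terraform sketch using the official OpenStack provider (the resource and group names are illustrative):

```hcl
# Hard anti-affinity server group: every member must land on a different
# hypervisor, so 3 control-plane nodes need at least 3 schedulable hosts.
resource "openstack_compute_servergroup_v2" "control_plane" {
  name     = "control-plane-anti-affinity" # illustrative name
  policies = ["anti-affinity"]
}
```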

@scszuulapp (bot) commented Jun 17, 2024

Build succeeded (e2e-quick-test pipeline).
https://zuul.scs.community/t/SCS/buildset/cf6946aacb7442c39cbd754c0b0e6311

✔️ k8s-cluster-api-provider-e2e-quick SUCCESS in 39m 15s

Warning:

SCS Compliance results
Testing SCS Compatible KaaS version v2
*******************************************************
Testing standard Kubernetes version policy ...
Reference: https://raw.githubusercontent.com/SovereignCloudStack/standards/main/Standards/scs-0210-v2-k8s-version-policy.md ...
INFO: Checking cluster specified by default context in /home/ubuntu/src/github.com/SovereignCloudStack/k8s-cluster-api-provider/terraform/pr742-dc6542.yaml.gx-scs-zuul.
INFO: The K8s cluster version 1.28.11 of cluster 'pr742-dc6542-admin@pr742-dc6542' is still in the recency time window.

... returned 0 errors, 0 aborts


Testing standard Kubernetes node distribution and availability ...
Reference: https://raw.githubusercontent.com/SovereignCloudStack/standards/main/Standards/scs-0214-v1-k8s-node-distribution.md ...
WARNING: There seems to be no distribution across multiple regions or labels aren't set correctly across nodes.
WARNING: There seems to be no distribution across multiple zones or labels aren't set correctly across nodes.
INFO: The nodes are distributed across 2 host-ids.
WARNING: There seems to be no distribution across multiple regions or labels aren't set correctly across nodes.
WARNING: There seems to be no distribution across multiple zones or labels aren't set correctly across nodes.
INFO: The nodes are distributed across 3 host-ids.
The config file under ./config.yaml couldn't be found, falling back to the default config.
... returned 0 errors, 0 aborts


Testing standard CNCF Kubernetes conformance ...
Reference: https://github.com/cncf/k8s-conformance/tree/master ...
WARNING: No check tool specified for CNCF Kubernetes conformance


Verdict for subject KaaS_V1, SCS Compatible KaaS, version v2: PASSED
Testing SCS Compatible KaaS version v1


Testing standard Kubernetes version policy ...
Reference: https://raw.githubusercontent.com/SovereignCloudStack/standards/main/Standards/scs-0210-v2-k8s-version-policy.md ...
... returned 0 errors, 0 aborts


Testing standard Kubernetes node distribution and availability ...
Reference: https://raw.githubusercontent.com/SovereignCloudStack/standards/main/Standards/scs-0214-v1-k8s-node-distribution.md ...
... returned 0 errors, 0 aborts


Verdict for subject KaaS_V1, SCS Compatible KaaS, version v1: PASSED

Sonobouy results
=== Collecting results ===
time="2024-06-17T07:07:25Z" level=info msg="delete request issued" dry-run=false kind=namespace namespace=sonobuoy
time="2024-06-17T07:07:25Z" level=info msg="delete request issued" dry-run=false kind=clusterrolebindings names="[sonobuoy-serviceaccount-sonobuoy]"
time="2024-06-17T07:07:25Z" level=info msg="delete request issued" dry-run=false kind=clusterroles names="[sonobuoy-serviceaccount-sonobuoy]"

Plugin: e2e
Status: passed
Total: 7394
Passed: 5
Failed: 0
Skipped: 7389

Plugin: systemd-logs
Status: passed
Total: 6
Passed: 6
Failed: 0
Skipped: 0

Run Details:
API Server version: v1.28.11
Node health: 6/6 (100%)
Pods health: 51/51 (100%)
Errors detected in files:
Errors:
13383 podlogs/kube-system/kube-apiserver-pr742-dc6542-c74qr-hbjpk/logs/kube-apiserver.txt
82 podlogs/kube-system/metrics-server-56cfc8b678-hdbcf/logs/metrics-server.txt
47 podlogs/kube-system/kube-scheduler-pr742-dc6542-c74qr-gt4db/logs/kube-scheduler.txt
43 podlogs/kube-system/etcd-pr742-dc6542-c74qr-hbjpk/logs/etcd.txt
31 podlogs/kube-system/snapshot-controller-7c5dccb849-j4x8l/logs/snapshot-controller.txt
29 podlogs/kube-system/kube-scheduler-pr742-dc6542-c74qr-5z9d9/logs/kube-scheduler.txt
21 podlogs/kube-system/kube-controller-manager-pr742-dc6542-c74qr-hbjpk/logs/kube-controller-manager.txt
18 podlogs/kube-system/kube-apiserver-pr742-dc6542-c74qr-gt4db/logs/kube-apiserver.txt
16 podlogs/kube-system/openstack-cloud-controller-manager-xc78k/logs/openstack-cloud-controller-manager.txt
15 podlogs/kube-system/kube-apiserver-pr742-dc6542-c74qr-5z9d9/logs/kube-apiserver.txt
14 podlogs/kube-system/etcd-pr742-dc6542-c74qr-gt4db/logs/etcd.txt
5 podlogs/kube-system/cilium-22d4p/logs/cilium-agent.txt
5 podlogs/kube-system/cilium-8gkq8/logs/cilium-agent.txt
5 podlogs/kube-system/cilium-fd44s/logs/cilium-agent.txt
5 podlogs/kube-system/cilium-bkwgp/logs/cilium-agent.txt
5 podlogs/kube-system/cilium-xdfbp/logs/cilium-agent.txt
4 podlogs/kube-system/cilium-gkwsz/logs/cilium-agent.txt
3 podlogs/kube-system/cilium-operator-dd95cc587-llzpr/logs/cilium-operator.txt
2 podlogs/kube-system/kube-controller-manager-pr742-dc6542-c74qr-gt4db/logs/kube-controller-manager.txt
2 podlogs/kube-system/kube-controller-manager-pr742-dc6542-c74qr-5z9d9/logs/kube-controller-manager.txt
1 podlogs/kube-system/etcd-pr742-dc6542-c74qr-5z9d9/logs/etcd.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-24nrp/logs/node-driver-registrar.txt
1 podlogs/sonobuoy/sonobuoy-e2e-job-2c20d4dcd80641bd/logs/e2e.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-zlkdj/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-b4jjd/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-79wr6/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-kkf26/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-26wn5/logs/node-driver-registrar.txt
1 podlogs/kube-system/snapshot-controller-7c5dccb849-9gfg7/logs/snapshot-controller.txt
Warnings:
352 podlogs/kube-system/etcd-pr742-dc6542-c74qr-hbjpk/logs/etcd.txt
188 podlogs/kube-system/etcd-pr742-dc6542-c74qr-gt4db/logs/etcd.txt
43 podlogs/kube-system/kube-apiserver-pr742-dc6542-c74qr-hbjpk/logs/kube-apiserver.txt
40 podlogs/kube-system/kube-apiserver-pr742-dc6542-c74qr-gt4db/logs/kube-apiserver.txt
37 podlogs/kube-system/kube-apiserver-pr742-dc6542-c74qr-5z9d9/logs/kube-apiserver.txt
27 podlogs/kube-system/kube-scheduler-pr742-dc6542-c74qr-gt4db/logs/kube-scheduler.txt
18 podlogs/kube-system/kube-scheduler-pr742-dc6542-c74qr-5z9d9/logs/kube-scheduler.txt
15 podlogs/kube-system/openstack-cloud-controller-manager-xc78k/logs/openstack-cloud-controller-manager.txt
12 podlogs/kube-system/csi-cinder-nodeplugin-79wr6/logs/node-driver-registrar.txt
8 podlogs/kube-system/csi-cinder-nodeplugin-zlkdj/logs/node-driver-registrar.txt
6 podlogs/kube-system/cilium-xdfbp/logs/cilium-agent.txt
5 podlogs/kube-system/etcd-pr742-dc6542-c74qr-5z9d9/logs/etcd.txt
5 podlogs/kube-system/csi-cinder-controllerplugin-7fb99b86c9-f2cwp/logs/csi-attacher.txt
3 podlogs/kube-system/csi-cinder-controllerplugin-7fb99b86c9-f2cwp/logs/csi-provisioner.txt
3 podlogs/kube-system/csi-cinder-nodeplugin-79wr6/logs/liveness-probe.txt
2 podlogs/kube-system/kube-proxy-nc9s4/logs/kube-proxy.txt
2 podlogs/kube-system/csi-cinder-nodeplugin-zlkdj/logs/liveness-probe.txt
2 podlogs/kube-system/csi-cinder-nodeplugin-26wn5/logs/node-driver-registrar.txt
1 podlogs/kube-system/cilium-operator-dd95cc587-llzpr/logs/cilium-operator.txt
1 podlogs/kube-system/openstack-cloud-controller-manager-f9gd9/logs/openstack-cloud-controller-manager.txt
1 podlogs/sonobuoy/sonobuoy/logs/kube-sonobuoy.txt
1 podlogs/sonobuoy/sonobuoy-e2e-job-2c20d4dcd80641bd/logs/e2e.txt
1 podlogs/kube-system/cilium-22d4p/logs/cilium-agent.txt
1 podlogs/kube-system/cilium-bkwgp/logs/cilium-agent.txt
1 podlogs/kube-system/cilium-8gkq8/logs/cilium-agent.txt
1 podlogs/kube-system/openstack-cloud-controller-manager-tktwn/logs/openstack-cloud-controller-manager.txt
1 podlogs/kube-system/cilium-fd44s/logs/cilium-agent.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-26wn5/logs/liveness-probe.txt
1 podlogs/kube-system/csi-cinder-controllerplugin-7fb99b86c9-f2cwp/logs/csi-snapshotter.txt
time="2024-06-17T07:07:26Z" level=info msg="delete request issued" dry-run=false kind=namespace namespace=sonobuoy
time="2024-06-17T07:07:26Z" level=info msg="delete request issued" dry-run=false kind=clusterrolebindings names="[]"
time="2024-06-17T07:07:26Z" level=info msg="delete request issued" dry-run=false kind=clusterroles names="[]"

Namespace "sonobuoy" has status {Phase:Terminating Conditions:[{Type:NamespaceDeletionDiscoveryFailure Status:False LastTransitionTime:2024-06-17 07:07:31 +0000 UTC Reason:ResourcesDiscovered Message:All resources successfully discovered} {Type:NamespaceDeletionGroupVersionParsingFailure Status:False LastTransitionTime:2024-06-17 07:07:31 +0000 UTC Reason:ParsedGroupVersions Message:All legacy kube types successfully parsed} {Type:NamespaceDeletionContentFailure Status:False LastTransitionTime:2024-06-17 07:07:31 +0000 UTC Reason:ContentDeleted Message:All content successfully deleted, may be waiting on finalization} {Type:NamespaceContentRemaining Status:True LastTransitionTime:2024-06-17 07:07:31 +0000 UTC Reason:SomeResourcesRemain Message:Some resources are remaining: pods. has 7 resource instances} {Type:NamespaceFinalizersRemaining Status:False LastTransitionTime:2024-06-17 07:07:31 +0000 UTC Reason:ContentHasNoFinalizers Message:All content-preserving finalizers finished}]}

Namespace "sonobuoy" has status {Phase:Terminating Conditions:[{Type:NamespaceDeletionDiscoveryFailure Status:False LastTransitionTime:2024-06-17 07:07:31 +0000 UTC Reason:ResourcesDiscovered Message:All resources successfully discovered} {Type:NamespaceDeletionGroupVersionParsingFailure Status:False LastTransitionTime:2024-06-17 07:07:31 +0000 UTC Reason:ParsedGroupVersions Message:All legacy kube types successfully parsed} {Type:NamespaceDeletionContentFailure Status:False LastTransitionTime:2024-06-17 07:07:31 +0000 UTC Reason:ContentDeleted Message:All content successfully deleted, may be waiting on finalization} {Type:NamespaceContentRemaining Status:True LastTransitionTime:2024-06-17 07:07:31 +0000 UTC Reason:SomeResourcesRemain Message:Some resources are remaining: pods. has 6 resource instances} {Type:NamespaceFinalizersRemaining Status:False LastTransitionTime:2024-06-17 07:07:31 +0000 UTC Reason:ContentHasNoFinalizers Message:All content-preserving finalizers finished}]}
...
...
...
...

Namespace "sonobuoy" has been deleted

Deleted all ClusterRoles and ClusterRoleBindings.
=== Sonobuoy conformance tests passed in 236s ===
make[1]: Leaving directory '/home/ubuntu/src/github.com/SovereignCloudStack/k8s-cluster-api-provider/terraform'

Custom ENV config

anti_affinity = false

@chess-knight (Member, Author) commented:

Build succeeded (e2e-quick-test pipeline). https://zuul.scs.community/t/SCS/buildset/cf6946aacb7442c39cbd754c0b0e6311

✔️ k8s-cluster-api-provider-e2e-quick SUCCESS in 39m 15s

Warning:

SCS Compliance results
Sonobouy results

Custom ENV config

anti_affinity = false

After #742 (comment) I tried anti_affinity = false with success.
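
For anyone reproducing this, the override amounts to a single line in the environment configuration; the value below is taken from the run above, while its exact file location is an assumption:

```hcl
# Drop the anti-affinity constraint so the scheduler may co-locate
# control-plane nodes when fewer hosts are available.
anti_affinity = false
```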

@chess-knight (Member, Author) commented:

@frosty-geek, I tested the "SCS-2V-4-20s" flavor, but it seems there are currently only two hosts available for it. Is that true?

@frosty-geek (Member) commented:

It seems there are only two hosts available for the "SCS-2V-4-20s" flavor right now, but we have 3 control-plane nodes with an anti-affinity policy here.

I checked, and yes, at the moment that is correct. We're in the process of fixing that with a new hardware environment (scs2), which will hopefully hit ~mid July.

There are not enough hosts for local ssd flavors

Signed-off-by: Roman Hros <roman.hros@dnation.cloud>
@scszuulapp (bot) commented Jun 20, 2024

Build failed (e2e-quick-test pipeline).
https://zuul.scs.community/t/SCS/buildset/764339f79d5847f095dc0a7f6e5c9d8f

k8s-cluster-api-provider-e2e-quick FAILURE in 44m 50s

Warning:

SCS Compliance results
Testing SCS Compatible KaaS version v2
*******************************************************
Testing standard Kubernetes version policy ...
Reference: https://raw.githubusercontent.com/SovereignCloudStack/standards/main/Standards/scs-0210-v2-k8s-version-policy.md ...
INFO: Checking cluster specified by default context in /home/ubuntu/src/github.com/SovereignCloudStack/k8s-cluster-api-provider/terraform/pr742-9516a8.yaml.gx-scs-zuul.
INFO: The K8s cluster version 1.28.11 of cluster 'pr742-9516a8-admin@pr742-9516a8' is still in the recency time window.

... returned 0 errors, 0 aborts


Testing standard Kubernetes node distribution and availability ...
Reference: https://raw.githubusercontent.com/SovereignCloudStack/standards/main/Standards/scs-0214-v1-k8s-node-distribution.md ...

WARNING: There seems to be no distribution across multiple regions or labels aren't set correctly across nodes.
WARNING: There seems to be no distribution across multiple zones or labels aren't set correctly across nodes.
INFO: The control nodes are distributed across 2 host-ids.
WARNING: There seems to be no distribution across multiple regions or labels aren't set correctly across nodes.
WARNING: There seems to be no distribution across multiple zones or labels aren't set correctly across nodes.
INFO: The worker nodes are distributed across 3 host-ids.
... returned 0 errors, 0 aborts


Testing standard CNCF Kubernetes conformance ...
Reference: https://github.com/cncf/k8s-conformance/tree/master ...
WARNING: No check tool specified for CNCF Kubernetes conformance


Verdict for subject KaaS_V1, SCS Compatible KaaS, version v2: PASSED
Testing SCS Compatible KaaS version v1


Testing standard Kubernetes version policy ...
Reference: https://raw.githubusercontent.com/SovereignCloudStack/standards/main/Standards/scs-0210-v2-k8s-version-policy.md ...
... returned 0 errors, 0 aborts


Testing standard Kubernetes node distribution and availability ...
Reference: https://raw.githubusercontent.com/SovereignCloudStack/standards/main/Standards/scs-0214-v1-k8s-node-distribution.md ...

CRITICAL: [Errno 2] No such file or directory: '/home/ubuntu/scs-compliance/Tests/kaas/k8s-node-distribution/k8s-node-distribution-check.py'
... returned 0 errors, 1 aborts


Verdict for subject KaaS_V1, SCS Compatible KaaS, version v1: 1 ERRORS

Sonobouy results
=== Collecting results ===
time="2024-06-20T12:39:31Z" level=info msg="delete request issued" dry-run=false kind=namespace namespace=sonobuoy
time="2024-06-20T12:39:31Z" level=info msg="delete request issued" dry-run=false kind=clusterrolebindings names="[sonobuoy-serviceaccount-sonobuoy]"
time="2024-06-20T12:39:31Z" level=info msg="delete request issued" dry-run=false kind=clusterroles names="[sonobuoy-serviceaccount-sonobuoy]"

Plugin: e2e
Status: passed
Total: 7394
Passed: 5
Failed: 0
Skipped: 7389

Plugin: systemd-logs
Status: passed
Total: 6
Passed: 6
Failed: 0
Skipped: 0

Run Details:
API Server version: v1.28.11
Node health: 6/6 (100%)
Pods health: 51/51 (100%)
Errors detected in files:
Errors:
31955 podlogs/kube-system/kube-apiserver-pr742-9516a8-vhnp4-wwh2m/logs/kube-apiserver.txt
118 podlogs/kube-system/etcd-pr742-9516a8-vhnp4-wwh2m/logs/etcd.txt
57 podlogs/kube-system/kube-scheduler-pr742-9516a8-vhnp4-c6qck/logs/kube-scheduler.txt
54 podlogs/kube-system/snapshot-controller-7c5dccb849-q28b5/logs/snapshot-controller.txt
32 podlogs/kube-system/kube-scheduler-pr742-9516a8-vhnp4-mfq56/logs/kube-scheduler.txt
14 podlogs/kube-system/openstack-cloud-controller-manager-lb4dp/logs/openstack-cloud-controller-manager.txt
11 podlogs/kube-system/kube-apiserver-pr742-9516a8-vhnp4-mfq56/logs/kube-apiserver.txt
11 podlogs/kube-system/kube-controller-manager-pr742-9516a8-vhnp4-wwh2m/logs/kube-controller-manager.txt
8 podlogs/kube-system/etcd-pr742-9516a8-vhnp4-c6qck/logs/etcd.txt
7 podlogs/kube-system/kube-apiserver-pr742-9516a8-vhnp4-c6qck/logs/kube-apiserver.txt
5 podlogs/kube-system/cilium-5zd4k/logs/cilium-agent.txt
5 podlogs/kube-system/cilium-5v6hp/logs/cilium-agent.txt
5 podlogs/kube-system/cilium-bg6sm/logs/cilium-agent.txt
5 podlogs/kube-system/cilium-qjqdt/logs/cilium-agent.txt
5 podlogs/kube-system/cilium-cb7f2/logs/cilium-agent.txt
5 podlogs/kube-system/cilium-jz7wn/logs/cilium-agent.txt
4 podlogs/kube-system/kube-controller-manager-pr742-9516a8-vhnp4-c6qck/logs/kube-controller-manager.txt
3 podlogs/kube-system/kube-proxy-2hc9c/logs/kube-proxy.txt
2 podlogs/kube-system/kube-controller-manager-pr742-9516a8-vhnp4-mfq56/logs/kube-controller-manager.txt
1 podlogs/kube-system/snapshot-controller-7c5dccb849-dfq9l/logs/snapshot-controller.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-v72g5/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-7dd25/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-kh9xb/logs/node-driver-registrar.txt
1 podlogs/sonobuoy/sonobuoy-e2e-job-12ca290a8b4d4618/logs/e2e.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-v72g5/logs/liveness-probe.txt
1 podlogs/kube-system/cilium-operator-dd95cc587-gh67g/logs/cilium-operator.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-jxcn5/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-xznb6/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-g6gzm/logs/node-driver-registrar.txt
Warnings:
715 podlogs/kube-system/etcd-pr742-9516a8-vhnp4-wwh2m/logs/etcd.txt
372 podlogs/kube-system/etcd-pr742-9516a8-vhnp4-c6qck/logs/etcd.txt
45 podlogs/kube-system/kube-apiserver-pr742-9516a8-vhnp4-wwh2m/logs/kube-apiserver.txt
38 podlogs/kube-system/kube-scheduler-pr742-9516a8-vhnp4-c6qck/logs/kube-scheduler.txt
36 podlogs/kube-system/kube-apiserver-pr742-9516a8-vhnp4-c6qck/logs/kube-apiserver.txt
34 podlogs/kube-system/kube-scheduler-pr742-9516a8-vhnp4-mfq56/logs/kube-scheduler.txt
33 podlogs/kube-system/kube-apiserver-pr742-9516a8-vhnp4-mfq56/logs/kube-apiserver.txt
11 podlogs/kube-system/csi-cinder-nodeplugin-v72g5/logs/node-driver-registrar.txt
10 podlogs/kube-system/openstack-cloud-controller-manager-lb4dp/logs/openstack-cloud-controller-manager.txt
9 podlogs/kube-system/csi-cinder-controllerplugin-5c94c59fb5-hhf6k/logs/csi-attacher.txt
8 podlogs/kube-system/etcd-pr742-9516a8-vhnp4-mfq56/logs/etcd.txt
6 podlogs/kube-system/csi-cinder-controllerplugin-5c94c59fb5-hhf6k/logs/csi-provisioner.txt
6 podlogs/kube-system/cilium-5zd4k/logs/cilium-agent.txt
6 podlogs/kube-system/csi-cinder-nodeplugin-jxcn5/logs/node-driver-registrar.txt
3 podlogs/kube-system/csi-cinder-nodeplugin-xznb6/logs/node-driver-registrar.txt
2 podlogs/kube-system/csi-cinder-nodeplugin-xznb6/logs/liveness-probe.txt
2 podlogs/kube-system/csi-cinder-nodeplugin-v72g5/logs/liveness-probe.txt
1 podlogs/kube-system/cilium-operator-dd95cc587-gh67g/logs/cilium-operator.txt
1 podlogs/sonobuoy/sonobuoy/logs/kube-sonobuoy.txt
1 podlogs/kube-system/cilium-5v6hp/logs/cilium-agent.txt
1 podlogs/kube-system/cilium-cb7f2/logs/cilium-agent.txt
1 podlogs/kube-system/cilium-jz7wn/logs/cilium-agent.txt
1 podlogs/sonobuoy/sonobuoy-e2e-job-12ca290a8b4d4618/logs/e2e.txt
1 podlogs/kube-system/kube-proxy-2hc9c/logs/kube-proxy.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-jxcn5/logs/liveness-probe.txt
1 podlogs/kube-system/csi-cinder-controllerplugin-5c94c59fb5-hhf6k/logs/csi-snapshotter.txt
1 podlogs/kube-system/cilium-qjqdt/logs/cilium-agent.txt
1 podlogs/kube-system/cilium-bg6sm/logs/cilium-agent.txt
1 podlogs/kube-system/openstack-cloud-controller-manager-6vrhn/logs/openstack-cloud-controller-manager.txt
1 podlogs/kube-system/openstack-cloud-controller-manager-g2564/logs/openstack-cloud-controller-manager.txt
time="2024-06-20T12:39:33Z" level=info msg="delete request issued" dry-run=false kind=namespace namespace=sonobuoy
time="2024-06-20T12:39:33Z" level=info msg="delete request issued" dry-run=false kind=clusterrolebindings names="[]"
time="2024-06-20T12:39:33Z" level=info msg="delete request issued" dry-run=false kind=clusterroles names="[]"

Namespace "sonobuoy" has status {Phase:Terminating Conditions:[{Type:NamespaceDeletionDiscoveryFailure Status:False LastTransitionTime:2024-06-20 12:39:37 +0000 UTC Reason:ResourcesDiscovered Message:All resources successfully discovered} {Type:NamespaceDeletionGroupVersionParsingFailure Status:False LastTransitionTime:2024-06-20 12:39:37 +0000 UTC Reason:ParsedGroupVersions Message:All legacy kube types successfully parsed} {Type:NamespaceDeletionContentFailure Status:False LastTransitionTime:2024-06-20 12:39:37 +0000 UTC Reason:ContentDeleted Message:All content successfully deleted, may be waiting on finalization} {Type:NamespaceContentRemaining Status:True LastTransitionTime:2024-06-20 12:39:37 +0000 UTC Reason:SomeResourcesRemain Message:Some resources are remaining: pods. has 7 resource instances} {Type:NamespaceFinalizersRemaining Status:False LastTransitionTime:2024-06-20 12:39:37 +0000 UTC Reason:ContentHasNoFinalizers Message:All content-preserving finalizers finished}]}
...
...
...
...

Namespace "sonobuoy" has status {Phase:Terminating Conditions:[{Type:NamespaceDeletionDiscoveryFailure Status:False LastTransitionTime:2024-06-20 12:39:37 +0000 UTC Reason:ResourcesDiscovered Message:All resources successfully discovered} {Type:NamespaceDeletionGroupVersionParsingFailure Status:False LastTransitionTime:2024-06-20 12:39:37 +0000 UTC Reason:ParsedGroupVersions Message:All legacy kube types successfully parsed} {Type:NamespaceDeletionContentFailure Status:False LastTransitionTime:2024-06-20 12:39:37 +0000 UTC Reason:ContentDeleted Message:All content successfully deleted, may be waiting on finalization} {Type:NamespaceContentRemaining Status:True LastTransitionTime:2024-06-20 12:39:37 +0000 UTC Reason:SomeResourcesRemain Message:Some resources are remaining: pods. has 2 resource instances} {Type:NamespaceFinalizersRemaining Status:False LastTransitionTime:2024-06-20 12:39:37 +0000 UTC Reason:ContentHasNoFinalizers Message:All content-preserving finalizers finished}]}

Namespace "sonobuoy" has status {Phase:Terminating Conditions:[{Type:NamespaceDeletionDiscoveryFailure Status:False LastTransitionTime:2024-06-20 12:39:37 +0000 UTC Reason:ResourcesDiscovered Message:All resources successfully discovered} {Type:NamespaceDeletionGroupVersionParsingFailure Status:False LastTransitionTime:2024-06-20 12:39:37 +0000 UTC Reason:ParsedGroupVersions Message:All legacy kube types successfully parsed} {Type:NamespaceDeletionContentFailure Status:True LastTransitionTime:2024-06-20 12:40:04 +0000 UTC Reason:ContentDeletionFailed Message:Failed to delete all resource types, 1 remaining: unexpected items still remain in namespace: sonobuoy for gvr: /v1, Resource=pods} {Type:NamespaceContentRemaining Status:True LastTransitionTime:2024-06-20 12:39:37 +0000 UTC Reason:SomeResourcesRemain Message:Some resources are remaining: pods. has 1 resource instances} {Type:NamespaceFinalizersRemaining Status:False LastTransitionTime:2024-06-20 12:39:37 +0000 UTC Reason:ContentHasNoFinalizers Message:All content-preserving finalizers finished}]}
...

Namespace "sonobuoy" has been deleted

Deleted all ClusterRoles and ClusterRoleBindings.
=== Sonobuoy conformance tests passed in 184s ===
make[1]: Leaving directory '/home/ubuntu/src/github.com/SovereignCloudStack/k8s-cluster-api-provider/terraform'

Signed-off-by: Roman Hros <roman.hros@dnation.cloud>
@scszuulapp (bot) commented Jun 21, 2024

Build failed (e2e-test pipeline).
https://zuul.scs.community/t/SCS/buildset/458b09296bc9451e8396e2059bf76d6e

k8s-cluster-api-provider-e2e-conformance FAILURE in 2h 38m 37s

Warning:

SCS Compliance results
INFO: Forced version v2 not (yet) stable
Testing SCS Compatible KaaS version v2
*******************************************************
Testing standard Kubernetes version policy ...
Reference: https://raw.githubusercontent.com/SovereignCloudStack/standards/main/Standards/scs-0210-v2-k8s-version-policy.md ...
INFO: Checking cluster specified by default context in /home/ubuntu/src/github.com/SovereignCloudStack/k8s-cluster-api-provider/terraform/pr742-1d3dfd.yaml.gx-scs-zuul.
INFO: The K8s cluster version 1.28.11 of cluster 'pr742-1d3dfd-admin@pr742-1d3dfd' is still in the recency time window.

... returned 0 errors, 0 aborts


Testing standard Kubernetes node distribution and availability ...
Reference: https://raw.githubusercontent.com/SovereignCloudStack/standards/main/Standards/scs-0214-v1-k8s-node-distribution.md ...

WARNING: There seems to be no distribution across multiple regions or labels aren't set correctly across nodes.
WARNING: There seems to be no distribution across multiple zones or labels aren't set correctly across nodes.
WARNING: There seems to be no distribution across multiple host-ids or labels aren't set correctly across nodes.
ERROR: The distribution of nodes described in the standard couldn't be detected.
... returned 1 errors, 0 aborts


Testing standard CNCF Kubernetes conformance ...
Reference: https://github.com/cncf/k8s-conformance/tree/master ...
WARNING: No check tool specified for CNCF Kubernetes conformance


Verdict for subject KaaS_V1, SCS Compatible KaaS, version v2: 1 ERRORS

Sonobouy results
=== Collecting results ===
time="2024-06-21T14:48:51Z" level=info msg="delete request issued" dry-run=false kind=namespace namespace=sonobuoy
time="2024-06-21T14:48:51Z" level=info msg="delete request issued" dry-run=false kind=clusterrolebindings names="[sonobuoy-serviceaccount-sonobuoy]"
time="2024-06-21T14:48:51Z" level=info msg="delete request issued" dry-run=false kind=clusterroles names="[sonobuoy-serviceaccount-sonobuoy]"

Plugin: e2e
Status: passed
Total: 7394
Passed: 384
Failed: 0
Skipped: 7010

Plugin: systemd-logs
Status: passed
Total: 4
Passed: 4
Failed: 0
Skipped: 0

Run Details:
API Server version: v1.28.11
Node health: 6/6 (100%)
Pods health: 51/51 (100%)
Errors detected in files:
Errors:
25558 podlogs/kube-system/kube-apiserver-pr742-1d3dfd-mn55k-7cgbs/logs/kube-apiserver.txt
2835 podlogs/kube-system/kube-controller-manager-pr742-1d3dfd-mn55k-7cgbs/logs/kube-controller-manager.txt
935 podlogs/kube-system/snapshot-controller-7c5dccb849-b4kjb/logs/snapshot-controller.txt
762 podlogs/kube-system/cilium-8vwxt/logs/cilium-agent.txt
443 podlogs/kube-system/cilium-99njv/logs/cilium-agent.txt
412 podlogs/kube-system/cilium-h6hqf/logs/cilium-agent.txt
274 podlogs/kube-system/etcd-pr742-1d3dfd-mn55k-7cgbs/logs/etcd.txt
200 podlogs/kube-system/kube-apiserver-pr742-1d3dfd-mn55k-jxdn4/logs/kube-apiserver.txt
131 podlogs/kube-system/etcd-pr742-1d3dfd-mn55k-jxdn4/logs/etcd.txt
56 podlogs/sonobuoy/sonobuoy/logs/kube-sonobuoy.txt
42 podlogs/kube-system/kube-apiserver-pr742-1d3dfd-mn55k-dr75r/logs/kube-apiserver.txt
33 podlogs/kube-system/kube-scheduler-pr742-1d3dfd-mn55k-dr75r/logs/kube-scheduler.txt
27 podlogs/kube-system/kube-scheduler-pr742-1d3dfd-mn55k-7cgbs/logs/kube-scheduler.txt
12 podlogs/kube-system/kube-proxy-hmd4z/logs/kube-proxy.txt
8 podlogs/kube-system/metrics-server-56cfc8b678-4zkx9/logs/metrics-server.txt
8 podlogs/kube-system/kube-proxy-b7ppr/logs/kube-proxy.txt
8 podlogs/kube-system/kube-proxy-cbg42/logs/kube-proxy.txt
8 podlogs/kube-system/kube-proxy-ls285/logs/kube-proxy.txt
8 podlogs/kube-system/kube-proxy-qzvjg/logs/kube-proxy.txt
8 podlogs/kube-system/cilium-9ssmb/logs/cilium-agent.txt
8 podlogs/kube-system/kube-proxy-ddfjf/logs/kube-proxy.txt
7 podlogs/kube-system/cilium-zkrqf/logs/cilium-agent.txt
5 podlogs/kube-system/cilium-r9clz/logs/cilium-agent.txt
3 podlogs/sonobuoy/sonobuoy-systemd-logs-daemon-set-792a88dac3ed4a72-rnmr4/logs/sonobuoy-worker.txt
3 podlogs/sonobuoy/sonobuoy-systemd-logs-daemon-set-792a88dac3ed4a72-zzkkf/logs/sonobuoy-worker.txt
2 podlogs/kube-system/kube-controller-manager-pr742-1d3dfd-mn55k-dr75r/logs/kube-controller-manager.txt
2 podlogs/kube-system/etcd-pr742-1d3dfd-mn55k-dr75r/logs/etcd.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-fglt8/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-fttkj/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-z8r87/logs/node-driver-registrar.txt
1 podlogs/sonobuoy/sonobuoy-e2e-job-6d74e1f15b6f4dc4/logs/e2e.txt
1 podlogs/kube-system/cilium-operator-dd95cc587-tb8hz/logs/cilium-operator.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-6pzzn/logs/node-driver-registrar.txt
1 podlogs/kube-system/kube-controller-manager-pr742-1d3dfd-mn55k-jxdn4/logs/kube-controller-manager.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-pmssn/logs/node-driver-registrar.txt
1 podlogs/kube-system/openstack-cloud-controller-manager-vpn68/logs/openstack-cloud-controller-manager.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-9jm65/logs/node-driver-registrar.txt
Warnings:
2740 podlogs/kube-system/etcd-pr742-1d3dfd-mn55k-7cgbs/logs/etcd.txt
2185 podlogs/kube-system/cilium-h6hqf/logs/cilium-agent.txt
1351 podlogs/kube-system/kube-controller-manager-pr742-1d3dfd-mn55k-7cgbs/logs/kube-controller-manager.txt
1178 podlogs/kube-system/etcd-pr742-1d3dfd-mn55k-jxdn4/logs/etcd.txt
1020 podlogs/kube-system/cilium-99njv/logs/cilium-agent.txt
929 podlogs/kube-system/cilium-8vwxt/logs/cilium-agent.txt
78 podlogs/kube-system/etcd-pr742-1d3dfd-mn55k-dr75r/logs/etcd.txt
72 podlogs/kube-system/kube-apiserver-pr742-1d3dfd-mn55k-7cgbs/logs/kube-apiserver.txt
68 podlogs/kube-system/kube-apiserver-pr742-1d3dfd-mn55k-jxdn4/logs/kube-apiserver.txt
56 podlogs/kube-system/kube-apiserver-pr742-1d3dfd-mn55k-dr75r/logs/kube-apiserver.txt
26 podlogs/kube-system/kube-scheduler-pr742-1d3dfd-mn55k-dr75r/logs/kube-scheduler.txt
18 podlogs/kube-system/kube-scheduler-pr742-1d3dfd-mn55k-7cgbs/logs/kube-scheduler.txt
12 podlogs/kube-system/cilium-9ssmb/logs/cilium-agent.txt
11 podlogs/kube-system/cilium-zkrqf/logs/cilium-agent.txt
9 podlogs/kube-system/cilium-r9clz/logs/cilium-agent.txt
4 podlogs/kube-system/csi-cinder-nodeplugin-fttkj/logs/node-driver-registrar.txt
4 podlogs/kube-system/csi-cinder-nodeplugin-9jm65/logs/node-driver-registrar.txt
4 podlogs/kube-system/openstack-cloud-controller-manager-vpn68/logs/openstack-cloud-controller-manager.txt
3 podlogs/kube-system/csi-cinder-controllerplugin-74f84fbddf-f4jdn/logs/csi-attacher.txt
3 podlogs/kube-system/csi-cinder-nodeplugin-pmssn/logs/node-driver-registrar.txt
2 podlogs/kube-system/csi-cinder-controllerplugin-74f84fbddf-f4jdn/logs/csi-provisioner.txt
1 podlogs/kube-system/openstack-cloud-controller-manager-5k2h7/logs/openstack-cloud-controller-manager.txt
1 podlogs/sonobuoy/sonobuoy/logs/kube-sonobuoy.txt
1 podlogs/sonobuoy/sonobuoy-systemd-logs-daemon-set-792a88dac3ed4a72-rnmr4/logs/sonobuoy-worker.txt
1 podlogs/sonobuoy/sonobuoy-e2e-job-6d74e1f15b6f4dc4/logs/e2e.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-6pzzn/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-z8r87/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-pmssn/logs/liveness-probe.txt
1 podlogs/kube-system/openstack-cloud-controller-manager-5zpsc/logs/openstack-cloud-controller-manager.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-9jm65/logs/liveness-probe.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-z8r87/logs/liveness-probe.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-fglt8/logs/node-driver-registrar.txt
1 podlogs/sonobuoy/sonobuoy-systemd-logs-daemon-set-792a88dac3ed4a72-zzkkf/logs/sonobuoy-worker.txt
time="2024-06-21T14:48:52Z" level=info msg="delete request issued" dry-run=false kind=namespace namespace=sonobuoy
time="2024-06-21T14:48:52Z" level=info msg="delete request issued" dry-run=false kind=clusterrolebindings names="[]"
time="2024-06-21T14:48:52Z" level=info msg="delete request issued" dry-run=false kind=clusterroles names="[]"

Namespace "sonobuoy" has status {Phase:Terminating Conditions:[{Type:NamespaceDeletionDiscoveryFailure Status:False LastTransitionTime:2024-06-21 14:48:56 +0000 UTC Reason:ResourcesDiscovered Message:All resources successfully discovered} {Type:NamespaceDeletionGroupVersionParsingFailure Status:False LastTransitionTime:2024-06-21 14:48:56 +0000 UTC Reason:ParsedGroupVersions Message:All legacy kube types successfully parsed} {Type:NamespaceDeletionContentFailure Status:False LastTransitionTime:2024-06-21 14:48:56 +0000 UTC Reason:ContentDeleted Message:All content successfully deleted, may be waiting on finalization} {Type:NamespaceContentRemaining Status:True LastTransitionTime:2024-06-21 14:48:56 +0000 UTC Reason:SomeResourcesRemain Message:Some resources are remaining: pods. has 5 resource instances} {Type:NamespaceFinalizersRemaining Status:False LastTransitionTime:2024-06-21 14:48:56 +0000 UTC Reason:ContentHasNoFinalizers Message:All content-preserving finalizers finished}]}

Namespace "sonobuoy" has status {Phase:Terminating Conditions:[{Type:NamespaceDeletionDiscoveryFailure Status:False LastTransitionTime:2024-06-21 14:48:56 +0000 UTC Reason:ResourcesDiscovered Message:All resources successfully discovered} {Type:NamespaceDeletionGroupVersionParsingFailure Status:False LastTransitionTime:2024-06-21 14:48:56 +0000 UTC Reason:ParsedGroupVersions Message:All legacy kube types successfully parsed} {Type:NamespaceDeletionContentFailure Status:False LastTransitionTime:2024-06-21 14:48:56 +0000 UTC Reason:ContentDeleted Message:All content successfully deleted, may be waiting on finalization} {Type:NamespaceContentRemaining Status:True LastTransitionTime:2024-06-21 14:48:56 +0000 UTC Reason:SomeResourcesRemain Message:Some resources are remaining: pods. has 4 resource instances} {Type:NamespaceFinalizersRemaining Status:False LastTransitionTime:2024-06-21 14:48:56 +0000 UTC Reason:ContentHasNoFinalizers Message:All content-preserving finalizers finished}]}
...
...
...
...

Namespace "sonobuoy" has been deleted

Deleted all ClusterRoles and ClusterRoleBindings.

All E2E namespaces deleted
=== Sonobuoy conformance tests passed in 7413s ===
make[1]: Leaving directory '/home/ubuntu/src/github.com/SovereignCloudStack/k8s-cluster-api-provider/terraform'

@scszuulapp (bot) commented Jun 24, 2024

Build canceled (e2e-test pipeline).

@scszuulapp (bot) commented Jun 24, 2024

Build succeeded (e2e-test pipeline).
https://zuul.scs.community/t/SCS/buildset/02ac850917644b76bf4b1882db6fdd30

✔️ k8s-cluster-api-provider-e2e-conformance SUCCESS in 2h 47m 02s

Warning:

SCS Compliance results
INFO: Forced version v2 not (yet) stable
Testing SCS Compatible KaaS version v2
*******************************************************
Testing standard Kubernetes version policy ...
Reference: https://raw.githubusercontent.com/SovereignCloudStack/standards/main/Standards/scs-0210-v2-k8s-version-policy.md ...
INFO: Checking cluster specified by default context in /home/ubuntu/src/github.com/SovereignCloudStack/k8s-cluster-api-provider/terraform/pr742-e24d5b.yaml.gx-scs-zuul.
INFO: The K8s cluster version 1.28.11 of cluster 'pr742-e24d5b-admin@pr742-e24d5b' is still in the recency time window.

... returned 0 errors, 0 aborts


Testing standard Kubernetes node distribution and availability ...
Reference: https://raw.githubusercontent.com/SovereignCloudStack/standards/main/Standards/scs-0214-v1-k8s-node-distribution.md ...

WARNING: There seems to be no distribution across multiple regions or labels aren't set correctly across nodes.
WARNING: There seems to be no distribution across multiple zones or labels aren't set correctly across nodes.
INFO: The control nodes are distributed across 2 host-ids.
WARNING: There seems to be no distribution across multiple regions or labels aren't set correctly across nodes.
WARNING: There seems to be no distribution across multiple zones or labels aren't set correctly across nodes.
INFO: The worker nodes are distributed across 3 host-ids.
... returned 0 errors, 0 aborts


Testing standard CNCF Kubernetes conformance ...
Reference: https://github.com/cncf/k8s-conformance/tree/master ...
WARNING: No check tool specified for CNCF Kubernetes conformance


Verdict for subject KaaS_V1, SCS Compatible KaaS, version v2: PASSED

Sonobouy results
=== Collecting results ===
time="2024-06-24T12:35:07Z" level=info msg="delete request issued" dry-run=false kind=namespace namespace=sonobuoy
time="2024-06-24T12:35:07Z" level=info msg="delete request issued" dry-run=false kind=clusterrolebindings names="[sonobuoy-serviceaccount-sonobuoy]"
time="2024-06-24T12:35:07Z" level=info msg="delete request issued" dry-run=false kind=clusterroles names="[sonobuoy-serviceaccount-sonobuoy]"

Plugin: e2e
Status: passed
Total: 7394
Passed: 384
Failed: 0
Skipped: 7010

Plugin: systemd-logs
Status: passed
Total: 6
Passed: 6
Failed: 0
Skipped: 0

Run Details:
API Server version: v1.28.11
Node health: 6/6 (100%)
Pods health: 51/51 (100%)
Errors detected in files:
Errors:
18683 podlogs/kube-system/kube-apiserver-pr742-e24d5b-k2962-8pfqm/logs/kube-apiserver.txt
1005 podlogs/kube-system/snapshot-controller-7c5dccb849-c774t/logs/snapshot-controller.txt
910 podlogs/kube-system/kube-controller-manager-pr742-e24d5b-k2962-qqhq9/logs/kube-controller-manager.txt
427 podlogs/kube-system/cilium-s9tb6/logs/cilium-agent.txt
354 podlogs/kube-system/cilium-m69nz/logs/cilium-agent.txt
335 podlogs/kube-system/cilium-gf27g/logs/cilium-agent.txt
233 podlogs/kube-system/etcd-pr742-e24d5b-k2962-8pfqm/logs/etcd.txt
79 podlogs/kube-system/kube-scheduler-pr742-e24d5b-k2962-qqhq9/logs/kube-scheduler.txt
70 podlogs/kube-system/kube-apiserver-pr742-e24d5b-k2962-qqhq9/logs/kube-apiserver.txt
37 podlogs/kube-system/etcd-pr742-e24d5b-k2962-qqhq9/logs/etcd.txt
31 podlogs/kube-system/kube-scheduler-pr742-e24d5b-k2962-fp4h4/logs/kube-scheduler.txt
22 podlogs/kube-system/openstack-cloud-controller-manager-mw5gm/logs/openstack-cloud-controller-manager.txt
18 podlogs/kube-system/kube-apiserver-pr742-e24d5b-k2962-fp4h4/logs/kube-apiserver.txt
10 podlogs/kube-system/kube-proxy-nl58c/logs/kube-proxy.txt
8 podlogs/kube-system/kube-proxy-bxl68/logs/kube-proxy.txt
8 podlogs/kube-system/kube-proxy-8cpvg/logs/kube-proxy.txt
8 podlogs/kube-system/kube-proxy-vmsdf/logs/kube-proxy.txt
8 podlogs/kube-system/kube-proxy-bdbhw/logs/kube-proxy.txt
8 podlogs/kube-system/kube-proxy-6bcn4/logs/kube-proxy.txt
8 podlogs/kube-system/cilium-x4ktk/logs/cilium-agent.txt
6 podlogs/kube-system/cilium-nvwmr/logs/cilium-agent.txt
5 podlogs/kube-system/cilium-m27bm/logs/cilium-agent.txt
2 podlogs/kube-system/metrics-server-56cfc8b678-zcz7w/logs/metrics-server.txt
2 podlogs/kube-system/kube-controller-manager-pr742-e24d5b-k2962-fp4h4/logs/kube-controller-manager.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-p8z5z/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-qwm8x/logs/liveness-probe.txt
1 podlogs/kube-system/cilium-operator-dd95cc587-nw5jp/logs/cilium-operator.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-7ph8z/logs/node-driver-registrar.txt
1 podlogs/sonobuoy/sonobuoy-e2e-job-334c8ed4b4cc429a/logs/e2e.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-qwm8x/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-ctztq/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-lffgr/logs/node-driver-registrar.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-28t6h/logs/node-driver-registrar.txt
Warnings:
1960 podlogs/kube-system/cilium-s9tb6/logs/cilium-agent.txt
1419 podlogs/kube-system/etcd-pr742-e24d5b-k2962-8pfqm/logs/etcd.txt
1164 podlogs/kube-system/cilium-m69nz/logs/cilium-agent.txt
768 podlogs/kube-system/etcd-pr742-e24d5b-k2962-qqhq9/logs/etcd.txt
694 podlogs/kube-system/cilium-gf27g/logs/cilium-agent.txt
380 podlogs/kube-system/kube-controller-manager-pr742-e24d5b-k2962-qqhq9/logs/kube-controller-manager.txt
77 podlogs/kube-system/kube-apiserver-pr742-e24d5b-k2962-qqhq9/logs/kube-apiserver.txt
68 podlogs/kube-system/kube-apiserver-pr742-e24d5b-k2962-8pfqm/logs/kube-apiserver.txt
47 podlogs/kube-system/kube-scheduler-pr742-e24d5b-k2962-qqhq9/logs/kube-scheduler.txt
37 podlogs/kube-system/kube-apiserver-pr742-e24d5b-k2962-fp4h4/logs/kube-apiserver.txt
32 podlogs/kube-system/kube-scheduler-pr742-e24d5b-k2962-fp4h4/logs/kube-scheduler.txt
14 podlogs/kube-system/openstack-cloud-controller-manager-mw5gm/logs/openstack-cloud-controller-manager.txt
11 podlogs/kube-system/cilium-x4ktk/logs/cilium-agent.txt
11 podlogs/kube-system/csi-cinder-nodeplugin-qwm8x/logs/node-driver-registrar.txt
9 podlogs/kube-system/cilium-nvwmr/logs/cilium-agent.txt
8 podlogs/kube-system/cilium-m27bm/logs/cilium-agent.txt
8 podlogs/kube-system/etcd-pr742-e24d5b-k2962-fp4h4/logs/etcd.txt
7 podlogs/kube-system/csi-cinder-nodeplugin-28t6h/logs/node-driver-registrar.txt
6 podlogs/kube-system/csi-cinder-nodeplugin-7ph8z/logs/node-driver-registrar.txt
3 podlogs/kube-system/csi-cinder-controllerplugin-974fbfdf7-fm878/logs/csi-attacher.txt
2 podlogs/kube-system/kube-proxy-nl58c/logs/kube-proxy.txt
2 podlogs/kube-system/csi-cinder-nodeplugin-7ph8z/logs/liveness-probe.txt
2 podlogs/kube-system/csi-cinder-controllerplugin-974fbfdf7-fm878/logs/csi-provisioner.txt
2 podlogs/kube-system/csi-cinder-nodeplugin-qwm8x/logs/liveness-probe.txt
1 podlogs/kube-system/csi-cinder-nodeplugin-28t6h/logs/liveness-probe.txt
1 podlogs/sonobuoy/sonobuoy/logs/kube-sonobuoy.txt
1 podlogs/kube-system/csi-cinder-controllerplugin-974fbfdf7-fm878/logs/csi-snapshotter.txt
1 podlogs/sonobuoy/sonobuoy-e2e-job-334c8ed4b4cc429a/logs/e2e.txt
1 podlogs/kube-system/cilium-operator-dd95cc587-nw5jp/logs/cilium-operator.txt
1 podlogs/kube-system/openstack-cloud-controller-manager-wgkfl/logs/openstack-cloud-controller-manager.txt
1 podlogs/kube-system/openstack-cloud-controller-manager-2gxkf/logs/openstack-cloud-controller-manager.txt
time="2024-06-24T12:35:09Z" level=info msg="delete request issued" dry-run=false kind=namespace namespace=sonobuoy
time="2024-06-24T12:35:09Z" level=info msg="delete request issued" dry-run=false kind=clusterrolebindings names="[]"
time="2024-06-24T12:35:09Z" level=info msg="delete request issued" dry-run=false kind=clusterroles names="[]"

Namespace "sonobuoy" has status {Phase:Terminating Conditions:[{Type:NamespaceDeletionDiscoveryFailure Status:False LastTransitionTime:2024-06-24 12:35:13 +0000 UTC Reason:ResourcesDiscovered Message:All resources successfully discovered} {Type:NamespaceDeletionGroupVersionParsingFailure Status:False LastTransitionTime:2024-06-24 12:35:13 +0000 UTC Reason:ParsedGroupVersions Message:All legacy kube types successfully parsed} {Type:NamespaceDeletionContentFailure Status:False LastTransitionTime:2024-06-24 12:35:13 +0000 UTC Reason:ContentDeleted Message:All content successfully deleted, may be waiting on finalization} {Type:NamespaceContentRemaining Status:True LastTransitionTime:2024-06-24 12:35:13 +0000 UTC Reason:SomeResourcesRemain Message:Some resources are remaining: pods. has 7 resource instances} {Type:NamespaceFinalizersRemaining Status:False LastTransitionTime:2024-06-24 12:35:13 +0000 UTC Reason:ContentHasNoFinalizers Message:All content-preserving finalizers finished}]}

Namespace "sonobuoy" has status {Phase:Terminating Conditions:[{Type:NamespaceDeletionDiscoveryFailure Status:False LastTransitionTime:2024-06-24 12:35:13 +0000 UTC Reason:ResourcesDiscovered Message:All resources successfully discovered} {Type:NamespaceDeletionGroupVersionParsingFailure Status:False LastTransitionTime:2024-06-24 12:35:13 +0000 UTC Reason:ParsedGroupVersions Message:All legacy kube types successfully parsed} {Type:NamespaceDeletionContentFailure Status:False LastTransitionTime:2024-06-24 12:35:13 +0000 UTC Reason:ContentDeleted Message:All content successfully deleted, may be waiting on finalization} {Type:NamespaceContentRemaining Status:True LastTransitionTime:2024-06-24 12:35:13 +0000 UTC Reason:SomeResourcesRemain Message:Some resources are remaining: pods. has 6 resource instances} {Type:NamespaceFinalizersRemaining Status:False LastTransitionTime:2024-06-24 12:35:13 +0000 UTC Reason:ContentHasNoFinalizers Message:All content-preserving finalizers finished}]}

Namespace "sonobuoy" has been deleted

Deleted all ClusterRoles and ClusterRoleBindings.

All E2E namespaces deleted
=== Sonobuoy conformance tests passed in 7940s ===
make[1]: Leaving directory '/home/ubuntu/src/github.com/SovereignCloudStack/k8s-cluster-api-provider/terraform'

@chess-knight marked this pull request as ready for review June 24, 2024 14:16
@chess-knight (Member, Author) commented:

Soft-anti-affinity was used in #742 (comment), but the control planes were scheduled on the same host. That was not the case in #742 (comment).
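
This matches soft-anti-affinity semantics: it is a best-effort scheduler preference, not a hard constraint, so when distinct hosts are scarce members can be co-located instead of the boot failing. A hedged Terraform sketch of the soft variant (resource and group names are illustrative):

```hcl
# Soft anti-affinity: spreading across hypervisors is preferred but not
# enforced. Scheduling succeeds even with only two hosts, at the cost of
# possible co-location, which is what the first run above observed.
resource "openstack_compute_servergroup_v2" "control_plane" {
  name     = "control-plane-soft-anti-affinity" # illustrative name
  policies = ["soft-anti-affinity"]
}
```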

@matofeder (Member) left a comment:

Tested by the e2e pipeline and works as expected.
The change looks good to me.

@michal-gubricky (Contributor) left a comment:

lgtm

@chess-knight merged commit 3a40a04 into main Jun 26, 2024
7 checks passed
@chess-knight deleted the feat/server_metadata branch June 26, 2024 13:12
This pull request may close the linked issue: Fix daily e2e pipeline (#741).