knative not working on microk8s 1.22 #2720

Closed
mcopik opened this issue Nov 9, 2021 · 2 comments

mcopik commented Nov 9, 2021

Hi!

I tried to enable knative on a fresh install, following the tutorial on the Ubuntu website, and everything seemed to work correctly - install.log.

Unfortunately, when I deploy the most basic example of a service, it never becomes ready - DOMAIN is <none> and no URL is assigned to the service.

service.yaml

apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: helloworld-python
  namespace: default
spec:
  template:
    spec:
      containers:
        - image: docker.io/mcopik/helloworld-python
          env:
            - name: TARGET
              value: "Python Sample v1"

kubectl get ksvc helloworld-python --output=custom-columns=NAME:.metadata.name,DOMAIN:.status.domain

NAME                DOMAIN
helloworld-python   <none>
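
(Note: under the serving.knative.dev/v1 API the assigned domain is exposed via .status.url rather than .status.domain, so a query along the following lines may be the more appropriate check; as the kubectl get all output below shows, the URL is empty as well.)

kubectl get ksvc helloworld-python --output=custom-columns=NAME:.metadata.name,URL:.status.url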

microk8s kubectl get pods -A -o wide

NAMESPACE          NAME                                                  READY   STATUS    RESTARTS   AGE    IP            NODE          NOMINATED NODE   READINESS GATES
kube-system        calico-kube-controllers-7f66d7cf65-z7882              1/1     Running   0          36m    10.1.77.131   master-node   <none>           <none>
kube-system        coredns-7f9c69c78c-9dmg6                              1/1     Running   0          24m    10.1.77.132   master-node   <none>           <none>
kube-system        calico-node-gkqfl                                     1/1     Running   0          36m    10.6.39.138   master-node   <none>           <none>
istio-system       istiod-568d797f55-hxmdt                               1/1     Running   0          23m    10.1.77.133   master-node   <none>           <none>
istio-system       istio-ingressgateway-8f568d595-l4phb                  1/1     Running   0          23m    10.1.77.135   master-node   <none>           <none>
istio-system       istio-egressgateway-5547fcc8fc-ck5vf                  1/1     Running   0          23m    10.1.77.134   master-node   <none>           <none>
knative-serving    autoscaler-df6856b64-gnnp4                            1/1     Running   0          22m    10.1.77.137   master-node   <none>           <none>
knative-serving    controller-788796f49d-rbpg7                           1/1     Running   0          22m    10.1.77.138   master-node   <none>           <none>
knative-serving    activator-67656dcbbb-xwr8z                            1/1     Running   0          22m    10.1.77.136   master-node   <none>           <none>
knative-serving    domainmapping-webhook-cc646465c-k72tr                 1/1     Running   0          22m    10.1.77.139   master-node   <none>           <none>
knative-serving    domain-mapping-65f58c79dc-qm9qp                       1/1     Running   0          22m    10.1.77.140   master-node   <none>           <none>
knative-serving    webhook-859796bc7-skl47                               1/1     Running   0          22m    10.1.77.141   master-node   <none>           <none>
knative-serving    net-istio-controller-799fb59fbf-h9fm9                 1/1     Running   0          22m    10.1.77.142   master-node   <none>           <none>
knative-serving    net-istio-webhook-5d97d48d5b-9bxvc                    1/1     Running   0          22m    10.1.77.143   master-node   <none>           <none>
knative-eventing   eventing-controller-7995d654c7-w2h2f                  1/1     Running   0          22m    10.1.77.145   master-node   <none>           <none>
knative-eventing   eventing-webhook-fff97b47c-pfd2g                      1/1     Running   0          22m    10.1.77.146   master-node   <none>           <none>
knative-eventing   imc-controller-f466dfff7-bkmh6                        1/1     Running   0          22m    10.1.77.147   master-node   <none>           <none>
knative-eventing   imc-dispatcher-bb46d5779-9skv4                        1/1     Running   0          22m    10.1.77.148   master-node   <none>           <none>
knative-eventing   mt-broker-filter-6fd5b9765c-smc4x                     1/1     Running   0          22m    10.1.77.149   master-node   <none>           <none>
knative-eventing   mt-broker-ingress-5d599d8f87-kjpnk                    1/1     Running   0          22m    10.1.77.150   master-node   <none>           <none>
knative-eventing   mt-broker-controller-7ccfb9874d-4k7vm                 1/1     Running   0          22m    10.1.77.151   master-node   <none>           <none>
default            helloworld-python-00001-deployment-6fbdcfbccc-557g8   2/2     Running   0          10m    10.1.77.154   master-node   <none>           <none>
knative-serving    default-domain--1-9n57t                               0/1     Error     0          22m    10.1.77.144   master-node   <none>           <none>
knative-serving    default-domain--1-p4dql                               1/1     Running   0          107s   10.1.77.155   master-node   <none>           <none>

kubectl get all --namespace default

NAME                                                      READY   STATUS    RESTARTS   AGE
pod/helloworld-python-00001-deployment-6fbdcfbccc-557g8   2/2     Running   0          37m

NAME                                      TYPE        CLUSTER-IP       EXTERNAL-IP   PORT(S)                                      AGE
service/kubernetes                        ClusterIP   10.152.183.1     <none>        443/TCP                                      63m
service/helloworld-python-00001-private   ClusterIP   10.152.183.209   <none>        80/TCP,9090/TCP,9091/TCP,8022/TCP,8012/TCP   37m
service/helloworld-python-00001           ClusterIP   10.152.183.201   <none>        80/TCP                                       37m

NAME                                                 READY   UP-TO-DATE   AVAILABLE   AGE
deployment.apps/helloworld-python-00001-deployment   0/1     0            0           37m

NAME                                                            DESIRED   CURRENT   READY   AGE
replicaset.apps/helloworld-python-00001-deployment-6fbdcfbccc   1         1         1       37m

NAME                                            URL   LATESTCREATED   LATESTREADY   READY   REASON
service.serving.knative.dev/helloworld-python                                               

NAME                                          URL   READY   REASON
route.serving.knative.dev/helloworld-python                 

NAME                                                   CONFIG NAME         K8S SERVICE NAME   GENERATION   READY   REASON   ACTUAL REPLICAS   DESIRED REPLICAS
revision.serving.knative.dev/helloworld-python-00001   helloworld-python                      1                                               

NAME                                                  LATESTCREATED   LATESTREADY   READY   REASON
configuration.serving.knative.dev/helloworld-python   

microk8s status

microk8s is running
high-availability: no
  datastore master nodes: 127.0.0.1:19001
  datastore standby nodes: none
addons:
  enabled:
    dns                  # CoreDNS
    ha-cluster           # Configure high availability on the current node
    istio                # Core Istio service mesh services
    knative              # The Knative framework on Kubernetes.
  disabled:
    ambassador           # Ambassador API Gateway and Ingress
    cilium               # SDN, fast with full network policy
    dashboard            # The Kubernetes dashboard
    fluentd              # Elasticsearch-Fluentd-Kibana logging and monitoring
    gpu                  # Automatic enablement of Nvidia CUDA
    helm                 # Helm 2 - the package manager for Kubernetes
    helm3                # Helm 3 - Kubernetes package manager
    host-access          # Allow Pods connecting to Host services smoothly
    ingress              # Ingress controller for external access
    jaeger               # Kubernetes Jaeger operator with its simple config
    kata                 # Kata Containers is a secure runtime with lightweight VMS
    keda                 # Kubernetes-based Event Driven Autoscaling
    kubeflow             # Kubeflow for easy ML deployments
    linkerd              # Linkerd is a service mesh for Kubernetes and other frameworks
    metallb              # Loadbalancer for your Kubernetes cluster
    metrics-server       # K8s Metrics Server for API access to service metrics
    multus               # Multus CNI enables attaching multiple network interfaces to pods
    openebs              # OpenEBS is the open-source storage solution for Kubernetes
    openfaas             # openfaas serverless framework
    portainer            # Portainer UI for your Kubernetes cluster
    prometheus           # Prometheus operator for monitoring and logging
    rbac                 # Role-Based Access Control for authorisation
    registry             # Private image registry exposed on localhost:32000
    storage              # Storage class; allocates storage from host directory
    traefik              # traefik Ingress controller for external access

microk8s inspect

Inspecting Certificates
Inspecting services
  Service snap.microk8s.daemon-cluster-agent is running
  Service snap.microk8s.daemon-containerd is running
  Service snap.microk8s.daemon-apiserver-kicker is running
  Service snap.microk8s.daemon-kubelite is running
  Copy service arguments to the final report tarball
Inspecting AppArmor configuration
Gathering system information
  Copy processes list to the final report tarball
  Copy snap list to the final report tarball
  Copy VM name (or none) to the final report tarball
  Copy disk usage information to the final report tarball
  Copy memory usage information to the final report tarball
  Copy server uptime to the final report tarball
  Copy current linux distribution to the final report tarball
  Copy openSSL information to the final report tarball
  Copy network configuration to the final report tarball
Inspecting kubernetes cluster
  Inspect kubernetes cluster
Inspecting juju
  Inspect Juju
Inspecting kubeflow
  Inspect Kubeflow

inspection-report-20211109_172050.tar.gz

I inspected the logs of the Knative Serving controller and found the following error:

sudo tail /var/log/pods/knative-serving_controller-788796f49d-rbpg7_00d9cc24-3b9a-4d17-b739-2936283b4498/controller/0.log

2021-11-09T17:17:40.855445652+01:00 stderr F {"severity":"INFO","timestamp":"2021-11-09T16:17:40.85527315Z","logger":"controller","caller":"configuration/configuration.go:104","message":"Revision \"helloworld-python-00001\" of configuration is not ready","commit":"c75484e","knative.dev/pod":"controller-788796f49d-rbpg7","knative.dev/controller":"knative.dev.serving.pkg.reconciler.configuration.Reconciler","knative.dev/kind":"serving.knative.dev.Configuration","knative.dev/traceid":"e99dc1e5-4c38-42b7-ba04-648143508138","knative.dev/key":"default/helloworld-python"}
2021-11-09T17:17:40.865510259+01:00 stderr F {"severity":"WARNING","timestamp":"2021-11-09T16:17:40.865348256Z","logger":"controller","caller":"configuration/reconciler.go:287","message":"Failed to update resource status","commit":"c75484e","knative.dev/pod":"controller-788796f49d-rbpg7","knative.dev/controller":"knative.dev.serving.pkg.reconciler.configuration.Reconciler","knative.dev/kind":"serving.knative.dev.Configuration","knative.dev/traceid":"e99dc1e5-4c38-42b7-ba04-648143508138","knative.dev/key":"default/helloworld-python","targetMethod":"ReconcileKind","error":"admission webhook \"webhook.serving.knative.dev\" denied the request: mutation failed: cannot decode incoming new object: json: unknown field \"subresource\""}
2021-11-09T17:17:40.865554468+01:00 stderr F {"severity":"ERROR","timestamp":"2021-11-09T16:17:40.865436299Z","logger":"controller","caller":"controller/controller.go:549","message":"Reconcile error","commit":"c75484e","knative.dev/pod":"controller-788796f49d-rbpg7","knative.dev/controller":"knative.dev.serving.pkg.reconciler.configuration.Reconciler","knative.dev/kind":"serving.knative.dev.Configuration","duration":"10.356986ms","error":"admission webhook \"webhook.serving.knative.dev\" denied the request: mutation failed: cannot decode incoming new object: json: unknown field \"subresource\"","stacktrace":"knative.dev/pkg/controller.(*Impl).handleErr\n\tknative.dev/pkg@v0.0.0-20210622173328-dd0db4b05c80/controller/controller.go:549\nknative.dev/pkg/controller.(*Impl).processNextWorkItem\n\tknative.dev/pkg@v0.0.0-20210622173328-dd0db4b05c80/controller/controller.go:532\nknative.dev/pkg/controller.(*Impl).RunContext.func3\n\tknative.dev/pkg@v0.0.0-20210622173328-dd0db4b05c80/controller/controller.go:468"}
2021-11-09T17:17:40.866220436+01:00 stderr F {"severity":"INFO","timestamp":"2021-11-09T16:17:40.866103059Z","logger":"controller.event-broadcaster","caller":"record/event.go:282","message":"Event(v1.ObjectReference{Kind:\"Configuration\", Namespace:\"default\", Name:\"helloworld-python\", UID:\"38b11bf5-251e-4a09-af7e-25907f1f0027\", APIVersion:\"serving.knative.dev/v1\", ResourceVersion:\"18330\", FieldPath:\"\"}): type: 'Warning' reason: 'UpdateFailed' Failed to update status for \"helloworld-python\": admission webhook \"webhook.serving.knative.dev\" denied the request: mutation failed: cannot decode incoming new object: json: unknown field \"subresource\"","commit":"c75484e","knative.dev/pod":"controller-788796f49d-rbpg7"}

Furthermore, I checked the logs of the failed default-domain--1-9n57t pod and found the following:

kubectl -n knative-serving  logs default-domain--1-9n57t
W1109 15:28:06.351133       1 client_config.go:614] Neither --kubeconfig nor --master was specified.  Using the inClusterConfig.  This might not work.
{"level":"fatal","ts":1636472886.6482625,"logger":"fallback.default-domain","caller":"default-domain/main.go:199","msg":"Error finding gateway address","error":"timed out waiting for the condition","stacktrace":"main.main\n\tknative.dev/serving/cmd/default-domain/main.go:199\nruntime.main\n\truntime/proc.go:204"}
balchua (Collaborator) commented Nov 16, 2021

@mcopik your error seems to be similar to this one: knative/serving#11448.
It looks like the Knative version in the addon isn't compatible with Kubernetes 1.22+.
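
A quick way to confirm the mismatch, assuming the standard deployment names used by the knative-serving manifests, is to compare the Serving controller image tag against the cluster version:

microk8s kubectl -n knative-serving get deployment controller -o jsonpath='{.spec.template.spec.containers[0].image}'
microk8s kubectl version --short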


stale bot commented Nov 22, 2022

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale bot added the inactive label Nov 22, 2022
stale bot closed this as completed Dec 22, 2022