
Enabling Server Side Apply causes deployment failure #2446

Closed
kaz130 opened this issue Jun 7, 2023 · 4 comments
Labels: area/providers, area/server-side-apply, kind/bug (some behavior is incorrect or out of spec), last-applied-configuration (issues related to the last-applied-configuration annotation), resolution/fixed (this issue was fixed)

Comments

kaz130 commented Jun 7, 2023

What happened?

Updating a Deployment resource failed after enabling SSA; I got the error below.
The Deployment was originally created without SSA.

kubernetes:apps/v1:Deployment (deployment):
  error: 2 errors occurred:
   * the Kubernetes API server reported that "default/test" failed to fully initialize or become live: 'test' timed out waiting to be Ready
   * Attempted to roll forward to new ReplicaSet, but minimum number of Pods did not become live

Expected Behavior

A successful pulumi up.

Steps to reproduce

  1. Create a Deployment resource with the following Pulumi configuration, where SSA is disabled.

Pulumi.yaml

name: pulumi-ssa-test
runtime: yaml
backend:
  url: file://.
options:
  refresh: always
resources:
  K8sProvider:
    type: pulumi:providers:kubernetes
    options:
      version: "3.29.0"
    properties:
      enableServerSideApply: false
      # enableServerSideApply: true
  deployment:
    type: kubernetes:apps/v1:Deployment
    options:
      provider: ${K8sProvider}
    properties:
      metadata:
        name: test
        labels:
          app: nginx
        annotations:
          "pulumi.com/timeoutSeconds": "180"
      spec:
        replicas: 1
        selector:
          matchLabels:
            app: nginx
        template:
          metadata:
            labels:
              app: nginx
          spec:
            containers:
              - image: nginx:1.22
              # - image: nginx:1.23
                name: nginx
                ports:
                  - containerPort: 80
  2. Enable SSA and edit the manifest (nginx:1.22 -> nginx:1.23), then run pulumi up. The deployment succeeds at this point. (The exact edits are sketched right after this list.)

  3. Run pulumi up again without any changes.

  4. The second run fails with the output further below.
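
For clarity, here is a sketch of the two step-2 edits; both match the commented-out lines in the Pulumi.yaml above:

# In the K8sProvider block:
    properties:
      enableServerSideApply: true          # was: false

# In the Deployment pod template:
            containers:
              - image: nginx:1.23          # was: nginx:1.22
                name: nginx
                ports:
                  - containerPort: 80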

$ pulumi up -y
Previewing update (main):
     Type                              Name                  Plan       Info
     pulumi:pulumi:Stack               pulumi-ssa-test-main
     ├─ pulumi:providers:kubernetes    K8sProvider
 ~   └─ kubernetes:apps/v1:Deployment  deployment            update     [diff: ~spec]


Resources:
    ~ 1 to update
    2 unchanged

Updating (main):
     Type                              Name                  Status                  Info
     pulumi:pulumi:Stack               pulumi-ssa-test-main  failed                1 error
     ├─ pulumi:providers:kubernetes    K8sProvider
 ~   └─ kubernetes:apps/v1:Deployment  deployment            updating failed       [diff: ~spec]; 1 error


Diagnostics:
  pulumi:pulumi:Stack (pulumi-ssa-test-main):
    error: update failed

  kubernetes:apps/v1:Deployment (deployment):
    error: 2 errors occurred:
    	* the Kubernetes API server reported that "default/test" failed to fully initialize or become live: 'test' timed out waiting to be Ready
    	* Attempted to roll forward to new ReplicaSet, but minimum number of Pods did not become live

Resources:
    2 unchanged

Duration: 3m1s

Output of pulumi about

CLI
Version      3.69.0
Go Version   go1.20.4
Go Compiler  gc

Plugins
NAME        VERSION
kubernetes  3.29.0
yaml        unknown

Host
OS       darwin
Version  13.4
Arch     x86_64

This project is written in yaml

Backend
Name           kazuki-MBP.local
URL            file://.
User           kazuki
Organizations

No dependencies found

Pulumi locates its logs in /var/folders/gg/pkxzlyld5n1d_7z8xv1gx_dc0000gp/T/ by default
warning: Failed to get information about the current stack: No current stack

Additional context

If we remove the last-applied-configuration annotation from the Deployment resource, pulumi up succeeds.
Removing the refresh: always option also resolves the issue.
I therefore suspect that the last-applied-configuration annotation, which should not be used when SSA is enabled, is interfering with the deployment.
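
As a minimal sketch of the annotation workaround, assuming the Deployment lives in the default namespace (per the "default/test" error above): the annotation can be dropped with kubectl, where the trailing dash on the key removes it.

$ kubectl annotate deployment/test -n default \
    kubectl.kubernetes.io/last-applied-configuration-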

Contributing

Vote on this issue by adding a 👍 reaction.
To contribute a fix for this issue, leave a comment (and link to your pull request, if you've opened one already).

kaz130 added the kind/bug and needs-triage labels on Jun 7, 2023
thomas11 added the area/providers label and removed the needs-triage label on Jun 7, 2023
thomas11 (Contributor) commented Jun 7, 2023

Hi @kaz130, thank you for the detailed report! It seems this issue will be fixed once #2444 is implemented. Fortunately, you have two workarounds: removing the annotation or removing the refresh setting (sketched below).
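
A sketch of the second workaround, applied to the repro's Pulumi.yaml (drop the stack-level refresh option so updates no longer refresh state first):

name: pulumi-ssa-test
runtime: yaml
backend:
  url: file://.
# options:            # removed: refreshing before each update triggers the spurious diff
#   refresh: always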

kaz130 (Author) commented Jun 8, 2023

Thanks for your reply, @thomas11.

I have two follow-up questions:

  • When will the fix be completed?
  • Are there any plans to backport it to v3?

thomas11 (Contributor) commented Jun 8, 2023

Hi @kaz130, there is already a draft PR (#2445), so it will probably land rather soon. However, we're not planning to backport it to v3, since it's a somewhat major change.

lblackstone (Member) commented
I was not able to reproduce the error with the v4.0.0-rc.1 provider, so I'm going to mark this as fixed. The official release is expected in the next week or so.
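
For anyone who wants to verify the fix before the official release, a sketch of pinning the repro's provider block to the release candidate (assuming the v3 provider options carry over unchanged):

  K8sProvider:
    type: pulumi:providers:kubernetes
    options:
      version: "4.0.0-rc.1"    # pre-release containing the fix; the repro used "3.29.0"
    properties:
      enableServerSideApply: true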

lblackstone added the last-applied-configuration label on Jul 18, 2023