
VSCode CloudCode Run/debug just shows error "Cannot read property 'artifacts' of undefined #398

Closed
samdoeswork opened this issue Mar 10, 2021 · 10 comments
Assignees
Labels
area/skaffold in-next-release kind/bug Something isn't working priority/p1 Needs action soon

Comments

@samdoeswork

samdoeswork commented Mar 10, 2021

Version information
Cloud Code Extension version: 1.9.0

VSCode version: 1.51.1

OS: Windows 10

Cloud SDK:

Skaffold: v1.20.0

Kubectl:
Client Version: version.Info{Major:"1", Minor:"17+", GitVersion:"v1.17.17-dispatcher", GitCommit:"a39a896b5018d0c800124a36757433c660fd0880", GitTreeState:"clean", BuildDate:"2021-01-28T22:25:50Z", GoVersion:"go1.13.9", Compiler:"gc", Platform:"windows/amd64"}
Server Version: version.Info{Major:"1", Minor:"19", GitVersion:"v1.19.3", GitCommit:"1e11e4a2108024935ecfcb2912226cedeafd99df", GitTreeState:"clean", BuildDate:"2020-10-14T12:41:49Z", GoVersion:"go1.15.2", Compiler:"gc", Platform:"linux/amd64"}

Description:
Cloud Code detects a skaffold.yaml file and creates a Run/Debug config for me. But when I try to run it, an extension error toast appears in the bottom right with the message "Cannot read property 'artifacts' of undefined".

There doesn't seem to be additional information in any of the output channels or the debug console.

Repro steps:
For me, just run/debug with the default generated config. Skaffold itself works when run in my workspace root with:
skaffold dev --profile=dev

This is the configuration created:
{
  "name": "Kubernetes: Run/Debug - dev",
  "type": "cloudcode.kubernetes",
  "request": "launch",
  "skaffoldConfig": "${workspaceFolder}\skaffold.yaml",
  "profile": "dev",
  "watch": true,
  "cleanUp": true,
  "portForward": true
}

Here's the full skaffold.yaml in case it's useful (again - running the command directly works fine).

apiVersion: skaffold/v2beta1
kind: Config
metadata:
  name: jaf
profiles:
- name: dev
  build:
    artifacts:
    - image: gcr.io/justaddfeatures-203400/mainbatch_dev
      context: k8s/mainBatch
    - image: gcr.io/justaddfeatures-203400/sendbatch_dev
      context: k8s/sendBatch
    tagPolicy:
      sha256: {}
  deploy:
    kubeContext: docker-desktop
    statusCheckDeadlineSeconds: 300
    kubectl:
      manifests:
      - k8s/mainBatch/kube/mainBatch_dev.yaml
      - k8s/sendBatch/kube/sendBatch_dev.yaml
- name: test
  build:
    artifacts:
    - image: gcr.io/justaddfeatures-203400/mainbatch_test
      context: k8s/mainBatch
    - image: gcr.io/justaddfeatures-203400/sendbatch_test
      context: k8s/sendBatch
    tagPolicy:
      sha256: {}
  deploy:
    kubeContext: gke_justaddfeatures-203400_europe-west1-d_im-main
    statusCheckDeadlineSeconds: 300
    kubectl:
      manifests:
      - k8s/mainBatch/kube/mainBatch_test.yaml
      - k8s/sendBatch/kube/sendBatch_test.yaml
- name: live
  build:
    artifacts:
    - image: gcr.io/justaddfeatures-203400/mainbatch_live
      context: k8s/mainBatch
    - image: gcr.io/justaddfeatures-203400/sendbatch_live
      context: k8s/sendBatch
    tagPolicy:
      sha256: {}
  deploy:
    statusCheckDeadlineSeconds: 300
    kubeContext: gke_justaddfeatures-203400_europe-west1-d_im-main
    kubectl:
      manifests:
      - k8s/mainBatch/kube/mainBatch_live_leader.yaml
      - k8s/mainBatch/kube/mainBatch_live_replica.yaml
      - k8s/sendBatch/kube/sendBatch_live.yaml
@quoctruong
Contributor

@samdoeswork Do you mind sharing the logs from the output channel for the launch as well?

@quoctruong
Contributor

@samdoeswork I suspect this is because we are trying to detect if there is an artifacts property under the default profile. Can you try adding one and see if that resolves the error?

@dhodun

dhodun commented Mar 22, 2021

Ran into this issue as well.

In our case, we have no artifacts defined in the root skaffold.yaml; all the artifacts are defined in downstream, required skaffold.yaml files, all using the default profile. I fixed the issue by adding a dummy container to the root skaffold.yaml.

@dhodun

dhodun commented Mar 22, 2021

For reference, here is the root skaffold.yaml without the dummy container that fixed it:

# Master Skaffold
apiVersion: skaffold/v2beta12
kind: Config
metadata:
  name: microservices
requires:
- path: ./microservices/authentication
  configs: [authentication]
  activeProfiles: 
  - name: custom
    activatedBy: [custom] 

- path: ./microservices/dashboard
  configs: [dashboard]
  activeProfiles: 
  - name: custom
    activatedBy: [custom] 

- path: ./microservices/messages
  configs: [messages]
  activeProfiles: 
  - name: custom
    activatedBy: [custom] 

- path: ./microservices/notes
  configs: [notes]
  activeProfiles: 
  - name: custom
    activatedBy: [custom] 

- path: ./microservices/utils
  configs: [utils]
  activeProfiles: 
  - name: custom
    activatedBy: [custom] 

profiles:
- name: custom
- name: dev

@sivakku sivakku added in-next-release and removed question Further information is requested labels Mar 30, 2021
@inostia

inostia commented Mar 31, 2021

@dhodun what exactly did you do to fix this error? I'm encountering the same thing and looking to add this "dummy" container.

This is my single skaffold.yaml:

apiVersion: skaffold/v2beta12
kind: Config
metadata:
  name: skaffold-config
profiles:
- name: dev
  build:
    artifacts:
    - image: portal
      docker:
        dockerfile: Dockerfile
  deploy:
    kustomize:
      paths:
      - kubernetes/overlays/local
- name: prod
  deploy:
    kustomize:
      paths:
      - kubernetes/overlays/prod

edit: nevermind, I see what you mean. I moved profiles[0].build into the root of the skaffold config and it seems to be building without the above error.
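
For anyone following along, that fix amounts to roughly this (a sketch based on the config posted above, with the dev profile's build block hoisted to the top level so the root config has an artifacts list):

```yaml
apiVersion: skaffold/v2beta12
kind: Config
metadata:
  name: skaffold-config
# build moved out of profiles[0] to the root, so Cloud Code finds
# an `artifacts` list in the default (no-profile) config
build:
  artifacts:
  - image: portal
    docker:
      dockerfile: Dockerfile
profiles:
- name: dev
  deploy:
    kustomize:
      paths:
      - kubernetes/overlays/local
- name: prod
  deploy:
    kustomize:
      paths:
      - kubernetes/overlays/prod
```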

@ChaseMor

ChaseMor commented Apr 1, 2021

Yeah, just to be clear in case this is troubling anyone else: the bug happens when there are no build artifacts in the root skaffold config, which is most often the case when builds are specified only inside profiles. This bug will likely be fixed in our next release, but until then you should be able to bypass it by specifying any build artifact in the root skaffold config, even a "dummy" one that isn't used at all. Note: if you're using docker or buildpacks, this "dummy" build artifact should be a docker or buildpacks artifact, respectively.
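
Concretely, a minimal root config with such a placeholder artifact might look something like the following (the image name, context folder, and one-line Dockerfile are illustrative, not an official recommendation):

```yaml
apiVersion: skaffold/v2beta12
kind: Config
# Placeholder artifact so the root (no-profile) config has a
# non-empty build.artifacts list; it gets built but is never
# used by the rest of the app.
build:
  artifacts:
  - image: dummy
    context: dummy        # folder containing only a Dockerfile with: FROM alpine
    docker:
      dockerfile: Dockerfile
profiles:
- name: dev
  # real per-profile build/deploy config goes here
```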

@dhodun

dhodun commented Apr 1, 2021

Yeah, you need some artifact under a build section at the root.

I added

build:
  artifacts:
  - image: dummy

Then a 'dummy' folder with a Dockerfile containing just "FROM alpine", and that was it. A super tiny container that gets built and doesn't interact with the rest of the app.

@j-windsor j-windsor added kind/bug Something isn't working priority/p1 Needs action soon labels Apr 2, 2021
@sivakku
Contributor

sivakku commented Apr 7, 2021

The fix for this is going into our April month-end release. We will let you know once we have an insider build ready with the fix.

@sivakku
Contributor

sivakku commented Apr 14, 2021

The fix for this is going into our April month-end release. We will let you know once we have an insider build ready with the fix.

@sivakku
Contributor

sivakku commented Apr 23, 2021

Closing this one, as the fix is part of the April month-end release.

@sivakku sivakku closed this as completed Apr 23, 2021