Errors in the test logs #865

Closed
IronPan opened this issue Feb 27, 2019 · 4 comments

Comments

IronPan commented Feb 27, 2019

There are multiple errors in this test log:
https://gubernator.k8s.io/build/kubernetes-jenkins/pr-logs/pull/kubeflow_pipelines/862/kubeflow-pipeline-e2e-test/927?log#log


/home/prow/go/src/github.com/kubeflow/pipelines/test/deploy-pipeline.sh: line 50: pushd: ks_app: No such file or directory
++ ks param set pipeline apiImage gcr.io/ml-pipeline-test/0dbd45d062b4d7ce9a00605b2c71fe2b4d08f017/api:latest
level=error msg="finding app root from starting path: : unable to find ksonnet project"
++ ks param set pipeline persistenceAgentImage gcr.io/ml-pipeline-test/0dbd45d062b4d7ce9a00605b2c71fe2b4d08f017/persistenceagent:latest
level=error msg="finding app root from starting path: : unable to find ksonnet project"
++ ks param set pipeline scheduledWorkflowImage gcr.io/ml-pipeline-test/0dbd45d062b4d7ce9a00605b2c71fe2b4d08f017/scheduledworkflow:latest
level=error msg="finding app root from starting path: : unable to find ksonnet project"
++ ks param set pipeline uiImage gcr.io/ml-pipeline-test/0dbd45d062b4d7ce9a00605b2c71fe2b4d08f017/frontend:latest
level=error msg="finding app root from starting path: : unable to find ksonnet project"
++ ks apply default -c pipeline
level=error msg="finding app root from starting path: : unable to find ksonnet project"
++ popd
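
The repeated "unable to find ksonnet project" errors are all downstream of the first failure: `pushd ks_app` never succeeded, so every subsequent `ks` command ran outside the ksonnet app directory, and the script carried on regardless. A minimal sketch of guarding that step (a hypothetical excerpt, assuming bash; `${API_IMAGE}` is a placeholder, not the script's actual variable):

    # Stop immediately if the ksonnet app directory is missing, rather than
    # letting every later `ks` call fail with "unable to find ksonnet project".
    pushd ks_app || { echo "ks_app not found; aborting" >&2; exit 1; }
    ks param set pipeline apiImage "${API_IMAGE}"
    popd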

/home/prow/go/src/github.com/kubeflow/pipelines/test/deploy-kubeflow.sh: line 52: cd: e2e-85de728-27573: No such file or directory

+ source env.sh
/home/prow/go/src/github.com/kubeflow/pipelines/test/kubeflow_master/scripts/kfctl.sh: line 372: env.sh: No such file or directory
+ gcloud deployment-manager --project=ml-pipeline-test deployments delete e2e-85de728-27573-storage --quiet
Waiting for delete [operation-1551166440577-582c714ad9a2d-f99b0082-4fc5eb7f]...
........failed.
ERROR: (gcloud.deployment-manager.deployments.delete) Delete operation operation-1551166440577-582c714ad9a2d-f99b0082-4fc5eb7f failed.
Error in Operation [operation-1551166440577-582c714ad9a2d-f99b0082-4fc5eb7f]: errors:
- code: RESOURCE_ERROR
  location: /deployments/e2e-85de728-27573-storage/resources/e2e-85de728-27573-storage-metadata-store
  message: "{\"ResourceType\":\"compute.v1.disk\",\"ResourceErrorCode\":\"400\",\"\
    ResourceErrorMessage\":{\"code\":400,\"errors\":[{\"domain\":\"global\",\"message\"\
    :\"The disk resource 'projects/ml-pipeline-test/zones/us-east1-b/disks/e2e-85de728-27573-storage-metadata-store'\
    \ is already being used by 'projects/ml-pipeline-test/zones/us-east1-b/instances/gke-e2e-85de728-27573-default-pool-a4318cd5-pqdh'\"\
    ,\"reason\":\"resourceInUseByAnotherResource\"}],\"message\":\"The disk resource\
    \ 'projects/ml-pipeline-test/zones/us-east1-b/disks/e2e-85de728-27573-storage-metadata-store'\
    \ is already being used by 'projects/ml-pipeline-test/zones/us-east1-b/instances/gke-e2e-85de728-27573-default-pool-a4318cd5-pqdh'\"\
    ,\"statusMessage\":\"Bad Request\",\"requestPath\":\"https://www.googleapis.com/compute/v1/projects/ml-pipeline-test/zones/us-east1-b/disks/e2e-85de728-27573-storage-metadata-store\"\
    ,\"httpMethod\":\"DELETE\"}}"
- code: RESOURCE_ERROR
  location: /deployments/e2e-85de728-27573-storage/resources/e2e-85de728-27573-storage-artifact-store
  message: "{\"ResourceType\":\"compute.v1.disk\",\"ResourceErrorCode\":\"400\",\"\
    ResourceErrorMessage\":{\"code\":400,\"errors\":[{\"domain\":\"global\",\"message\"\
    :\"The disk resource 'projects/ml-pipeline-test/zones/us-east1-b/disks/e2e-85de728-27573-storage-artifact-store'\
    \ is already being used by 'projects/ml-pipeline-test/zones/us-east1-b/instances/gke-e2e-85de728-27573-default-pool-a4318cd5-27fb'\"\
    ,\"reason\":\"resourceInUseByAnotherResource\"}],\"message\":\"The disk resource\
    \ 'projects/ml-pipeline-test/zones/us-east1-b/disks/e2e-85de728-27573-storage-artifact-store'\
    \ is already being used by 'projects/ml-pipeline-test/zones/us-east1-b/instances/gke-e2e-85de728-27573-default-pool-a4318cd5-27fb'\"\
    ,\"statusMessage\":\"Bad Request\",\"requestPath\":\"https://www.googleapis.com/compute/v1/projects/ml-pipeline-test/zones/us-east1-b/disks/e2e-85de728-27573-storage-artifact-store\"\
    ,\"httpMethod\":\"DELETE\"}}"

We should consider failing fast on any of these errors instead of letting the scripts continue.
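
For plain bash test scripts, the usual strict-mode prologue plus an ERR trap gets that behavior; a minimal sketch (not the current scripts):

    #!/bin/bash
    # Fail fast: exit on any command error (-e), on unset variables (-u),
    # and on a failure anywhere in a pipeline (pipefail).
    set -euo pipefail
    # Report the failing line before the script exits.
    trap 'echo "error on line ${LINENO}; exiting" >&2' ERR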

vicaire commented Mar 26, 2019

Can this be closed?

IronPan commented Mar 27, 2019

/close

@k8s-ci-robot

@IronPan: Closing this issue.

In response to this:

/close

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.


Linchin pushed a commit to Linchin/pipelines that referenced this issue Apr 11, 2023
* Update optional-test-infra doc

* Move some contents back to root README

* Add doc for access control
magdalenakuhn17 pushed a commit to magdalenakuhn17/pipelines that referenced this issue Oct 22, 2023
* add tabular explainer e2e testcase

* Fix income explainer storage uri

* fix precision response