
kubernetes-e2e-gce-multizone: broken test run #34912

Closed
k8s-github-robot opened this issue Oct 16, 2016 · 18 comments
Labels
kind/flake: Categorizes issue or PR as related to a flaky test.
priority/backlog: Higher priority than priority/awaiting-more-evidence.
Comments

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/4442/

Run so broken it didn't make JUnit output!

@k8s-github-robot k8s-github-robot added kind/flake Categorizes issue or PR as related to a flaky test. priority/backlog Higher priority than priority/awaiting-more-evidence. area/test-infra labels Oct 16, 2016
@spxtr spxtr assigned ghost and unassigned spxtr Oct 17, 2016
@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/4479/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/4483/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/4554/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/4560/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/4601/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/4753/

Run so broken it didn't make JUnit output!

@k8s-github-robot

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/4777/

Run so broken it didn't make JUnit output!

@k8s-github-robot

https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/4822/

Multiple broken tests:

Failed: [k8s.io] Kubectl client [k8s.io] Kubectl create quota should reject quota with invalid scopes {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:142
Oct 28 00:30:36.903: All nodes should be ready after test, Not ready nodes: []*api.Node{(*api.Node)(0xc4204fa278), (*api.Node)(0xc4204fa4f0), (*api.Node)(0xc4204fa768), (*api.Node)(0xc4204fa9e0), (*api.Node)(0xc4204fac58), (*api.Node)(0xc4204faed0), (*api.Node)(0xc4204fb148), (*api.Node)(0xc4204fb3c0), (*api.Node)(0xc4204fb638)}
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:438

Failed: [k8s.io] Proxy version v1 should proxy to cadvisor using proxy subresource {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:142
Oct 28 00:05:51.939: All nodes should be ready after test, Not ready nodes: []*api.Node{(*api.Node)(0xc420642de0), (*api.Node)(0xc420643058), (*api.Node)(0xc4206432d0), (*api.Node)(0xc420643548), (*api.Node)(0xc4206437c0), (*api.Node)(0xc420643a38)}
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:438

Failed: [k8s.io] Job should scale a job up {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/job.go:137
Expected error:
    <*errors.errorString | 0xc4203aced0>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/job.go:136

Issues about this test specifically: #29511 #29987 #30238

Failed: [k8s.io] KubeletManagedEtcHosts should test kubelet managed /etc/hosts file {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/kubelet_etc_hosts.go:54
Expected error:
    <*errors.errorString | 0xc4203d0f00>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/pods.go:66

Issues about this test specifically: #27023 #34604

Failed: [k8s.io] ReplicaSet should serve a basic image on each replica with a public image [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/replica_set.go:40
Expected error:
    <*errors.errorString | 0xc4203d0f00>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/replica_set.go:109

Issues about this test specifically: #30981

Failed: [k8s.io] ConfigMap should be consumable from pods in volume with mappings [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/configmap.go:54
Expected error:
    <*errors.errorString | 0xc420a94dd0>: {
        s: "expected pod \"pod-configmaps-2cdd1aeb-9cde-11e6-9e46-0242ac110002\" success: gave up waiting for pod 'pod-configmaps-2cdd1aeb-9cde-11e6-9e46-0242ac110002' to be 'success or failure' after 5m0s",
    }
    expected pod "pod-configmaps-2cdd1aeb-9cde-11e6-9e46-0242ac110002" success: gave up waiting for pod 'pod-configmaps-2cdd1aeb-9cde-11e6-9e46-0242ac110002' to be 'success or failure' after 5m0s
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/util.go:2128

Issues about this test specifically: #32949

Failed: [k8s.io] EmptyDir volumes should support (non-root,0777,tmpfs) [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/empty_dir.go:89
Expected error:
    <*errors.errorString | 0xc420addef0>: {
        s: "expected pod \"pod-90e42b83-9cdc-11e6-87c5-0242ac110002\" success: gave up waiting for pod 'pod-90e42b83-9cdc-11e6-87c5-0242ac110002' to be 'success or failure' after 5m0s",
    }
    expected pod "pod-90e42b83-9cdc-11e6-87c5-0242ac110002" success: gave up waiting for pod 'pod-90e42b83-9cdc-11e6-87c5-0242ac110002' to be 'success or failure' after 5m0s
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/util.go:2128

Issues about this test specifically: #30851

Failed: [k8s.io] Services should be able to up and down services {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/service.go:353
Expected error:
    <*errors.errorString | 0xc420f5eb90>: {
        s: "Only 0 pods started out of 3",
    }
    Only 0 pods started out of 3
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/service.go:312

Issues about this test specifically: #26128 #26685 #33408

Failed: [k8s.io] Deployment RollingUpdateDeployment should scale up and down in the right order {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/deployment.go:67
Expected error:
    <*errors.errorString | 0xc4206ee010>: {
        s: "failed to wait for pods running: [timed out waiting for the condition]",
    }
    failed to wait for pods running: [timed out waiting for the condition]
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/deployment.go:391

Issues about this test specifically: #27232

Failed: [k8s.io] DisruptionController should create a PodDisruptionBudget {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:142
Oct 28 00:31:35.057: All nodes should be ready after test, Not ready nodes: []*api.Node{(*api.Node)(0xc4204b6678), (*api.Node)(0xc4204b68f0), (*api.Node)(0xc4204b6b68), (*api.Node)(0xc4204b6de0), (*api.Node)(0xc4204b7058), (*api.Node)(0xc4204b72d0), (*api.Node)(0xc4204b7548), (*api.Node)(0xc4204b77c0), (*api.Node)(0xc4204b7a38)}
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:438

Failed: [k8s.io] Downward API should provide pod name and namespace as env vars [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/downward_api.go:62
Expected error:
    <*errors.errorString | 0xc42096edf0>: {
        s: "expected pod \"downward-api-5c136ecc-9ce8-11e6-afd4-0242ac110002\" success: gave up waiting for pod 'downward-api-5c136ecc-9ce8-11e6-afd4-0242ac110002' to be 'success or failure' after 5m0s",
    }
    expected pod "downward-api-5c136ecc-9ce8-11e6-afd4-0242ac110002" success: gave up waiting for pod 'downward-api-5c136ecc-9ce8-11e6-afd4-0242ac110002' to be 'success or failure' after 5m0s
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/util.go:2128

Failed: [k8s.io] Services should serve a basic endpoint from pods [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/service.go:166
Oct 28 00:03:58.325: Timed out waiting for service endpoint-test2 in namespace e2e-tests-services-t6zdo to expose endpoints map[pod1:[80]] (1m0s elapsed)
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/service.go:1533

Issues about this test specifically: #26678 #29318
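
This failure is the endpoints-propagation check: the test polls until the endpoints observed for the service equal the expected pod-name to ports mapping (here, map[pod1:[80]]). A minimal sketch of just the comparison step, using hypothetical names rather than the real service.go helper:

```go
package main

import (
	"fmt"
	"reflect"
	"sort"
)

// endpointsMatch is a hypothetical sketch of the equality check behind
// this failure: compare the expected pod-name -> ports map against what
// the service currently exposes, ignoring port order.
func endpointsMatch(expected, observed map[string][]int) bool {
	for _, ports := range expected {
		sort.Ints(ports)
	}
	for _, ports := range observed {
		sort.Ints(ports)
	}
	return reflect.DeepEqual(expected, observed)
}

func main() {
	want := map[string][]int{"pod1": {80}}
	// No endpoints registered yet: the test keeps polling (and eventually
	// times out, as it did above).
	fmt.Println(endpointsMatch(want, map[string][]int{}))
	// pod1 exposing port 80: the condition is satisfied.
	fmt.Println(endpointsMatch(want, map[string][]int{"pod1": {80}}))
}
```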

Failed: [k8s.io] Pods should support remote command execution over websockets {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/pods.go:507
Expected error:
    <*errors.errorString | 0xc4203ad7b0>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/pods.go:66

Failed: [k8s.io] EmptyDir volumes should support (non-root,0666,default) [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/empty_dir.go:113
Expected error:
    <*errors.errorString | 0xc4209f4080>: {
        s: "expected pod \"pod-0b2f47f1-9cdb-11e6-b533-0242ac110002\" success: gave up waiting for pod 'pod-0b2f47f1-9cdb-11e6-b533-0242ac110002' to be 'success or failure' after 5m0s",
    }
    expected pod "pod-0b2f47f1-9cdb-11e6-b533-0242ac110002" success: gave up waiting for pod 'pod-0b2f47f1-9cdb-11e6-b533-0242ac110002' to be 'success or failure' after 5m0s
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/util.go:2128

Issues about this test specifically: #34226

Failed: [k8s.io] Kubectl alpha client [k8s.io] Kubectl run ScheduledJob should create a ScheduledJob {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:142
Oct 28 00:51:39.026: All nodes should be ready after test, Not ready nodes: []*api.Node{(*api.Node)(0xc4204d2678), (*api.Node)(0xc4204d28f0), (*api.Node)(0xc4204d2de0), (*api.Node)(0xc4204d3058), (*api.Node)(0xc4204d32d0), (*api.Node)(0xc4204d3548), (*api.Node)(0xc4204d37c0), (*api.Node)(0xc4204d3a38)}
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:438

Failed: [k8s.io] Deployment deployment should support rollover {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/deployment.go:76
Expected error:
    <*errors.errorString | 0xc420bd4000>: {
        s: "failed to wait for pods running: [timed out waiting for the condition timed out waiting for the condition timed out waiting for the condition timed out waiting for the condition]",
    }
    failed to wait for pods running: [timed out waiting for the condition timed out waiting for the condition timed out waiting for the condition timed out waiting for the condition]
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/deployment.go:567

Issues about this test specifically: #26509 #26834 #29780 #35355

Failed: [k8s.io] Secrets should be consumable from pods in volume with mappings [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/secrets.go:44
Expected error:
    <*errors.errorString | 0xc420697940>: {
        s: "expected pod \"pod-secrets-25c1c5be-9ce8-11e6-aa24-0242ac110002\" success: gave up waiting for pod 'pod-secrets-25c1c5be-9ce8-11e6-aa24-0242ac110002' to be 'success or failure' after 5m0s",
    }
    expected pod "pod-secrets-25c1c5be-9ce8-11e6-aa24-0242ac110002" success: gave up waiting for pod 'pod-secrets-25c1c5be-9ce8-11e6-aa24-0242ac110002' to be 'success or failure' after 5m0s
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/util.go:2128

Failed: [k8s.io] Downward API volume should provide container's cpu limit {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/downwardapi_volume.go:147
Expected error:
    <*errors.errorString | 0xc4209d7530>: {
        s: "expected pod \"downwardapi-volume-baeb815a-9ce6-11e6-a745-0242ac110002\" success: gave up waiting for pod 'downwardapi-volume-baeb815a-9ce6-11e6-a745-0242ac110002' to be 'success or failure' after 5m0s",
    }
    expected pod "downwardapi-volume-baeb815a-9ce6-11e6-a745-0242ac110002" success: gave up waiting for pod 'downwardapi-volume-baeb815a-9ce6-11e6-a745-0242ac110002' to be 'success or failure' after 5m0s
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/util.go:2128

Failed: [k8s.io] Deployment overlapping deployment should not fight with each other {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:142
Oct 28 00:02:12.261: All nodes should be ready after test, Not ready nodes: []*api.Node{(*api.Node)(0xc420b90d78), (*api.Node)(0xc420b914e0), (*api.Node)(0xc420b919d0), (*api.Node)(0xc420b91c48), (*api.Node)(0xc420b91ec0), (*api.Node)(0xc420b92138)}
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:438

Issues about this test specifically: #31502 #32947

Failed: [k8s.io] V1Job should delete a job {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/batch_v1_jobs.go:189
Expected error:
    <*errors.errorString | 0xc4203a4bc0>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/batch_v1_jobs.go:176

Failed: [k8s.io] Services should prevent NodePort collisions {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:142
Oct 28 01:15:42.232: All nodes should be ready after test, Not ready nodes: []*api.Node{(*api.Node)(0xc42149e678), (*api.Node)(0xc42149e8f0), (*api.Node)(0xc42149eb68), (*api.Node)(0xc42149ede0), (*api.Node)(0xc42149f058), (*api.Node)(0xc42149f2d0), (*api.Node)(0xc42149f548), (*api.Node)(0xc42149f7c0), (*api.Node)(0xc42149fa38)}
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:438

Issues about this test specifically: #31575 #32756

Failed: [k8s.io] Docker Containers should use the image defaults if command and args are blank [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/docker_containers.go:34
Expected error:
    <*errors.errorString | 0xc420b8b340>: {
        s: "expected pod \"client-containers-0837f60c-9cdf-11e6-98b4-0242ac110002\" success: gave up waiting for pod 'client-containers-0837f60c-9cdf-11e6-98b4-0242ac110002' to be 'success or failure' after 5m0s",
    }
    expected pod "client-containers-0837f60c-9cdf-11e6-98b4-0242ac110002" success: gave up waiting for pod 'client-containers-0837f60c-9cdf-11e6-98b4-0242ac110002' to be 'success or failure' after 5m0s
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/util.go:2128

Issues about this test specifically: #34520

Failed: [k8s.io] Job should run a job to completion when tasks succeed {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/job.go:59
Expected error:
    <*errors.errorString | 0xc420430e80>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/job.go:58

Issues about this test specifically: #31938

Failed: [k8s.io] Kubectl client [k8s.io] Kubectl run rc should create an rc from an image [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:142
Oct 28 00:29:51.694: All nodes should be ready after test, Not ready nodes: []*api.Node{(*api.Node)(0xc420e5e278), (*api.Node)(0xc420e5e4f0), (*api.Node)(0xc420e5e768), (*api.Node)(0xc420e5e9e0), (*api.Node)(0xc420e5ec58), (*api.Node)(0xc420e5eed0), (*api.Node)(0xc420e5f148), (*api.Node)(0xc420e5f3c0), (*api.Node)(0xc420e5f638)}
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:438

Issues about this test specifically: #28507 #29315 #35595

Failed: [k8s.io] ConfigMap should be consumable in multiple volumes in the same pod {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/configmap.go:264
Expected error:
    <*errors.errorString | 0xc420a52020>: {
        s: "expected pod \"pod-configmaps-825fc470-9ce4-11e6-87c5-0242ac110002\" success: gave up waiting for pod 'pod-configmaps-825fc470-9ce4-11e6-87c5-0242ac110002' to be 'success or failure' after 5m0s",
    }
    expected pod "pod-configmaps-825fc470-9ce4-11e6-87c5-0242ac110002" success: gave up waiting for pod 'pod-configmaps-825fc470-9ce4-11e6-87c5-0242ac110002' to be 'success or failure' after 5m0s
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/util.go:2128

Issues about this test specifically: #29751 #30430

Failed: [k8s.io] ReplicationController should serve a basic image on each replica with a public image [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/rc.go:38
Expected error:
    <*errors.errorString | 0xc4203d0f00>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/rc.go:108

Issues about this test specifically: #26870

Failed: [k8s.io] ServiceAccounts should mount an API token into pods [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/service_accounts.go:240
Expected error:
    <*errors.errorString | 0xc420f35610>: {
        s: "expected pod \"\" success: gave up waiting for pod 'pod-service-account-912ec59f-9cdc-11e6-8e8b-0242ac110002-afpho' to be 'success or failure' after 5m0s",
    }
    expected pod "" success: gave up waiting for pod 'pod-service-account-912ec59f-9cdc-11e6-8e8b-0242ac110002-afpho' to be 'success or failure' after 5m0s
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/util.go:2128

Failed: [k8s.io] Kubectl client [k8s.io] Kubectl version should check is all data is printed [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:142
Oct 28 00:21:49.718: All nodes should be ready after test, Not ready nodes: []*api.Node{(*api.Node)(0xc4204ba4f0), (*api.Node)(0xc4204ba768), (*api.Node)(0xc4204ba9e0), (*api.Node)(0xc4204bac58), (*api.Node)(0xc4204baed0), (*api.Node)(0xc4204bb148), (*api.Node)(0xc4204bb3c0), (*api.Node)(0xc4204bb638)}
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:438

Issues about this test specifically: #29050

Failed: [k8s.io] Probing container should not be restarted with a exec "cat /tmp/health" liveness probe [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/container_probe.go:148
starting pod liveness-exec in namespace e2e-tests-container-probe-dguh7
Expected error:
    <*errors.errorString | 0xc4203c3840>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/container_probe.go:335

Failed: [k8s.io] ResourceQuota should verify ResourceQuota with terminating scopes. {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:142
Oct 28 00:36:55.331: All nodes should be ready after test, Not ready nodes: []*api.Node{(*api.Node)(0xc420971b78), (*api.Node)(0xc420971df0), (*api.Node)(0xc420972068), (*api.Node)(0xc4209722e0), (*api.Node)(0xc420972558), (*api.Node)(0xc4209727d0), (*api.Node)(0xc420972a48), (*api.Node)(0xc420972cc0), (*api.Node)(0xc420972f38)}
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:438

Issues about this test specifically: #31158 #34303

Failed: [k8s.io] Job should run a job to completion when tasks sometimes fail and are locally restarted {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/job.go:78
Expected error:
    <*errors.errorString | 0xc42043f2b0>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/job.go:77

Failed: [k8s.io] Secrets should be consumable from pods in volume with mappings and Item Mode set [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/secrets.go:49
Expected error:
    <*errors.errorString | 0xc420a7e0d0>: {
        s: "expected pod \"pod-secrets-01ee9a5b-9ce7-11e6-8e8b-0242ac110002\" success: gave up waiting for pod 'pod-secrets-01ee9a5b-9ce7-11e6-8e8b-0242ac110002' to be 'success or failure' after 5m0s",
    }
    expected pod "pod-secrets-01ee9a5b-9ce7-11e6-8e8b-0242ac110002" success: gave up waiting for pod 'pod-secrets-01ee9a5b-9ce7-11e6-8e8b-0242ac110002' to be 'success or failure' after 5m0s
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/util.go:2128

Failed: [k8s.io] Sysctls should reject invalid sysctls {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:142
Oct 28 00:47:53.634: All nodes should be ready after test, Not ready nodes: []*api.Node{(*api.Node)(0xc420a8f478), (*api.Node)(0xc420a8f6f0), (*api.Node)(0xc420a8fbe0), (*api.Node)(0xc420a8fe58), (*api.Node)(0xc420a900d0), (*api.Node)(0xc420a90348), (*api.Node)(0xc420a905c0), (*api.Node)(0xc420a90838)}
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:438

Failed: [k8s.io] Kubectl client [k8s.io] Kubectl apply should apply a new configuration to an existing RC {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:142
Oct 28 00:21:51.042: All nodes should be ready after test, Not ready nodes: []*api.Node{(*api.Node)(0xc420c4d6f0), (*api.Node)(0xc420c4d968), (*api.Node)(0xc420c4dbe0), (*api.Node)(0xc420c4de58), (*api.Node)(0xc420c4e0d0), (*api.Node)(0xc420c4e348), (*api.Node)(0xc420c4e5c0), (*api.Node)(0xc420c4e838)}
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:438

Issues about this test specifically: #27524 #32057

Failed: [k8s.io] DisruptionController should update PodDisruptionBudget status {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/disruption.go:77
Waiting for pods in namespace "e2e-tests-disruption-uyv1y" to be ready
Expected error:
    <*errors.errorString | 0xc4203a4bc0>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/disruption.go:248

Issues about this test specifically: #34119

Failed: [k8s.io] EmptyDir volumes should support (non-root,0644,tmpfs) [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/empty_dir.go:81
Expected error:
    <*errors.errorString | 0xc420d3e6b0>: {
        s: "expected pod \"pod-033122ce-9ce1-11e6-883c-0242ac110002\" success: gave up waiting for pod 'pod-033122ce-9ce1-11e6-883c-0242ac110002' to be 'success or failure' after 5m0s",
    }
    expected pod "pod-033122ce-9ce1-11e6-883c-0242ac110002" success: gave up waiting for pod 'pod-033122ce-9ce1-11e6-883c-0242ac110002' to be 'success or failure' after 5m0s
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/util.go:2128

Issues about this test specifically: #29224 #32008

Failed: [k8s.io] Multi-AZ Clusters should spread the pods of a replication controller across zones {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/ubernetes_lite.go:57
Pods were not evenly spread across zones.  0 in one zone and 4 in another zone
Expected
    <int>: 0
to be ~
    <int>: 4
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/ubernetes_lite.go:185

Issues about this test specifically: #34247
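
Unlike the timeouts elsewhere in this run, this failure is a concrete scheduling assertion: replicas of a replication controller should land in every zone in roughly equal numbers, and 0 pods in one zone versus 4 in another violates that. A hedged sketch of the spread check, with hypothetical names (not the actual ubernetes_lite.go code, which asserts the counts are numerically close):

```go
package main

import "fmt"

// evenlySpread reports whether per-zone pod counts are balanced: the
// smallest count must be within one of the largest. This is a sketch of
// the assertion, not the framework's implementation.
func evenlySpread(countsByZone map[string]int) bool {
	first := true
	min, max := 0, 0
	for _, n := range countsByZone {
		if first {
			min, max = n, n
			first = false
			continue
		}
		if n < min {
			min = n
		}
		if n > max {
			max = n
		}
	}
	return max-min <= 1
}

func main() {
	// The run above saw 0 pods in one zone and 4 in another.
	fmt.Println(evenlySpread(map[string]int{"us-central1-a": 0, "us-central1-b": 4})) // prints: false
}
```

A failure here usually means the pods never scheduled at all in one zone (consistent with the "Only 0 pods started" and not-ready-node errors elsewhere in this run), rather than a genuine spreading bug.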

Failed: [k8s.io] Deployment deployment should label adopted RSs and pods {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/deployment.go:88
Expected error:
    <*errors.errorString | 0xc420a6c000>: {
        s: "failed to wait for pods running: [timed out waiting for the condition timed out waiting for the condition timed out waiting for the condition]",
    }
    failed to wait for pods running: [timed out waiting for the condition timed out waiting for the condition timed out waiting for the condition]
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/deployment.go:966

Issues about this test specifically: #29629

Failed: [k8s.io] Sysctls should not launch unsafe, but not explicitly enabled sysctls on the node {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/sysctl.go:232
Expected error:
    <*errors.errorString | 0xc4203d1080>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/sysctl.go:224

Failed: [k8s.io] Downward API volume should set mode on item file [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/downwardapi_volume.go:68
Expected error:
    <*errors.errorString | 0xc420ceb350>: {
        s: "expected pod \"downwardapi-volume-f4f28620-9cda-11e6-a51a-0242ac110002\" success: pod \"downwardapi-volume-f4f28620-9cda-11e6-a51a-0242ac110002\" failed with status: {Phase:Failed Conditions:[{Type:Initialized Status:True LastProbeTime:0001-01-01 00:00:00 +0000 UTC LastTransitionTime:2016-10-27 23:51:35 -0700 PDT Reason: Message:} {Type:Ready Status:False LastProbeTime:0001-01-01 00:00:00 +0000 UTC LastTransitionTime:2016-10-27 23:51:35 -0700 PDT Reason:ContainersNotReady Message:containers with unready status: [client-container]} {Type:PodScheduled Status:True LastProbeTime:0001-01-01 00:00:00 +0000 UTC LastTransitionTime:2016-10-27 23:51:33 -0700 PDT Reason: Message:}] Message: Reason: HostIP:10.240.0.6 PodIP: StartTime:2016-10-27 23:51:35 -0700 PDT InitContainerStatuses:[] ContainerStatuses:[{Name:client-container State:{Waiting:<nil> Running:<nil> Terminated:0xc420f2f880} LastTerminationState:{Waiting:<nil> Running:<nil> Terminated:<nil>} Ready:false RestartCount:0 Image:gcr.io/google_containers/mounttest:0.7 ImageID:docker://sha256:7d5bba82296e1fb44be5a07e3965659130433a2a63c9175f59966fb18c4d6029 ContainerID:docker://572efd8c2665b27273de62939a75ce3916e05404b435eb22639919eb51ea5b09}]}",
    }
    expected pod "downwardapi-volume-f4f28620-9cda-11e6-a51a-0242ac110002" success: pod "downwardapi-volume-f4f28620-9cda-11e6-a51a-0242ac110002" failed with status: {Phase:Failed Conditions:[{Type:Initialized Status:True LastProbeTime:0001-01-01 00:00:00 +0000 UTC LastTransitionTime:2016-10-27 23:51:35 -0700 PDT Reason: Message:} {Type:Ready Status:False LastProbeTime:0001-01-01 00:00:00 +0000 UTC LastTransitionTime:2016-10-27 23:51:35 -0700 PDT Reason:ContainersNotReady Message:containers with unready status: [client-container]} {Type:PodScheduled Status:True LastProbeTime:0001-01-01 00:00:00 +0000 UTC LastTransitionTime:2016-10-27 23:51:33 -0700 PDT Reason: Message:}] Message: Reason: HostIP:10.240.0.6 PodIP: StartTime:2016-10-27 23:51:35 -0700 PDT InitContainerStatuses:[] ContainerStatuses:[{Name:client-container State:{Waiting:<nil> Running:<nil> Terminated:0xc420f2f880} LastTerminationState:{Waiting:<nil> Running:<nil> Terminated:<nil>} Ready:false RestartCount:0 Image:gcr.io/google_containers/mounttest:0.7 ImageID:docker://sha256:7d5bba82296e1fb44be5a07e3965659130433a2a63c9175f59966fb18c4d6029 ContainerID:docker://572efd8c2665b27273de62939a75ce3916e05404b435eb22639919eb51ea5b09}]}
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/util.go:2128

Failed: [k8s.io] Probing container should be restarted with a /healthz http liveness probe [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/container_probe.go:176
Oct 27 23:53:47.057: pod e2e-tests-container-probe-2nhwl/liveness-http - expected number of restarts: 1, found restarts: 0
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/container_probe.go:374

Failed: [k8s.io] ConfigMap should be consumable from pods in volume with mappings as non-root [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/configmap.go:63
Expected error:
    <*errors.errorString | 0xc420add3c0>: {
        s: "expected pod \"pod-configmaps-4048dcde-9cdf-11e6-87c5-0242ac110002\" success: gave up waiting for pod 'pod-configmaps-4048dcde-9cdf-11e6-87c5-0242ac110002' to be 'success or failure' after 5m0s",
    }
    expected pod "pod-configmaps-4048dcde-9cdf-11e6-87c5-0242ac110002" success: gave up waiting for pod 'pod-configmaps-4048dcde-9cdf-11e6-87c5-0242ac110002' to be 'success or failure' after 5m0s
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/util.go:2128

Failed: [k8s.io] Proxy version v1 should proxy through a service and a pod [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/proxy.go:272
Expected error:
    <*errors.errorString | 0xc420f0da40>: {
        s: "Only 0 pods started out of 1",
    }
    Only 0 pods started out of 1
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/proxy.go:148

Issues about this test specifically: #26164 #26210 #33998

Failed: [k8s.io] Deployment RecreateDeployment should delete old pods and create new ones {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/deployment.go:70
Expected error:
    <*errors.errorString | 0xc420e70000>: {
        s: "failed to wait for pods running: [timed out waiting for the condition timed out waiting for the condition timed out waiting for the condition]",
    }
    failed to wait for pods running: [timed out waiting for the condition timed out waiting for the condition timed out waiting for the condition]
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/deployment.go:443

Issues about this test specifically: #29197

Failed: [k8s.io] Docker Containers should be able to override the image's default arguments (docker cmd) [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/docker_containers.go:43
Expected error:
    <*errors.errorString | 0xc420ba0510>: {
        s: "expected pod \"client-containers-15fe99ca-9cdc-11e6-883c-0242ac110002\" success: gave up waiting for pod 'client-containers-15fe99ca-9cdc-11e6-883c-0242ac110002' to be 'success or failure' after 5m0s",
    }
    expected pod "client-containers-15fe99ca-9cdc-11e6-883c-0242ac110002" success: gave up waiting for pod 'client-containers-15fe99ca-9cdc-11e6-883c-0242ac110002' to be 'success or failure' after 5m0s
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/util.go:2128

Failed: [k8s.io] DNS should provide DNS for pods for Hostname and Subdomain Annotation {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/dns.go:437
Expected error:
    <*errors.errorString | 0xc4203745d0>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/dns.go:236

Issues about this test specifically: #28337

Failed: [k8s.io] ConfigMap should be consumable from pods in volume as non-root [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/configmap.go:46
Expected error:
    <*errors.errorString | 0xc420da52f0>: {
        s: "expected pod \"pod-configmaps-16cb3bb4-9cdb-11e6-80a3-0242ac110002\" success: gave up waiting for pod 'pod-configmaps-16cb3bb4-9cdb-11e6-80a3-0242ac110002' to be 'success or failure' after 5m0s",
    }
    expected pod "pod-configmaps-16cb3bb4-9cdb-11e6-80a3-0242ac110002" success: gave up waiting for pod 'pod-configmaps-16cb3bb4-9cdb-11e6-80a3-0242ac110002' to be 'success or failure' after 5m0s
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/util.go:2128

Issues about this test specifically: #27245

Failed: [k8s.io] Downward API volume should update labels on modification [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/downwardapi_volume.go:109
Expected error:
    <*errors.errorString | 0xc4203d0f00>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/pods.go:66

Issues about this test specifically: #28416 #31055 #33627 #33725 #34206

Failed: [k8s.io] DisruptionController evictions: no PDB => should allow an eviction {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/disruption.go:178
Expected error:
    <*errors.errorString | 0xc4203accb0>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/disruption.go:149

Issues about this test specifically: #32646

Failed: [k8s.io] Docker Containers should be able to override the image's default command and arguments [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:142
Oct 28 00:14:21.100: All nodes should be ready after test, Not ready nodes: []*api.Node{(*api.Node)(0xc420974d78), (*api.Node)(0xc420974ff0), (*api.Node)(0xc420975268), (*api.Node)(0xc4209754e0), (*api.Node)(0xc420975758), (*api.Node)(0xc4209759d0), (*api.Node)(0xc420975c48), (*api.Node)(0xc420975ec0), (*api.Node)(0xc420976138)}
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:438

Issues about this test specifically: #29467

Failed: [k8s.io] Deployment deployment should support rollback {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:142
Oct 28 00:34:53.526: All nodes should be ready after test, Not ready nodes: []*api.Node{(*api.Node)(0xc420ae7b78), (*api.Node)(0xc420ae7df0), (*api.Node)(0xc420ae82e0), (*api.Node)(0xc420ae87d0), (*api.Node)(0xc420ae8a48), (*api.Node)(0xc420ae8cc0), (*api.Node)(0xc420ae8f38)}
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:438

Issues about this test specifically: #28348

Failed: [k8s.io] Job should fail a job {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:142
Oct 28 00:10:09.150: All nodes should be ready after test, Not ready nodes: []*api.Node{(*api.Node)(0xc420cb0d78), (*api.Node)(0xc420cb0ff0), (*api.Node)(0xc420cb14e0), (*api.Node)(0xc420cb1758), (*api.Node)(0xc420cb19d0), (*api.Node)(0xc420cb1c48), (*api.Node)(0xc420cb1ec0), (*api.Node)(0xc420cb2138)}
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:438

Issues about this test specifically: #28773 #29506 #30699 #32734 #34585

Failed: [k8s.io] ScheduledJob should schedule multiple jobs concurrently {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:142
Oct 27 23:57:08.213: All nodes should be ready after test, Not ready nodes: []*api.Node{(*api.Node)(0xc420e1ad78), (*api.Node)(0xc420e1b4e0), (*api.Node)(0xc420e1b9d0), (*api.Node)(0xc420e1bc48), (*api.Node)(0xc420e1bec0), (*api.Node)(0xc420e1c138)}
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:438

Issues about this test specifically: #31657

Failed: [k8s.io] Pods should get a host IP [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/common/pods.go:143
Expected error:
    <*errors.errorString | 0xc4203accb0>: {
        s: "timed out waiting for the condition",
    }
    timed out waiting for the condition
not to have occurred
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/pods.go:66

Issues about this test specifically: #33008

Failed: [k8s.io] Kubectl client [k8s.io] Kubectl apply should reuse nodePort when apply to an existing SVC {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:142
Oct 28 00:23:24.490: All nodes should be ready after test, Not ready nodes: []*api.Node{(*api.Node)(0xc4205e04f0), (*api.Node)(0xc4205e0768), (*api.Node)(0xc4205e09e0), (*api.Node)(0xc4205e0c58), (*api.Node)(0xc4205e0ed0), (*api.Node)(0xc4205e1148), (*api.Node)(0xc4205e13c0), (*api.Node)(0xc4205e1638)}
/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/framework/framework.go:438

Issues about this test specifically: #28523 #35741

Failed: [k8s.io] Kubectl client [k8s.io] Kubectl run --rm job should create a job from an image, then delete the job [Conformance] {Kubernetes e2e suite}

/go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/kubectl.go:1242
Expected error:
    <*errors.errorString | 0xc420ba21a0>: {
        s: "timed out waiting for command &{/workspace/kubernetes/platforms/linux/amd64/kubectl [kubectl --server=https://8.34.213.128 --kubeconfig=/workspace/.kube/config --namespace=e2e-tests-kubectl-m8nwd run e2e-test-rm-busybox-job --image=gcr.io/google_containers/busybox:1.24 --rm=true --generator=job/v1 --restart=OnFailure --attach=true --stdin -- sh -c cat && echo 'stdin closed'] []  0xc4209ef0c0 Waiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: 
false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod 
e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod 
e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod 
e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod 
e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod 
e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod 
e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod 
e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod 
e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\n  [] <nil> 0xc420367410 <nil> <nil> <nil> true [0xc4211380a8 0xc4211380d0 0xc4211380e0] [0xc4211380a8 0xc4211380d0 0xc4211380e0] [0xc4211380b0 0xc4211380c8 0xc4211380d8] [0x921dd0 0x921ed0 0x921ed0] 0xc420d4dda0 <nil>}:\nCommand stdout:\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy 
to be running, status is Pending, pod ready: false\nWaiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false\n[... the same "Waiting for pod e2e-tests-kubectl-m8nwd/e2e-test-rm-busybox-job-8eavy to be running, status is Pending, pod ready: false" line repeated many more times before the log is truncated mid-line ...]
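The repeated message above comes from a polling loop: the e2e framework re-checks the pod's phase on an interval and logs the same line until the pod leaves Pending or the wait times out. A minimal sketch of such a loop is below; `get_pod_status` is a hypothetical stand-in for the apiserver query, not the actual framework code.

```python
import time


def wait_for_pod_running(get_pod_status, namespace, name,
                         timeout=300.0, poll_interval=5.0, sleep=time.sleep):
    """Poll until the pod is Running and ready, or the timeout expires.

    get_pod_status(namespace, name) -> (phase, ready) is a caller-supplied
    stand-in for an API call. Returns (succeeded, log_lines); the log lines
    mirror the shape of the message repeated in the failure output above.
    """
    waited = 0.0
    log = []
    while waited < timeout:
        phase, ready = get_pod_status(namespace, name)
        if phase == "Running" and ready:
            return True, log
        log.append("Waiting for pod %s/%s to be running, "
                   "status is %s, pod ready: %s" % (namespace, name, phase, ready))
        sleep(poll_interval)
        waited += poll_interval
    return False, log
```

When the pod never schedules (as in the run above), the loop simply emits the identical line every interval until the timeout, which is why the log is a wall of repeats.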

@k8s-github-robot
Author

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/4901/

Run so broken it didn't make JUnit output!

@k8s-github-robot
Author

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/4927/

Run so broken it didn't make JUnit output!

@k8s-github-robot
Author

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/5007/

Run so broken it didn't make JUnit output!

@k8s-github-robot
Author

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/5070/

Run so broken it didn't make JUnit output!

@k8s-github-robot
Author

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/5099/

Run so broken it didn't make JUnit output!

@k8s-github-robot
Author

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/5115/

Run so broken it didn't make JUnit output!

@k8s-github-robot
Author

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/5129/

Run so broken it didn't make JUnit output!

@k8s-github-robot
Author

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/5210/

Run so broken it didn't make JUnit output!

@k8s-github-robot
Author

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/5247/

Run so broken it didn't make JUnit output!

@ghost removed their assignment on Nov 11, 2016
@k8s-github-robot
Author

Failed: https://k8s-gubernator.appspot.com/build/kubernetes-jenkins/logs/kubernetes-e2e-gce-multizone/5599/

Run so broken it didn't make JUnit output!

@calebamiles modified the milestone: v1.6 on Mar 3, 2017
Labels
kind/flake Categorizes issue or PR as related to a flaky test. priority/backlog Higher priority than priority/awaiting-more-evidence.
5 participants