Unable to get all my pods running during installation #69672

Open
MohanNagendraKumar opened this Issue Oct 11, 2018 · 3 comments


MohanNagendraKumar commented Oct 11, 2018

Is this a BUG REPORT or FEATURE REQUEST?:

Uncomment only one, leave it on its own line:

/kind bug
/kind feature

What happened:

What you expected to happen:

How to reproduce it (as minimally and precisely as possible):

Anything else we need to know?:

Environment:

  • Kubernetes version (use kubectl version):
  • Cloud provider or hardware configuration:
  • OS (e.g. from /etc/os-release):
  • Kernel (e.g. uname -a):
  • Install tools:
  • Others:
Contributor

k8s-ci-robot commented Oct 11, 2018
@MohanNagendraKumar: There are no sig labels on this issue. Please add a sig label by either:

  1. mentioning a sig: @kubernetes/sig-<group-name>-<group-suffix>
    e.g., @kubernetes/sig-contributor-experience-<group-suffix> to notify the contributor experience sig, OR

  2. specifying the label manually: /sig <group-name>
    e.g., /sig scalability to apply the sig/scalability label

Note: Method 1 will trigger an email to the group. See the group list.
The <group-suffix> in method 1 has to be replaced with one of these: bugs, feature-requests, pr-reviews, test-failures, proposals.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.


MohanNagendraKumar commented Oct 11, 2018
NAMESPACE NAME READY STATUS RESTARTS AGE
kube-system coredns-576cbf47c7-7r2rm 0/1 ContainerCreating 0 45m
kube-system coredns-576cbf47c7-h9qf7 0/1 ContainerCreating 0 45m
kube-system etcd-cssosbe03-b02-master 1/1 Running 0 12m
kube-system kube-apiserver-cssosbe03-b02-master 1/1 Running 0 16m
kube-system kube-controller-manager-cssosbe03-b02-master 1/1 Running 0 16m
kube-system kube-flannel-ds-amd64-9fbxx 0/1 CrashLoopBackOff 8 16m
kube-system kube-proxy-2gg4t 1/1 Running 0 45m
kube-system kube-scheduler-cssosbe03-b02-master 1/1 Running 0 16m
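The listing above can be filtered mechanically to surface only the unhealthy pods. A minimal sketch — the sample file `pods.txt` simply replays the output pasted in this thread; against a live cluster you would pipe `kubectl get pods --all-namespaces` into the same awk filter instead:

```shell
# Replay the listing from this thread as sample input.
cat > pods.txt <<'EOF'
NAMESPACE NAME READY STATUS RESTARTS AGE
kube-system coredns-576cbf47c7-7r2rm 0/1 ContainerCreating 0 45m
kube-system coredns-576cbf47c7-h9qf7 0/1 ContainerCreating 0 45m
kube-system etcd-cssosbe03-b02-master 1/1 Running 0 12m
kube-system kube-apiserver-cssosbe03-b02-master 1/1 Running 0 16m
kube-system kube-controller-manager-cssosbe03-b02-master 1/1 Running 0 16m
kube-system kube-flannel-ds-amd64-9fbxx 0/1 CrashLoopBackOff 8 16m
kube-system kube-proxy-2gg4t 1/1 Running 0 45m
kube-system kube-scheduler-cssosbe03-b02-master 1/1 Running 0 16m
EOF

# Print name and status of every pod whose READY column (ready/total)
# shows fewer ready containers than expected.
awk 'NR > 1 { split($3, r, "/"); if (r[1] != r[2]) print $2, $4 }' pods.txt
```

This prints the two coredns pods stuck in ContainerCreating and the flannel pod in CrashLoopBackOff, while skipping the healthy 1/1 pods.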


Contributor

chrisohaver commented Oct 11, 2018

kube-system kube-flannel-ds-amd64-9fbxx 0/1 CrashLoopBackOff 8 16m

What do the logs say there? kubectl -n kube-system logs kube-flannel-ds-amd64-9fbxx
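Beyond the single logs command above, a fuller diagnostic sequence for a crashlooping pod might look like the sketch below. The pod and namespace are taken from the listing earlier in this thread; `--previous` and `describe` are standard kubectl options. Each step is allowed to fail independently (`|| true`) so the whole checklist runs even when one command errors out:

```shell
#!/bin/sh
# Pod and namespace taken from the listing earlier in this thread.
POD=kube-flannel-ds-amd64-9fbxx
NS=kube-system

if command -v kubectl >/dev/null 2>&1; then
  kubectl -n "$NS" logs "$POD" || true             # logs of the current container
  kubectl -n "$NS" logs "$POD" --previous || true  # logs of the last crashed container
  kubectl -n "$NS" describe pod "$POD" || true     # events: image pulls, CNI setup, restarts
else
  echo "kubectl not found; run these commands on the cluster master"
fi
```

For pods stuck in ContainerCreating (like the two coredns replicas above), `kubectl describe pod` is usually more informative than `logs`, since no container has started yet.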

