- Sep 12 17:59:09 k8-app-0.example.com kubelet[1293]: W0912 17:59:09.738204 1293 qos_container_manager_linux.go:134] [ContainerManager] Failed to reserve QoS requests: failed to set supported cgroup subsystems for cgroup /kubepods/burstable: Failed to set config for supported subsystems : failed to write 2 to cpu.shares: open /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/burstable/cpu.shares: no such file or directory
- Sep 12 17:59:30 k8-app-0.example.com kubelet[1293]: W0912 17:59:30.581461 1293 pod_container_deletor.go:77] Container "86e3e4a4179ff830982ea2a067f6db729c088a897490ab0d6f3b4f628a29b123" not found in pod's containers
- Sep 12 17:59:30 k8-app-0.example.com kubelet[1293]: I0912 17:59:30.586073 1293 kuberuntime_manager.go:457] Container {Name:consul Image:registry.hub.docker.com/library/consul:0.8.5 Command:[] Args:[agent -recursor=10.0.0.2 -dns-port=53 -advertise=$(NODE_IP) -client=0.0.0.0 -retry-join=10.0.1.50] WorkingDir: Ports:[] EnvFrom:[] Env:[{Name:NODE_IP Value: ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.hostIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,}} {Name:CONSUL_ALLOW_PRIVILEGED_PORTS Value:true ValueFrom:nil} {Name:CONSUL_LOCAL_CONFIG Value:{"disable_update_check": true} ValueFrom:nil}] Resources:{Limits:map[] Requests:map[]} VolumeMounts:[] LivenessProbe:nil ReadinessProbe:nil Lifecycle:nil TerminationMessagePath:/dev/termination-log TerminationMessagePolicy:File ImagePullPolicy:IfNotPresent SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,} Stdin:false StdinOnce:false TTY:false} is dead, but RestartPolicy says that we should restart it.
- Sep 12 17:59:30 k8-app-0.example.com kubelet[1293]: W0912 17:59:30.689213 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpuset/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/a99548e22680273514089108557fe856fc30b58ada5140119f4c4b55d4acc168": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpuset/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/a99548e22680273514089108557fe856fc30b58ada5140119f4c4b55d4acc168: no such file or directory
- Sep 12 17:59:30 k8-app-0.example.com kubelet[1293]: W0912 17:59:30.699718 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/memory/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/a99548e22680273514089108557fe856fc30b58ada5140119f4c4b55d4acc168": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/memory/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/a99548e22680273514089108557fe856fc30b58ada5140119f4c4b55d4acc168: no such file or directory
- Sep 12 17:59:30 k8-app-0.example.com kubelet[1293]: W0912 17:59:30.701575 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/a99548e22680273514089108557fe856fc30b58ada5140119f4c4b55d4acc168": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/a99548e22680273514089108557fe856fc30b58ada5140119f4c4b55d4acc168: no such file or directory
- Sep 12 17:59:30 k8-app-0.example.com kubelet[1293]: W0912 17:59:30.703351 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/blkio/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/a99548e22680273514089108557fe856fc30b58ada5140119f4c4b55d4acc168": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/blkio/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/a99548e22680273514089108557fe856fc30b58ada5140119f4c4b55d4acc168: no such file or directory
- Sep 12 17:59:30 k8-app-0.example.com kubelet[1293]: I0912 17:59:30.759542 1293 kuberuntime_manager.go:741] checking backoff for container "consul" in pod "consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 17:59:30 k8-app-0.example.com kubelet[1293]: W0912 17:59:30.885261 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpuset/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/6175603d079bf86c9981760c8e196c673df61b8467c38e2a537cb3cc52584d3c": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpuset/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/6175603d079bf86c9981760c8e196c673df61b8467c38e2a537cb3cc52584d3c: no such file or directory
- Sep 12 17:59:30 k8-app-0.example.com kubelet[1293]: W0912 17:59:30.886281 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/memory/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/6175603d079bf86c9981760c8e196c673df61b8467c38e2a537cb3cc52584d3c": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/memory/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/6175603d079bf86c9981760c8e196c673df61b8467c38e2a537cb3cc52584d3c: no such file or directory
- Sep 12 17:59:30 k8-app-0.example.com kubelet[1293]: W0912 17:59:30.887019 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/6175603d079bf86c9981760c8e196c673df61b8467c38e2a537cb3cc52584d3c": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/6175603d079bf86c9981760c8e196c673df61b8467c38e2a537cb3cc52584d3c: no such file or directory
- Sep 12 17:59:30 k8-app-0.example.com kubelet[1293]: W0912 17:59:30.887724 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/blkio/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/6175603d079bf86c9981760c8e196c673df61b8467c38e2a537cb3cc52584d3c": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/blkio/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/6175603d079bf86c9981760c8e196c673df61b8467c38e2a537cb3cc52584d3c: no such file or directory
- Sep 12 17:59:32 k8-app-0.example.com kubelet[1293]: W0912 17:59:32.603449 1293 pod_container_deletor.go:77] Container "a99548e22680273514089108557fe856fc30b58ada5140119f4c4b55d4acc168" not found in pod's containers
- Sep 12 17:59:32 k8-app-0.example.com kubelet[1293]: I0912 17:59:32.620153 1293 kuberuntime_manager.go:457] Container {Name:consul Image:registry.hub.docker.com/library/consul:0.8.5 Command:[] Args:[agent -recursor=10.0.0.2 -dns-port=53 -advertise=$(NODE_IP) -client=0.0.0.0 -retry-join=10.0.1.50] WorkingDir: Ports:[] EnvFrom:[] Env:[{Name:NODE_IP Value: ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.hostIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,}} {Name:CONSUL_ALLOW_PRIVILEGED_PORTS Value:true ValueFrom:nil} {Name:CONSUL_LOCAL_CONFIG Value:{"disable_update_check": true} ValueFrom:nil}] Resources:{Limits:map[] Requests:map[]} VolumeMounts:[] LivenessProbe:nil ReadinessProbe:nil Lifecycle:nil TerminationMessagePath:/dev/termination-log TerminationMessagePolicy:File ImagePullPolicy:IfNotPresent SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,} Stdin:false StdinOnce:false TTY:false} is dead, but RestartPolicy says that we should restart it.
- Sep 12 17:59:32 k8-app-0.example.com kubelet[1293]: W0912 17:59:32.743412 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpuset/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/b41c2e71a40a7c523d227342c9bd731d8170549ad6c764333147fc0319dbe027": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpuset/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/b41c2e71a40a7c523d227342c9bd731d8170549ad6c764333147fc0319dbe027: no such file or directory
- Sep 12 17:59:32 k8-app-0.example.com kubelet[1293]: W0912 17:59:32.747187 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/memory/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/b41c2e71a40a7c523d227342c9bd731d8170549ad6c764333147fc0319dbe027": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/memory/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/b41c2e71a40a7c523d227342c9bd731d8170549ad6c764333147fc0319dbe027: no such file or directory
- Sep 12 17:59:32 k8-app-0.example.com kubelet[1293]: W0912 17:59:32.762506 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/b41c2e71a40a7c523d227342c9bd731d8170549ad6c764333147fc0319dbe027": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/b41c2e71a40a7c523d227342c9bd731d8170549ad6c764333147fc0319dbe027: no such file or directory
- Sep 12 17:59:32 k8-app-0.example.com kubelet[1293]: W0912 17:59:32.764490 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/blkio/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/b41c2e71a40a7c523d227342c9bd731d8170549ad6c764333147fc0319dbe027": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/blkio/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/b41c2e71a40a7c523d227342c9bd731d8170549ad6c764333147fc0319dbe027: no such file or directory
- Sep 12 17:59:32 k8-app-0.example.com kubelet[1293]: I0912 17:59:32.814121 1293 kuberuntime_manager.go:741] checking backoff for container "consul" in pod "consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 17:59:32 k8-app-0.example.com kubelet[1293]: I0912 17:59:32.814544 1293 kuberuntime_manager.go:751] Back-off 10s restarting failed container=consul pod=consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)
- Sep 12 17:59:32 k8-app-0.example.com kubelet[1293]: E0912 17:59:32.814851 1293 pod_workers.go:182] Error syncing pod d0320e82-977f-11e7-9c7c-02334e72d0a8 ("consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"), skipping: failed to "StartContainer" for "consul" with CrashLoopBackOff: "Back-off 10s restarting failed container=consul pod=consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 17:59:33 k8-app-0.example.com kubelet[1293]: I0912 17:59:33.637231 1293 kuberuntime_manager.go:457] Container {Name:consul Image:registry.hub.docker.com/library/consul:0.8.5 Command:[] Args:[agent -recursor=10.0.0.2 -dns-port=53 -advertise=$(NODE_IP) -client=0.0.0.0 -retry-join=10.0.1.50] WorkingDir: Ports:[] EnvFrom:[] Env:[{Name:NODE_IP Value: ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.hostIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,}} {Name:CONSUL_ALLOW_PRIVILEGED_PORTS Value:true ValueFrom:nil} {Name:CONSUL_LOCAL_CONFIG Value:{"disable_update_check": true} ValueFrom:nil}] Resources:{Limits:map[] Requests:map[]} VolumeMounts:[] LivenessProbe:nil ReadinessProbe:nil Lifecycle:nil TerminationMessagePath:/dev/termination-log TerminationMessagePolicy:File ImagePullPolicy:IfNotPresent SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,} Stdin:false StdinOnce:false TTY:false} is dead, but RestartPolicy says that we should restart it.
- Sep 12 17:59:33 k8-app-0.example.com kubelet[1293]: I0912 17:59:33.638556 1293 kuberuntime_manager.go:741] checking backoff for container "consul" in pod "consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 17:59:33 k8-app-0.example.com kubelet[1293]: I0912 17:59:33.638851 1293 kuberuntime_manager.go:751] Back-off 10s restarting failed container=consul pod=consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)
- Sep 12 17:59:33 k8-app-0.example.com kubelet[1293]: E0912 17:59:33.639060 1293 pod_workers.go:182] Error syncing pod d0320e82-977f-11e7-9c7c-02334e72d0a8 ("consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"), skipping: failed to "StartContainer" for "consul" with CrashLoopBackOff: "Back-off 10s restarting failed container=consul pod=consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 17:59:34 k8-app-0.example.com kubelet[1293]: I0912 17:59:34.640450 1293 kuberuntime_manager.go:457] Container {Name:consul Image:registry.hub.docker.com/library/consul:0.8.5 Command:[] Args:[agent -recursor=10.0.0.2 -dns-port=53 -advertise=$(NODE_IP) -client=0.0.0.0 -retry-join=10.0.1.50] WorkingDir: Ports:[] EnvFrom:[] Env:[{Name:NODE_IP Value: ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.hostIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,}} {Name:CONSUL_ALLOW_PRIVILEGED_PORTS Value:true ValueFrom:nil} {Name:CONSUL_LOCAL_CONFIG Value:{"disable_update_check": true} ValueFrom:nil}] Resources:{Limits:map[] Requests:map[]} VolumeMounts:[] LivenessProbe:nil ReadinessProbe:nil Lifecycle:nil TerminationMessagePath:/dev/termination-log TerminationMessagePolicy:File ImagePullPolicy:IfNotPresent SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,} Stdin:false StdinOnce:false TTY:false} is dead, but RestartPolicy says that we should restart it.
- Sep 12 17:59:34 k8-app-0.example.com kubelet[1293]: I0912 17:59:34.641872 1293 kuberuntime_manager.go:741] checking backoff for container "consul" in pod "consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 17:59:34 k8-app-0.example.com kubelet[1293]: I0912 17:59:34.642179 1293 kuberuntime_manager.go:751] Back-off 10s restarting failed container=consul pod=consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)
- Sep 12 17:59:34 k8-app-0.example.com kubelet[1293]: E0912 17:59:34.642411 1293 pod_workers.go:182] Error syncing pod d0320e82-977f-11e7-9c7c-02334e72d0a8 ("consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"), skipping: failed to "StartContainer" for "consul" with CrashLoopBackOff: "Back-off 10s restarting failed container=consul pod=consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 17:59:49 k8-app-0.example.com kubelet[1293]: I0912 17:59:49.030946 1293 kuberuntime_manager.go:457] Container {Name:consul Image:registry.hub.docker.com/library/consul:0.8.5 Command:[] Args:[agent -recursor=10.0.0.2 -dns-port=53 -advertise=$(NODE_IP) -client=0.0.0.0 -retry-join=10.0.1.50] WorkingDir: Ports:[] EnvFrom:[] Env:[{Name:NODE_IP Value: ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.hostIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,}} {Name:CONSUL_ALLOW_PRIVILEGED_PORTS Value:true ValueFrom:nil} {Name:CONSUL_LOCAL_CONFIG Value:{"disable_update_check": true} ValueFrom:nil}] Resources:{Limits:map[] Requests:map[]} VolumeMounts:[] LivenessProbe:nil ReadinessProbe:nil Lifecycle:nil TerminationMessagePath:/dev/termination-log TerminationMessagePolicy:File ImagePullPolicy:IfNotPresent SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,} Stdin:false StdinOnce:false TTY:false} is dead, but RestartPolicy says that we should restart it.
- Sep 12 17:59:49 k8-app-0.example.com kubelet[1293]: I0912 17:59:49.032377 1293 kuberuntime_manager.go:741] checking backoff for container "consul" in pod "consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 17:59:49 k8-app-0.example.com kubelet[1293]: W0912 17:59:49.152273 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpuset/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/adb3fa1a53f62915cc14c94dfebf8316582ad374292858665845e98513956c0f": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpuset/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/adb3fa1a53f62915cc14c94dfebf8316582ad374292858665845e98513956c0f: no such file or directory
- Sep 12 17:59:49 k8-app-0.example.com kubelet[1293]: W0912 17:59:49.162605 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/memory/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/adb3fa1a53f62915cc14c94dfebf8316582ad374292858665845e98513956c0f": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/memory/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/adb3fa1a53f62915cc14c94dfebf8316582ad374292858665845e98513956c0f: no such file or directory
- Sep 12 17:59:49 k8-app-0.example.com kubelet[1293]: W0912 17:59:49.164861 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/adb3fa1a53f62915cc14c94dfebf8316582ad374292858665845e98513956c0f": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/adb3fa1a53f62915cc14c94dfebf8316582ad374292858665845e98513956c0f: no such file or directory
- Sep 12 17:59:49 k8-app-0.example.com kubelet[1293]: W0912 17:59:49.166220 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/blkio/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/adb3fa1a53f62915cc14c94dfebf8316582ad374292858665845e98513956c0f": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/blkio/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/adb3fa1a53f62915cc14c94dfebf8316582ad374292858665845e98513956c0f: no such file or directory
- Sep 12 17:59:51 k8-app-0.example.com kubelet[1293]: W0912 17:59:51.709397 1293 pod_container_deletor.go:77] Container "b41c2e71a40a7c523d227342c9bd731d8170549ad6c764333147fc0319dbe027" not found in pod's containers
- Sep 12 17:59:51 k8-app-0.example.com kubelet[1293]: I0912 17:59:51.713831 1293 kuberuntime_manager.go:457] Container {Name:consul Image:registry.hub.docker.com/library/consul:0.8.5 Command:[] Args:[agent -recursor=10.0.0.2 -dns-port=53 -advertise=$(NODE_IP) -client=0.0.0.0 -retry-join=10.0.1.50] WorkingDir: Ports:[] EnvFrom:[] Env:[{Name:NODE_IP Value: ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.hostIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,}} {Name:CONSUL_ALLOW_PRIVILEGED_PORTS Value:true ValueFrom:nil} {Name:CONSUL_LOCAL_CONFIG Value:{"disable_update_check": true} ValueFrom:nil}] Resources:{Limits:map[] Requests:map[]} VolumeMounts:[] LivenessProbe:nil ReadinessProbe:nil Lifecycle:nil TerminationMessagePath:/dev/termination-log TerminationMessagePolicy:File ImagePullPolicy:IfNotPresent SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,} Stdin:false StdinOnce:false TTY:false} is dead, but RestartPolicy says that we should restart it.
- Sep 12 17:59:51 k8-app-0.example.com kubelet[1293]: W0912 17:59:51.852159 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpuset/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/c42b0b66069ea8244dc3999d53837833cfcda8fcd0b304150fce49ff663d0563": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpuset/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/c42b0b66069ea8244dc3999d53837833cfcda8fcd0b304150fce49ff663d0563: no such file or directory
- Sep 12 17:59:51 k8-app-0.example.com kubelet[1293]: W0912 17:59:51.854357 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/memory/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/c42b0b66069ea8244dc3999d53837833cfcda8fcd0b304150fce49ff663d0563": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/memory/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/c42b0b66069ea8244dc3999d53837833cfcda8fcd0b304150fce49ff663d0563: no such file or directory
- Sep 12 17:59:51 k8-app-0.example.com kubelet[1293]: W0912 17:59:51.857473 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/c42b0b66069ea8244dc3999d53837833cfcda8fcd0b304150fce49ff663d0563": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/c42b0b66069ea8244dc3999d53837833cfcda8fcd0b304150fce49ff663d0563: no such file or directory
- Sep 12 17:59:51 k8-app-0.example.com kubelet[1293]: W0912 17:59:51.867757 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/blkio/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/c42b0b66069ea8244dc3999d53837833cfcda8fcd0b304150fce49ff663d0563": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/blkio/kubepods/besteffort/podd0320e82-977f-11e7-9c7c-02334e72d0a8/c42b0b66069ea8244dc3999d53837833cfcda8fcd0b304150fce49ff663d0563: no such file or directory
- Sep 12 17:59:51 k8-app-0.example.com kubelet[1293]: I0912 17:59:51.917960 1293 kuberuntime_manager.go:741] checking backoff for container "consul" in pod "consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 17:59:51 k8-app-0.example.com kubelet[1293]: I0912 17:59:51.918379 1293 kuberuntime_manager.go:751] Back-off 20s restarting failed container=consul pod=consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)
- Sep 12 17:59:51 k8-app-0.example.com kubelet[1293]: E0912 17:59:51.918617 1293 pod_workers.go:182] Error syncing pod d0320e82-977f-11e7-9c7c-02334e72d0a8 ("consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"), skipping: failed to "StartContainer" for "consul" with CrashLoopBackOff: "Back-off 20s restarting failed container=consul pod=consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 17:59:52 k8-app-0.example.com kubelet[1293]: I0912 17:59:52.732223 1293 kuberuntime_manager.go:457] Container {Name:consul Image:registry.hub.docker.com/library/consul:0.8.5 Command:[] Args:[agent -recursor=10.0.0.2 -dns-port=53 -advertise=$(NODE_IP) -client=0.0.0.0 -retry-join=10.0.1.50] WorkingDir: Ports:[] EnvFrom:[] Env:[{Name:NODE_IP Value: ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.hostIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,}} {Name:CONSUL_ALLOW_PRIVILEGED_PORTS Value:true ValueFrom:nil} {Name:CONSUL_LOCAL_CONFIG Value:{"disable_update_check": true} ValueFrom:nil}] Resources:{Limits:map[] Requests:map[]} VolumeMounts:[] LivenessProbe:nil ReadinessProbe:nil Lifecycle:nil TerminationMessagePath:/dev/termination-log TerminationMessagePolicy:File ImagePullPolicy:IfNotPresent SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,} Stdin:false StdinOnce:false TTY:false} is dead, but RestartPolicy says that we should restart it.
- Sep 12 17:59:52 k8-app-0.example.com kubelet[1293]: I0912 17:59:52.733532 1293 kuberuntime_manager.go:741] checking backoff for container "consul" in pod "consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 17:59:52 k8-app-0.example.com kubelet[1293]: I0912 17:59:52.733833 1293 kuberuntime_manager.go:751] Back-off 20s restarting failed container=consul pod=consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)
- Sep 12 17:59:52 k8-app-0.example.com kubelet[1293]: E0912 17:59:52.734040 1293 pod_workers.go:182] Error syncing pod d0320e82-977f-11e7-9c7c-02334e72d0a8 ("consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"), skipping: failed to "StartContainer" for "consul" with CrashLoopBackOff: "Back-off 20s restarting failed container=consul pod=consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 17:59:53 k8-app-0.example.com kubelet[1293]: I0912 17:59:53.743925 1293 kuberuntime_manager.go:457] Container {Name:consul Image:registry.hub.docker.com/library/consul:0.8.5 Command:[] Args:[agent -recursor=10.0.0.2 -dns-port=53 -advertise=$(NODE_IP) -client=0.0.0.0 -retry-join=10.0.1.50] WorkingDir: Ports:[] EnvFrom:[] Env:[{Name:NODE_IP Value: ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.hostIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,}} {Name:CONSUL_ALLOW_PRIVILEGED_PORTS Value:true ValueFrom:nil} {Name:CONSUL_LOCAL_CONFIG Value:{"disable_update_check": true} ValueFrom:nil}] Resources:{Limits:map[] Requests:map[]} VolumeMounts:[] LivenessProbe:nil ReadinessProbe:nil Lifecycle:nil TerminationMessagePath:/dev/termination-log TerminationMessagePolicy:File ImagePullPolicy:IfNotPresent SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,} Stdin:false StdinOnce:false TTY:false} is dead, but RestartPolicy says that we should restart it.
- Sep 12 17:59:53 k8-app-0.example.com kubelet[1293]: I0912 17:59:53.745239 1293 kuberuntime_manager.go:741] checking backoff for container "consul" in pod "consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 17:59:53 k8-app-0.example.com kubelet[1293]: I0912 17:59:53.745523 1293 kuberuntime_manager.go:751] Back-off 20s restarting failed container=consul pod=consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)
- Sep 12 17:59:53 k8-app-0.example.com kubelet[1293]: E0912 17:59:53.745730 1293 pod_workers.go:182] Error syncing pod d0320e82-977f-11e7-9c7c-02334e72d0a8 ("consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"), skipping: failed to "StartContainer" for "consul" with CrashLoopBackOff: "Back-off 20s restarting failed container=consul pod=consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 17:59:59 k8-app-0.example.com kubelet[1293]: W0912 17:59:59.778247 1293 pod_container_deletor.go:77] Container "eb76ec336f2e7012ce07b7422df1c64c49a30bba76c3a74532ac8baee3469d17" not found in pod's containers
- Sep 12 17:59:59 k8-app-0.example.com kubelet[1293]: I0912 17:59:59.790924 1293 kuberuntime_manager.go:457] Container {Name:nginx-server Image:nginx Command:[] Args:[] WorkingDir: Ports:[{Name:http HostPort:0 ContainerPort:80 Protocol:TCP HostIP:}] EnvFrom:[] Env:[{Name:NODE_IP Value: ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.hostIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,}} {Name:POD_IP Value: ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,}}] Resources:{Limits:map[] Requests:map[]} VolumeMounts:[] LivenessProbe:&Probe{Handler:Handler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:80,Host:,Scheme:HTTP,HTTPHeaders:[],},TCPSocket:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,} ReadinessProbe:nil Lifecycle:nil TerminationMessagePath:/dev/termination-log TerminationMessagePolicy:File ImagePullPolicy:Always SecurityContext:nil Stdin:false StdinOnce:false TTY:false} is dead, but RestartPolicy says that we should restart it.
- Sep 12 17:59:59 k8-app-0.example.com kubelet[1293]: I0912 17:59:59.792148 1293 kuberuntime_manager.go:457] Container {Name:regup Image:registry.hub.docker.com/spunon/regup:0.1.0 Command:[] Args:[] WorkingDir: Ports:[] EnvFrom:[] Env:[{Name:SERVICE_NAME Value:nginx ValueFrom:nil} {Name:SERVICE_PORT Value:80 ValueFrom:nil} {Name:NODE_IP Value: ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.hostIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,}} {Name:POD_IP Value: ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,}}] Resources:{Limits:map[] Requests:map[]} VolumeMounts:[] LivenessProbe:nil ReadinessProbe:nil Lifecycle:nil TerminationMessagePath:/dev/termination-log TerminationMessagePolicy:File ImagePullPolicy:Always SecurityContext:nil Stdin:false StdinOnce:false TTY:false} is dead, but RestartPolicy says that we should restart it.
- Sep 12 18:00:00 k8-app-0.example.com kubelet[1293]: W0912 18:00:00.013086 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpuset/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/63fd1c85aa6b48125f06ca73ec5c697386d9d1dad21a27715b0f757d1ad21c57": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpuset/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/63fd1c85aa6b48125f06ca73ec5c697386d9d1dad21a27715b0f757d1ad21c57: no such file or directory
- Sep 12 18:00:00 k8-app-0.example.com kubelet[1293]: W0912 18:00:00.014223 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/memory/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/63fd1c85aa6b48125f06ca73ec5c697386d9d1dad21a27715b0f757d1ad21c57": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/memory/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/63fd1c85aa6b48125f06ca73ec5c697386d9d1dad21a27715b0f757d1ad21c57: no such file or directory
- Sep 12 18:00:00 k8-app-0.example.com kubelet[1293]: W0912 18:00:00.014955 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/63fd1c85aa6b48125f06ca73ec5c697386d9d1dad21a27715b0f757d1ad21c57": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/63fd1c85aa6b48125f06ca73ec5c697386d9d1dad21a27715b0f757d1ad21c57: no such file or directory
- Sep 12 18:00:00 k8-app-0.example.com kubelet[1293]: W0912 18:00:00.015648 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/blkio/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/63fd1c85aa6b48125f06ca73ec5c697386d9d1dad21a27715b0f757d1ad21c57": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/blkio/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/63fd1c85aa6b48125f06ca73ec5c697386d9d1dad21a27715b0f757d1ad21c57: no such file or directory
- Sep 12 18:00:00 k8-app-0.example.com kubelet[1293]: I0912 18:00:00.146862 1293 kuberuntime_manager.go:741] checking backoff for container "regup" in pod "nginx-deployment-1224009977-p3btz_default(d96ed995-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 18:00:01 k8-app-0.example.com kubelet[1293]: W0912 18:00:01.331054 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpuset/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/461d02116c0d9034293ca6b1079ddf50e32aa495ed8abb48501144c5168d235d": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpuset/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/461d02116c0d9034293ca6b1079ddf50e32aa495ed8abb48501144c5168d235d: no such file or directory
- Sep 12 18:00:01 k8-app-0.example.com kubelet[1293]: W0912 18:00:01.356531 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/memory/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/461d02116c0d9034293ca6b1079ddf50e32aa495ed8abb48501144c5168d235d": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/memory/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/461d02116c0d9034293ca6b1079ddf50e32aa495ed8abb48501144c5168d235d: no such file or directory
- Sep 12 18:00:01 k8-app-0.example.com kubelet[1293]: W0912 18:00:01.357374 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/461d02116c0d9034293ca6b1079ddf50e32aa495ed8abb48501144c5168d235d": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/461d02116c0d9034293ca6b1079ddf50e32aa495ed8abb48501144c5168d235d: no such file or directory
- Sep 12 18:00:01 k8-app-0.example.com kubelet[1293]: W0912 18:00:01.358480 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/blkio/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/461d02116c0d9034293ca6b1079ddf50e32aa495ed8abb48501144c5168d235d": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/blkio/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/461d02116c0d9034293ca6b1079ddf50e32aa495ed8abb48501144c5168d235d: no such file or directory
- Sep 12 18:00:01 k8-app-0.example.com kubelet[1293]: I0912 18:00:01.416999 1293 kuberuntime_manager.go:741] checking backoff for container "nginx-server" in pod "nginx-deployment-1224009977-p3btz_default(d96ed995-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 18:00:02 k8-app-0.example.com kubelet[1293]: W0912 18:00:02.560652 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpuset/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/91a69545f066e07d3a85c5fd2707e710557f6b7bd16d0439fdbf03ecc542cc3b": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpuset/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/91a69545f066e07d3a85c5fd2707e710557f6b7bd16d0439fdbf03ecc542cc3b: no such file or directory
- Sep 12 18:00:02 k8-app-0.example.com kubelet[1293]: W0912 18:00:02.562758 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/memory/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/91a69545f066e07d3a85c5fd2707e710557f6b7bd16d0439fdbf03ecc542cc3b": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/memory/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/91a69545f066e07d3a85c5fd2707e710557f6b7bd16d0439fdbf03ecc542cc3b: no such file or directory
- Sep 12 18:00:02 k8-app-0.example.com kubelet[1293]: W0912 18:00:02.565995 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/91a69545f066e07d3a85c5fd2707e710557f6b7bd16d0439fdbf03ecc542cc3b": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/91a69545f066e07d3a85c5fd2707e710557f6b7bd16d0439fdbf03ecc542cc3b: no such file or directory
- Sep 12 18:00:02 k8-app-0.example.com kubelet[1293]: W0912 18:00:02.576443 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/blkio/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/91a69545f066e07d3a85c5fd2707e710557f6b7bd16d0439fdbf03ecc542cc3b": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/blkio/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/91a69545f066e07d3a85c5fd2707e710557f6b7bd16d0439fdbf03ecc542cc3b: no such file or directory
- Sep 12 18:00:05 k8-app-0.example.com kubelet[1293]: I0912 18:00:05.030575 1293 kuberuntime_manager.go:457] Container {Name:consul Image:registry.hub.docker.com/library/consul:0.8.5 Command:[] Args:[agent -recursor=10.0.0.2 -dns-port=53 -advertise=$(NODE_IP) -client=0.0.0.0 -retry-join=10.0.1.50] WorkingDir: Ports:[] EnvFrom:[] Env:[{Name:NODE_IP Value: ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.hostIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,}} {Name:CONSUL_ALLOW_PRIVILEGED_PORTS Value:true ValueFrom:nil} {Name:CONSUL_LOCAL_CONFIG Value:{"disable_update_check": true} ValueFrom:nil}] Resources:{Limits:map[] Requests:map[]} VolumeMounts:[] LivenessProbe:nil ReadinessProbe:nil Lifecycle:nil TerminationMessagePath:/dev/termination-log TerminationMessagePolicy:File ImagePullPolicy:IfNotPresent SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,} Stdin:false StdinOnce:false TTY:false} is dead, but RestartPolicy says that we should restart it.
- Sep 12 18:00:05 k8-app-0.example.com kubelet[1293]: I0912 18:00:05.031950 1293 kuberuntime_manager.go:741] checking backoff for container "consul" in pod "consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 18:00:05 k8-app-0.example.com kubelet[1293]: I0912 18:00:05.032247 1293 kuberuntime_manager.go:751] Back-off 20s restarting failed container=consul pod=consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)
- Sep 12 18:00:05 k8-app-0.example.com kubelet[1293]: E0912 18:00:05.032478 1293 pod_workers.go:182] Error syncing pod d0320e82-977f-11e7-9c7c-02334e72d0a8 ("consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"), skipping: failed to "StartContainer" for "consul" with CrashLoopBackOff: "Back-off 20s restarting failed container=consul pod=consul-client-2kz30_default(d0320e82-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 18:00:06 k8-app-0.example.com kubelet[1293]: W0912 18:00:06.893907 1293 pod_container_deletor.go:77] Container "63fd1c85aa6b48125f06ca73ec5c697386d9d1dad21a27715b0f757d1ad21c57" not found in pod's containers
- Sep 12 18:00:06 k8-app-0.example.com kubelet[1293]: I0912 18:00:06.926430 1293 kuberuntime_manager.go:457] Container {Name:nginx-server Image:nginx Command:[] Args:[] WorkingDir: Ports:[{Name:http HostPort:0 ContainerPort:80 Protocol:TCP HostIP:}] EnvFrom:[] Env:[{Name:NODE_IP Value: ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.hostIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,}} {Name:POD_IP Value: ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,}}] Resources:{Limits:map[] Requests:map[]} VolumeMounts:[] LivenessProbe:&Probe{Handler:Handler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:80,Host:,Scheme:HTTP,HTTPHeaders:[],},TCPSocket:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,} ReadinessProbe:nil Lifecycle:nil TerminationMessagePath:/dev/termination-log TerminationMessagePolicy:File ImagePullPolicy:Always SecurityContext:nil Stdin:false StdinOnce:false TTY:false} is dead, but RestartPolicy says that we should restart it.
- Sep 12 18:00:06 k8-app-0.example.com kubelet[1293]: I0912 18:00:06.927573 1293 kuberuntime_manager.go:457] Container {Name:regup Image:registry.hub.docker.com/spunon/regup:0.1.0 Command:[] Args:[] WorkingDir: Ports:[] EnvFrom:[] Env:[{Name:SERVICE_NAME Value:nginx ValueFrom:nil} {Name:SERVICE_PORT Value:80 ValueFrom:nil} {Name:NODE_IP Value: ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.hostIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,}} {Name:POD_IP Value: ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,}}] Resources:{Limits:map[] Requests:map[]} VolumeMounts:[] LivenessProbe:nil ReadinessProbe:nil Lifecycle:nil TerminationMessagePath:/dev/termination-log TerminationMessagePolicy:File ImagePullPolicy:Always SecurityContext:nil Stdin:false StdinOnce:false TTY:false} is dead, but RestartPolicy says that we should restart it.
- Sep 12 18:00:07 k8-app-0.example.com kubelet[1293]: W0912 18:00:07.091900 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpuset/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/7c35656698c3fd16486d531261a06f432c1d63a6fd87963c88760388cca07e28": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpuset/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/7c35656698c3fd16486d531261a06f432c1d63a6fd87963c88760388cca07e28: no such file or directory
- Sep 12 18:00:07 k8-app-0.example.com kubelet[1293]: W0912 18:00:07.105474 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/memory/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/7c35656698c3fd16486d531261a06f432c1d63a6fd87963c88760388cca07e28": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/memory/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/7c35656698c3fd16486d531261a06f432c1d63a6fd87963c88760388cca07e28: no such file or directory
- Sep 12 18:00:07 k8-app-0.example.com kubelet[1293]: W0912 18:00:07.108065 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/7c35656698c3fd16486d531261a06f432c1d63a6fd87963c88760388cca07e28": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/cpu,cpuacct/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/7c35656698c3fd16486d531261a06f432c1d63a6fd87963c88760388cca07e28: no such file or directory
- Sep 12 18:00:07 k8-app-0.example.com kubelet[1293]: W0912 18:00:07.109703 1293 raw.go:87] Error while processing event ("/var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/blkio/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/7c35656698c3fd16486d531261a06f432c1d63a6fd87963c88760388cca07e28": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /var/lib/rkt/pods/run/e45e74b1-96d3-4acd-ac84-ccb869f35dc4/stage1/rootfs/opt/stage2/flannel/rootfs/sys/fs/cgroup/blkio/kubepods/besteffort/podd96ed995-977f-11e7-9c7c-02334e72d0a8/7c35656698c3fd16486d531261a06f432c1d63a6fd87963c88760388cca07e28: no such file or directory
- Sep 12 18:00:07 k8-app-0.example.com kubelet[1293]: I0912 18:00:07.228206 1293 kuberuntime_manager.go:741] checking backoff for container "nginx-server" in pod "nginx-deployment-1224009977-p3btz_default(d96ed995-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 18:00:07 k8-app-0.example.com kubelet[1293]: I0912 18:00:07.228661 1293 kuberuntime_manager.go:751] Back-off 10s restarting failed container=nginx-server pod=nginx-deployment-1224009977-p3btz_default(d96ed995-977f-11e7-9c7c-02334e72d0a8)
- Sep 12 18:00:07 k8-app-0.example.com kubelet[1293]: I0912 18:00:07.228900 1293 kuberuntime_manager.go:741] checking backoff for container "regup" in pod "nginx-deployment-1224009977-p3btz_default(d96ed995-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 18:00:07 k8-app-0.example.com kubelet[1293]: I0912 18:00:07.229162 1293 kuberuntime_manager.go:751] Back-off 10s restarting failed container=regup pod=nginx-deployment-1224009977-p3btz_default(d96ed995-977f-11e7-9c7c-02334e72d0a8)
- Sep 12 18:00:07 k8-app-0.example.com kubelet[1293]: E0912 18:00:07.229392 1293 pod_workers.go:182] Error syncing pod d96ed995-977f-11e7-9c7c-02334e72d0a8 ("nginx-deployment-1224009977-p3btz_default(d96ed995-977f-11e7-9c7c-02334e72d0a8)"), skipping: [failed to "StartContainer" for "nginx-server" with CrashLoopBackOff: "Back-off 10s restarting failed container=nginx-server pod=nginx-deployment-1224009977-p3btz_default(d96ed995-977f-11e7-9c7c-02334e72d0a8)"
- Sep 12 18:00:07 k8-app-0.example.com kubelet[1293]: , failed to "StartContainer" for "regup" with CrashLoopBackOff: "Back-off 10s restarting failed container=regup pod=nginx-deployment-1224009977-p3btz_default(d96ed995-977f-11e7-9c7c-02334e72d0a8)"