
Comments (54)

markusthoemmes commented on August 28, 2024

Just seen another occurrence of containerd reporting an exit code while the respective process was still running.

Dec 13 18:58:56 fancy-machine containerd[602]: time="2023-12-13T18:58:56.486265600Z" level=info msg="CreateContainer within sandbox \"18517e9ae953629c515c5381bccb63cf7d66536fc97dfeabd5d1d4a792340b21\" for &ContainerMetadata{Name:blog-os,Attempt:0,} returns container id \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 13 18:58:56 fancy-machine containerd[602]: time="2023-12-13T18:58:56.487074018Z" level=info msg="StartContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 13 18:58:56 fancy-machine containerd[602]: time="2023-12-13T18:58:56.572780483Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a pid=29280
Dec 13 18:58:56 fancy-machine containerd[602]: time="2023-12-13T18:58:56.870377321Z" level=info msg="StartContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" returns successfully"
Dec 27 10:43:55 fancy-machine containerd[602]: time="2023-12-27T10:43:55.203486189Z" level=info msg="StopContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" with timeout 30 (s)"
Dec 27 10:43:55 fancy-machine containerd[602]: time="2023-12-27T10:43:55.268364859Z" level=info msg="Stop container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" with signal terminated"
Dec 27 10:43:58 fancy-machine containerd[602]: time="2023-12-27T10:43:58.132246202Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 10:43:58 fancy-machine containerd[602]: time="2023-12-27T10:43:58.132643821Z" level=error msg="failed to handle container TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}" error="failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 10:43:59 fancy-machine containerd[602]: time="2023-12-27T10:43:59.320747851Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 10:43:59 fancy-machine containerd[602]: time="2023-12-27T10:43:59.360060515Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 10:44:01 fancy-machine containerd[602]: time="2023-12-27T10:44:01.920681887Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 10:44:01 fancy-machine containerd[602]: time="2023-12-27T10:44:01.921129885Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 10:44:04 fancy-machine containerd[602]: time="2023-12-27T10:44:04.320689694Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 10:44:04 fancy-machine containerd[602]: time="2023-12-27T10:44:04.354919559Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 10:44:06 fancy-machine containerd[602]: time="2023-12-27T10:44:06.908520449Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 10:44:06 fancy-machine containerd[602]: time="2023-12-27T10:44:06.908862175Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 10:44:11 fancy-machine containerd[602]: time="2023-12-27T10:44:11.321023051Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 10:44:11 fancy-machine containerd[602]: time="2023-12-27T10:44:11.366364251Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 10:44:13 fancy-machine containerd[602]: time="2023-12-27T10:44:13.900816800Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 10:44:13 fancy-machine containerd[602]: time="2023-12-27T10:44:13.901147811Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 10:44:22 fancy-machine containerd[602]: time="2023-12-27T10:44:22.320618783Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 10:44:22 fancy-machine containerd[602]: time="2023-12-27T10:44:22.350471321Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 10:44:24 fancy-machine containerd[602]: time="2023-12-27T10:44:24.890361031Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 10:44:24 fancy-machine containerd[602]: time="2023-12-27T10:44:24.890671507Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 10:44:25 fancy-machine containerd[602]: time="2023-12-27T10:44:25.359731943Z" level=info msg="Kill container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 27 10:44:41 fancy-machine containerd[602]: time="2023-12-27T10:44:41.320578265Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 10:44:41 fancy-machine containerd[602]: time="2023-12-27T10:44:41.367188947Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 10:44:43 fancy-machine containerd[602]: time="2023-12-27T10:44:43.916552913Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 10:44:43 fancy-machine containerd[602]: time="2023-12-27T10:44:43.916868097Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 10:45:16 fancy-machine containerd[602]: time="2023-12-27T10:45:16.321313852Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 10:45:16 fancy-machine containerd[602]: time="2023-12-27T10:45:16.351156900Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 10:45:18 fancy-machine containerd[602]: time="2023-12-27T10:45:18.893620090Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 10:45:18 fancy-machine containerd[602]: time="2023-12-27T10:45:18.894006674Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 10:46:23 fancy-machine containerd[602]: time="2023-12-27T10:46:23.321499647Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 10:46:23 fancy-machine containerd[602]: time="2023-12-27T10:46:23.350618908Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 10:46:25 fancy-machine containerd[602]: time="2023-12-27T10:46:25.897051670Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 10:46:25 fancy-machine containerd[602]: time="2023-12-27T10:46:25.897450120Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 10:48:34 fancy-machine containerd[602]: time="2023-12-27T10:48:34.320550779Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 10:48:34 fancy-machine containerd[602]: time="2023-12-27T10:48:34.359004059Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 10:48:36 fancy-machine containerd[602]: time="2023-12-27T10:48:36.909530215Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 10:48:36 fancy-machine containerd[602]: time="2023-12-27T10:48:36.910409621Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 10:52:53 fancy-machine containerd[602]: time="2023-12-27T10:52:53.320863379Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 10:52:53 fancy-machine containerd[602]: time="2023-12-27T10:52:53.351557628Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 10:52:55 fancy-machine containerd[602]: time="2023-12-27T10:52:55.900714720Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 10:52:55 fancy-machine containerd[602]: time="2023-12-27T10:52:55.900986827Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 10:57:56 fancy-machine containerd[602]: time="2023-12-27T10:57:56.321356842Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 10:57:56 fancy-machine containerd[602]: time="2023-12-27T10:57:56.355600076Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 10:57:58 fancy-machine containerd[602]: time="2023-12-27T10:57:58.902560761Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 10:57:58 fancy-machine containerd[602]: time="2023-12-27T10:57:58.903009915Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 10:59:25 fancy-machine containerd[602]: time="2023-12-27T10:59:25.205816983Z" level=error msg="StopContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" failed" error="rpc error: code = Canceled desc = an error occurs during waiting for container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" to be killed: wait container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": context canceled"
Dec 27 10:59:25 fancy-machine containerd[602]: time="2023-12-27T10:59:25.206394113Z" level=info msg="Kill container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 27 11:02:59 fancy-machine containerd[602]: time="2023-12-27T11:02:59.321210885Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 11:02:59 fancy-machine containerd[602]: time="2023-12-27T11:02:59.346403108Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 11:03:01 fancy-machine containerd[602]: time="2023-12-27T11:03:01.882268105Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 11:03:01 fancy-machine containerd[602]: time="2023-12-27T11:03:01.882607608Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 11:08:02 fancy-machine containerd[602]: time="2023-12-27T11:08:02.320580479Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 11:08:02 fancy-machine containerd[602]: time="2023-12-27T11:08:02.347364547Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 11:08:04 fancy-machine containerd[602]: time="2023-12-27T11:08:04.897192996Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 11:08:04 fancy-machine containerd[602]: time="2023-12-27T11:08:04.897559001Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 11:13:05 fancy-machine containerd[602]: time="2023-12-27T11:13:05.320773336Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 11:13:05 fancy-machine containerd[602]: time="2023-12-27T11:13:05.362421076Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 11:13:07 fancy-machine containerd[602]: time="2023-12-27T11:13:07.916675967Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 11:13:07 fancy-machine containerd[602]: time="2023-12-27T11:13:07.917434204Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 11:14:25 fancy-machine containerd[602]: time="2023-12-27T11:14:25.205525089Z" level=error msg="StopPodSandbox for \"18517e9ae953629c515c5381bccb63cf7d66536fc97dfeabd5d1d4a792340b21\" failed" error="rpc error: code = Canceled desc = failed to stop container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": an error occurs during waiting for container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" to be killed: wait container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": context canceled"
Dec 27 11:14:25 fancy-machine containerd[602]: time="2023-12-27T11:14:25.259748851Z" level=info msg="StopContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" with timeout 30 (s)"
Dec 27 11:14:25 fancy-machine containerd[602]: time="2023-12-27T11:14:25.260302697Z" level=info msg="Skipping the sending of signal terminated to container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" because a prior stop with timeout>0 request already sent the signal"
Dec 27 11:14:55 fancy-machine containerd[602]: time="2023-12-27T11:14:55.260464684Z" level=info msg="Kill container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 27 11:18:08 fancy-machine containerd[602]: time="2023-12-27T11:18:08.321553442Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 11:18:08 fancy-machine containerd[602]: time="2023-12-27T11:18:08.346909104Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 11:18:10 fancy-machine containerd[602]: time="2023-12-27T11:18:10.885180857Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 11:18:10 fancy-machine containerd[602]: time="2023-12-27T11:18:10.885661545Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 11:23:11 fancy-machine containerd[602]: time="2023-12-27T11:23:11.320754927Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 11:23:11 fancy-machine containerd[602]: time="2023-12-27T11:23:11.370576174Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 11:23:13 fancy-machine containerd[602]: time="2023-12-27T11:23:13.900365136Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 11:23:13 fancy-machine containerd[602]: time="2023-12-27T11:23:13.900645453Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 11:28:14 fancy-machine containerd[602]: time="2023-12-27T11:28:14.321353075Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 11:28:14 fancy-machine containerd[602]: time="2023-12-27T11:28:14.372025568Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 11:28:16 fancy-machine containerd[602]: time="2023-12-27T11:28:16.916040260Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 11:28:16 fancy-machine containerd[602]: time="2023-12-27T11:28:16.916423825Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 11:29:55 fancy-machine containerd[602]: time="2023-12-27T11:29:55.260051084Z" level=error msg="StopContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" failed" error="rpc error: code = Canceled desc = an error occurs during waiting for container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" to be killed: wait container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": context canceled"
Dec 27 11:29:55 fancy-machine containerd[602]: time="2023-12-27T11:29:55.261764027Z" level=info msg="Kill container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 27 11:33:17 fancy-machine containerd[602]: time="2023-12-27T11:33:17.320751273Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 11:33:17 fancy-machine containerd[602]: time="2023-12-27T11:33:17.354596559Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 11:33:19 fancy-machine containerd[602]: time="2023-12-27T11:33:19.892244789Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 11:33:19 fancy-machine containerd[602]: time="2023-12-27T11:33:19.892550480Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 11:38:20 fancy-machine containerd[602]: time="2023-12-27T11:38:20.321402248Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 11:38:20 fancy-machine containerd[602]: time="2023-12-27T11:38:20.371695346Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 11:38:22 fancy-machine containerd[602]: time="2023-12-27T11:38:22.928460729Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 11:38:22 fancy-machine containerd[602]: time="2023-12-27T11:38:22.928855511Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 11:43:23 fancy-machine containerd[602]: time="2023-12-27T11:43:23.320694122Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 11:43:23 fancy-machine containerd[602]: time="2023-12-27T11:43:23.358964313Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 11:43:25 fancy-machine containerd[602]: time="2023-12-27T11:43:25.928893030Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 11:43:25 fancy-machine containerd[602]: time="2023-12-27T11:43:25.929203012Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 11:44:55 fancy-machine containerd[602]: time="2023-12-27T11:44:55.265958770Z" level=error msg="StopPodSandbox for \"18517e9ae953629c515c5381bccb63cf7d66536fc97dfeabd5d1d4a792340b21\" failed" error="rpc error: code = Canceled desc = failed to stop container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": an error occurs during waiting for container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" to be killed: wait container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": context canceled"
Dec 27 11:44:56 fancy-machine containerd[602]: time="2023-12-27T11:44:56.072959725Z" level=info msg="StopContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" with timeout 30 (s)"
Dec 27 11:44:56 fancy-machine containerd[602]: time="2023-12-27T11:44:56.073481240Z" level=info msg="Skipping the sending of signal terminated to container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" because a prior stop with timeout>0 request already sent the signal"
Dec 27 11:45:26 fancy-machine containerd[602]: time="2023-12-27T11:45:26.074257606Z" level=info msg="Kill container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 27 11:48:26 fancy-machine containerd[602]: time="2023-12-27T11:48:26.320886179Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 11:48:26 fancy-machine containerd[602]: time="2023-12-27T11:48:26.354526275Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 11:48:28 fancy-machine containerd[602]: time="2023-12-27T11:48:28.900804992Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 11:48:28 fancy-machine containerd[602]: time="2023-12-27T11:48:28.901232256Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 11:53:29 fancy-machine containerd[602]: time="2023-12-27T11:53:29.321066456Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 11:53:29 fancy-machine containerd[602]: time="2023-12-27T11:53:29.359382279Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 11:53:31 fancy-machine containerd[602]: time="2023-12-27T11:53:31.917832563Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 11:53:31 fancy-machine containerd[602]: time="2023-12-27T11:53:31.918244298Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 11:58:32 fancy-machine containerd[602]: time="2023-12-27T11:58:32.321443987Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 11:58:32 fancy-machine containerd[602]: time="2023-12-27T11:58:32.370647507Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 11:58:34 fancy-machine containerd[602]: time="2023-12-27T11:58:34.936038349Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 11:58:34 fancy-machine containerd[602]: time="2023-12-27T11:58:34.936321926Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 12:00:26 fancy-machine containerd[602]: time="2023-12-27T12:00:26.073355453Z" level=error msg="StopContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" failed" error="rpc error: code = DeadlineExceeded desc = an error occurs during waiting for container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" to be killed: wait container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": context deadline exceeded"
Dec 27 12:00:26 fancy-machine containerd[602]: time="2023-12-27T12:00:26.074777747Z" level=info msg="Kill container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 27 12:03:35 fancy-machine containerd[602]: time="2023-12-27T12:03:35.321711497Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 12:03:35 fancy-machine containerd[602]: time="2023-12-27T12:03:35.367398394Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 12:03:37 fancy-machine containerd[602]: time="2023-12-27T12:03:37.908786189Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 12:03:37 fancy-machine containerd[602]: time="2023-12-27T12:03:37.909169990Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 12:08:38 fancy-machine containerd[602]: time="2023-12-27T12:08:38.321919815Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 12:08:38 fancy-machine containerd[602]: time="2023-12-27T12:08:38.375252620Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 12:08:40 fancy-machine containerd[602]: time="2023-12-27T12:08:40.920399864Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 12:08:40 fancy-machine containerd[602]: time="2023-12-27T12:08:40.921266937Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 12:13:41 fancy-machine containerd[602]: time="2023-12-27T12:13:41.321043471Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 12:13:41 fancy-machine containerd[602]: time="2023-12-27T12:13:41.346366882Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 12:13:43 fancy-machine containerd[602]: time="2023-12-27T12:13:43.881976188Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 12:13:43 fancy-machine containerd[602]: time="2023-12-27T12:13:43.882339531Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 12:15:26 fancy-machine containerd[602]: time="2023-12-27T12:15:26.074740369Z" level=error msg="StopPodSandbox for \"18517e9ae953629c515c5381bccb63cf7d66536fc97dfeabd5d1d4a792340b21\" failed" error="rpc error: code = Canceled desc = failed to stop container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": an error occurs during waiting for container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" to be killed: wait container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": context canceled"
Dec 27 12:15:26 fancy-machine containerd[602]: time="2023-12-27T12:15:26.835437084Z" level=info msg="StopContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" with timeout 30 (s)"
Dec 27 12:15:26 fancy-machine containerd[602]: time="2023-12-27T12:15:26.835963491Z" level=info msg="Skipping the sending of signal terminated to container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" because a prior stop with timeout>0 request already sent the signal"
Dec 27 12:15:56 fancy-machine containerd[602]: time="2023-12-27T12:15:56.836092777Z" level=info msg="Kill container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 27 12:18:44 fancy-machine containerd[602]: time="2023-12-27T12:18:44.320986197Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 12:18:44 fancy-machine containerd[602]: time="2023-12-27T12:18:44.362803161Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 12:18:46 fancy-machine containerd[602]: time="2023-12-27T12:18:46.904671420Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 12:18:46 fancy-machine containerd[602]: time="2023-12-27T12:18:46.904984553Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 12:23:47 fancy-machine containerd[602]: time="2023-12-27T12:23:47.320968708Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 12:23:47 fancy-machine containerd[602]: time="2023-12-27T12:23:47.374836110Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 12:23:49 fancy-machine containerd[602]: time="2023-12-27T12:23:49.923933928Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 12:23:49 fancy-machine containerd[602]: time="2023-12-27T12:23:49.924276692Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 12:28:50 fancy-machine containerd[602]: time="2023-12-27T12:28:50.321219302Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 12:28:50 fancy-machine containerd[602]: time="2023-12-27T12:28:50.350449374Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 12:28:52 fancy-machine containerd[602]: time="2023-12-27T12:28:52.900217479Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 12:28:52 fancy-machine containerd[602]: time="2023-12-27T12:28:52.900568956Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 12:30:56 fancy-machine containerd[602]: time="2023-12-27T12:30:56.836104297Z" level=error msg="StopContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" failed" error="rpc error: code = Canceled desc = an error occurs during waiting for container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" to be killed: wait container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": context canceled"
Dec 27 12:30:56 fancy-machine containerd[602]: time="2023-12-27T12:30:56.836993135Z" level=info msg="Kill container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 27 12:33:53 fancy-machine containerd[602]: time="2023-12-27T12:33:53.321524453Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 12:33:53 fancy-machine containerd[602]: time="2023-12-27T12:33:53.371254331Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 12:33:55 fancy-machine containerd[602]: time="2023-12-27T12:33:55.912796444Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 12:33:55 fancy-machine containerd[602]: time="2023-12-27T12:33:55.913139484Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 12:38:56 fancy-machine containerd[602]: time="2023-12-27T12:38:56.321200103Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 12:38:56 fancy-machine containerd[602]: time="2023-12-27T12:38:56.362710078Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 12:38:58 fancy-machine containerd[602]: time="2023-12-27T12:38:58.913383132Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 12:38:58 fancy-machine containerd[602]: time="2023-12-27T12:38:58.913961441Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 12:43:59 fancy-machine containerd[602]: time="2023-12-27T12:43:59.321275197Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 12:43:59 fancy-machine containerd[602]: time="2023-12-27T12:43:59.346751004Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 12:44:01 fancy-machine containerd[602]: time="2023-12-27T12:44:01.901271153Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 12:44:01 fancy-machine containerd[602]: time="2023-12-27T12:44:01.901623502Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 12:45:56 fancy-machine containerd[602]: time="2023-12-27T12:45:56.836786437Z" level=error msg="StopPodSandbox for \"18517e9ae953629c515c5381bccb63cf7d66536fc97dfeabd5d1d4a792340b21\" failed" error="rpc error: code = Canceled desc = failed to stop container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": an error occurs during waiting for container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" to be killed: wait container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": context canceled"
Dec 27 12:45:57 fancy-machine containerd[602]: time="2023-12-27T12:45:57.502235878Z" level=info msg="StopContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" with timeout 30 (s)"
Dec 27 12:45:57 fancy-machine containerd[602]: time="2023-12-27T12:45:57.502711217Z" level=info msg="Skipping the sending of signal terminated to container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" because a prior stop with timeout>0 request already sent the signal"
Dec 27 12:46:27 fancy-machine containerd[602]: time="2023-12-27T12:46:27.503451959Z" level=info msg="Kill container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 27 12:49:02 fancy-machine containerd[602]: time="2023-12-27T12:49:02.320594055Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 12:49:02 fancy-machine containerd[602]: time="2023-12-27T12:49:02.351580645Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 12:49:04 fancy-machine containerd[602]: time="2023-12-27T12:49:04.904325183Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 12:49:04 fancy-machine containerd[602]: time="2023-12-27T12:49:04.905173634Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 12:54:05 fancy-machine containerd[602]: time="2023-12-27T12:54:05.320885430Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 12:54:05 fancy-machine containerd[602]: time="2023-12-27T12:54:05.371477781Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 12:54:07 fancy-machine containerd[602]: time="2023-12-27T12:54:07.908998846Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 12:54:07 fancy-machine containerd[602]: time="2023-12-27T12:54:07.909376655Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 12:59:08 fancy-machine containerd[602]: time="2023-12-27T12:59:08.321099573Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 12:59:08 fancy-machine containerd[602]: time="2023-12-27T12:59:08.347282535Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 12:59:10 fancy-machine containerd[602]: time="2023-12-27T12:59:10.894398706Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 12:59:10 fancy-machine containerd[602]: time="2023-12-27T12:59:10.894756351Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 13:01:27 fancy-machine containerd[602]: time="2023-12-27T13:01:27.502578344Z" level=error msg="StopContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" failed" error="rpc error: code = DeadlineExceeded desc = an error occurs during waiting for container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" to be killed: wait container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": context deadline exceeded"
Dec 27 13:01:27 fancy-machine containerd[602]: time="2023-12-27T13:01:27.503706816Z" level=info msg="Kill container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 27 13:04:11 fancy-machine containerd[602]: time="2023-12-27T13:04:11.320678920Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 13:04:11 fancy-machine containerd[602]: time="2023-12-27T13:04:11.362571404Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 13:04:13 fancy-machine containerd[602]: time="2023-12-27T13:04:13.908003011Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 13:04:13 fancy-machine containerd[602]: time="2023-12-27T13:04:13.908318234Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 13:09:14 fancy-machine containerd[602]: time="2023-12-27T13:09:14.321338288Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 13:09:14 fancy-machine containerd[602]: time="2023-12-27T13:09:14.350737131Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 13:09:16 fancy-machine containerd[602]: time="2023-12-27T13:09:16.888174691Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 13:09:16 fancy-machine containerd[602]: time="2023-12-27T13:09:16.888492638Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 13:14:17 fancy-machine containerd[602]: time="2023-12-27T13:14:17.321216986Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 13:14:17 fancy-machine containerd[602]: time="2023-12-27T13:14:17.346786004Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 13:14:19 fancy-machine containerd[602]: time="2023-12-27T13:14:19.895798019Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 13:14:19 fancy-machine containerd[602]: time="2023-12-27T13:14:19.896169463Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 13:16:27 fancy-machine containerd[602]: time="2023-12-27T13:16:27.503653058Z" level=error msg="StopPodSandbox for \"18517e9ae953629c515c5381bccb63cf7d66536fc97dfeabd5d1d4a792340b21\" failed" error="rpc error: code = DeadlineExceeded desc = failed to stop container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": an error occurs during waiting for container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" to be killed: wait container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": context deadline exceeded"
Dec 27 13:16:28 fancy-machine containerd[602]: time="2023-12-27T13:16:28.219746794Z" level=info msg="StopContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" with timeout 30 (s)"
Dec 27 13:16:28 fancy-machine containerd[602]: time="2023-12-27T13:16:28.220240937Z" level=info msg="Skipping the sending of signal terminated to container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" because a prior stop with timeout>0 request already sent the signal"
Dec 27 13:16:58 fancy-machine containerd[602]: time="2023-12-27T13:16:58.220750090Z" level=info msg="Kill container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 27 13:19:20 fancy-machine containerd[602]: time="2023-12-27T13:19:20.320950911Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 13:19:20 fancy-machine containerd[602]: time="2023-12-27T13:19:20.346618034Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 13:19:22 fancy-machine containerd[602]: time="2023-12-27T13:19:22.889375596Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 13:19:22 fancy-machine containerd[602]: time="2023-12-27T13:19:22.889668554Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 13:24:23 fancy-machine containerd[602]: time="2023-12-27T13:24:23.320983043Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 13:24:23 fancy-machine containerd[602]: time="2023-12-27T13:24:23.354973509Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 13:24:25 fancy-machine containerd[602]: time="2023-12-27T13:24:25.883506142Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 13:24:25 fancy-machine containerd[602]: time="2023-12-27T13:24:25.883792686Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 13:29:26 fancy-machine containerd[602]: time="2023-12-27T13:29:26.321281752Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 13:29:26 fancy-machine containerd[602]: time="2023-12-27T13:29:26.351097506Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 13:29:28 fancy-machine containerd[602]: time="2023-12-27T13:29:28.893908947Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 13:29:28 fancy-machine containerd[602]: time="2023-12-27T13:29:28.894340199Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 13:31:58 fancy-machine containerd[602]: time="2023-12-27T13:31:58.219876189Z" level=error msg="StopContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" failed" error="rpc error: code = Canceled desc = an error occurs during waiting for container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" to be killed: wait container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": context canceled"
Dec 27 13:31:58 fancy-machine containerd[602]: time="2023-12-27T13:31:58.220803715Z" level=info msg="Kill container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 27 13:34:29 fancy-machine containerd[602]: time="2023-12-27T13:34:29.320782020Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 13:34:29 fancy-machine containerd[602]: time="2023-12-27T13:34:29.366745945Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 13:34:31 fancy-machine containerd[602]: time="2023-12-27T13:34:31.899591196Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 13:34:31 fancy-machine containerd[602]: time="2023-12-27T13:34:31.899929978Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 13:39:32 fancy-machine containerd[602]: time="2023-12-27T13:39:32.320611674Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 13:39:32 fancy-machine containerd[602]: time="2023-12-27T13:39:32.354708863Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 13:39:34 fancy-machine containerd[602]: time="2023-12-27T13:39:34.913233616Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 13:39:34 fancy-machine containerd[602]: time="2023-12-27T13:39:34.913591171Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 13:44:35 fancy-machine containerd[602]: time="2023-12-27T13:44:35.321578028Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 13:44:35 fancy-machine containerd[602]: time="2023-12-27T13:44:35.346286702Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 13:44:37 fancy-machine containerd[602]: time="2023-12-27T13:44:37.880072596Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 13:44:37 fancy-machine containerd[602]: time="2023-12-27T13:44:37.880363856Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 13:46:58 fancy-machine containerd[602]: time="2023-12-27T13:46:58.221049040Z" level=error msg="StopPodSandbox for \"18517e9ae953629c515c5381bccb63cf7d66536fc97dfeabd5d1d4a792340b21\" failed" error="rpc error: code = Canceled desc = failed to stop container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": an error occurs during waiting for container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" to be killed: wait container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": context canceled"
Dec 27 13:46:58 fancy-machine containerd[602]: time="2023-12-27T13:46:58.892947368Z" level=info msg="StopContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" with timeout 30 (s)"
Dec 27 13:46:58 fancy-machine containerd[602]: time="2023-12-27T13:46:58.894185933Z" level=info msg="Skipping the sending of signal terminated to container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" because a prior stop with timeout>0 request already sent the signal"
Dec 27 13:47:28 fancy-machine containerd[602]: time="2023-12-27T13:47:28.894779036Z" level=info msg="Kill container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 27 13:49:38 fancy-machine containerd[602]: time="2023-12-27T13:49:38.321099753Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 13:49:38 fancy-machine containerd[602]: time="2023-12-27T13:49:38.358754481Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 13:49:40 fancy-machine containerd[602]: time="2023-12-27T13:49:40.893235371Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 13:49:40 fancy-machine containerd[602]: time="2023-12-27T13:49:40.893549795Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 13:54:41 fancy-machine containerd[602]: time="2023-12-27T13:54:41.320563512Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 13:54:41 fancy-machine containerd[602]: time="2023-12-27T13:54:41.350864584Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 13:54:43 fancy-machine containerd[602]: time="2023-12-27T13:54:43.896043613Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 13:54:43 fancy-machine containerd[602]: time="2023-12-27T13:54:43.896324831Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 13:59:44 fancy-machine containerd[602]: time="2023-12-27T13:59:44.321013426Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 13:59:44 fancy-machine containerd[602]: time="2023-12-27T13:59:44.362207661Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 13:59:46 fancy-machine containerd[602]: time="2023-12-27T13:59:46.925443875Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 13:59:46 fancy-machine containerd[602]: time="2023-12-27T13:59:46.925802320Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 14:02:28 fancy-machine containerd[602]: time="2023-12-27T14:02:28.892672822Z" level=error msg="StopContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" failed" error="rpc error: code = Canceled desc = an error occurs during waiting for container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" to be killed: wait container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": context canceled"
Dec 27 14:02:28 fancy-machine containerd[602]: time="2023-12-27T14:02:28.893465703Z" level=info msg="Kill container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 27 14:04:47 fancy-machine containerd[602]: time="2023-12-27T14:04:47.321093157Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 14:04:47 fancy-machine containerd[602]: time="2023-12-27T14:04:47.355111898Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 14:04:49 fancy-machine containerd[602]: time="2023-12-27T14:04:49.899840172Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 14:04:49 fancy-machine containerd[602]: time="2023-12-27T14:04:49.900190557Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 14:09:50 fancy-machine containerd[602]: time="2023-12-27T14:09:50.321447569Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 14:09:50 fancy-machine containerd[602]: time="2023-12-27T14:09:50.358521671Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 14:09:52 fancy-machine containerd[602]: time="2023-12-27T14:09:52.916518630Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 14:09:52 fancy-machine containerd[602]: time="2023-12-27T14:09:52.917310243Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 14:14:53 fancy-machine containerd[602]: time="2023-12-27T14:14:53.320678836Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 14:14:53 fancy-machine containerd[602]: time="2023-12-27T14:14:53.355578686Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 14:14:55 fancy-machine containerd[602]: time="2023-12-27T14:14:55.909235080Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 14:14:55 fancy-machine containerd[602]: time="2023-12-27T14:14:55.909630385Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 14:17:28 fancy-machine containerd[602]: time="2023-12-27T14:17:28.893473917Z" level=error msg="StopPodSandbox for \"18517e9ae953629c515c5381bccb63cf7d66536fc97dfeabd5d1d4a792340b21\" failed" error="rpc error: code = Canceled desc = failed to stop container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": an error occurs during waiting for container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" to be killed: wait container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": context canceled"
Dec 27 14:17:29 fancy-machine containerd[602]: time="2023-12-27T14:17:29.587314883Z" level=info msg="StopContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" with timeout 30 (s)"
Dec 27 14:17:29 fancy-machine containerd[602]: time="2023-12-27T14:17:29.587868474Z" level=info msg="Skipping the sending of signal terminated to container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" because a prior stop with timeout>0 request already sent the signal"
Dec 27 14:17:59 fancy-machine containerd[602]: time="2023-12-27T14:17:59.588892538Z" level=info msg="Kill container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 27 14:19:56 fancy-machine containerd[602]: time="2023-12-27T14:19:56.321069935Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 14:19:56 fancy-machine containerd[602]: time="2023-12-27T14:19:56.362345208Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 14:19:58 fancy-machine containerd[602]: time="2023-12-27T14:19:58.900210494Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 14:19:58 fancy-machine containerd[602]: time="2023-12-27T14:19:58.900539062Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 14:24:59 fancy-machine containerd[602]: time="2023-12-27T14:24:59.321439176Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 14:24:59 fancy-machine containerd[602]: time="2023-12-27T14:24:59.355937790Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 14:25:01 fancy-machine containerd[602]: time="2023-12-27T14:25:01.893093320Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 14:25:01 fancy-machine containerd[602]: time="2023-12-27T14:25:01.893396616Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 14:30:02 fancy-machine containerd[602]: time="2023-12-27T14:30:02.321217709Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 14:30:02 fancy-machine containerd[602]: time="2023-12-27T14:30:02.366723864Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 14:30:04 fancy-machine containerd[602]: time="2023-12-27T14:30:04.912769844Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 14:30:04 fancy-machine containerd[602]: time="2023-12-27T14:30:04.912460706Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 14:32:59 fancy-machine containerd[602]: time="2023-12-27T14:32:59.588014220Z" level=error msg="StopContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" failed" error="rpc error: code = DeadlineExceeded desc = an error occurs during waiting for container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" to be killed: wait container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": context deadline exceeded"
Dec 27 14:32:59 fancy-machine containerd[602]: time="2023-12-27T14:32:59.589500817Z" level=info msg="Kill container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 27 14:35:05 fancy-machine containerd[602]: time="2023-12-27T14:35:05.320959295Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 14:35:05 fancy-machine containerd[602]: time="2023-12-27T14:35:05.363030933Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 14:35:07 fancy-machine containerd[602]: time="2023-12-27T14:35:07.899926424Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 14:35:07 fancy-machine containerd[602]: time="2023-12-27T14:35:07.900197100Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 14:40:08 fancy-machine containerd[602]: time="2023-12-27T14:40:08.321003596Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 14:40:08 fancy-machine containerd[602]: time="2023-12-27T14:40:08.351179281Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 14:40:10 fancy-machine containerd[602]: time="2023-12-27T14:40:10.891890379Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 14:40:10 fancy-machine containerd[602]: time="2023-12-27T14:40:10.892196676Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 14:45:11 fancy-machine containerd[602]: time="2023-12-27T14:45:11.320979119Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 14:45:11 fancy-machine containerd[602]: time="2023-12-27T14:45:11.346236823Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 14:45:13 fancy-machine containerd[602]: time="2023-12-27T14:45:13.884235147Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 14:45:13 fancy-machine containerd[602]: time="2023-12-27T14:45:13.884568252Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 14:47:59 fancy-machine containerd[602]: time="2023-12-27T14:47:59.589149269Z" level=error msg="StopPodSandbox for \"18517e9ae953629c515c5381bccb63cf7d66536fc97dfeabd5d1d4a792340b21\" failed" error="rpc error: code = DeadlineExceeded desc = failed to stop container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": an error occurs during waiting for container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" to be killed: wait container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": context deadline exceeded"
Dec 27 14:48:00 fancy-machine containerd[602]: time="2023-12-27T14:48:00.312583308Z" level=info msg="StopContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" with timeout 30 (s)"
Dec 27 14:48:00 fancy-machine containerd[602]: time="2023-12-27T14:48:00.313085639Z" level=info msg="Skipping the sending of signal terminated to container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" because a prior stop with timeout>0 request already sent the signal"
Dec 27 14:48:30 fancy-machine containerd[602]: time="2023-12-27T14:48:30.313462730Z" level=info msg="Kill container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 27 14:50:14 fancy-machine containerd[602]: time="2023-12-27T14:50:14.321509361Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 14:50:14 fancy-machine containerd[602]: time="2023-12-27T14:50:14.362522754Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 14:50:16 fancy-machine containerd[602]: time="2023-12-27T14:50:16.912842802Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 14:50:16 fancy-machine containerd[602]: time="2023-12-27T14:50:16.913269262Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 14:55:17 fancy-machine containerd[602]: time="2023-12-27T14:55:17.321254400Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 14:55:17 fancy-machine containerd[602]: time="2023-12-27T14:55:17.355692313Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 14:55:19 fancy-machine containerd[602]: time="2023-12-27T14:55:19.909404357Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 14:55:19 fancy-machine containerd[602]: time="2023-12-27T14:55:19.909761889Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 15:00:20 fancy-machine containerd[602]: time="2023-12-27T15:00:20.321454327Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 15:00:20 fancy-machine containerd[602]: time="2023-12-27T15:00:20.362744854Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 15:00:22 fancy-machine containerd[602]: time="2023-12-27T15:00:22.904767760Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 15:00:22 fancy-machine containerd[602]: time="2023-12-27T15:00:22.905090613Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 15:03:30 fancy-machine containerd[602]: time="2023-12-27T15:03:30.312782700Z" level=error msg="StopContainer for \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" failed" error="rpc error: code = Canceled desc = an error occurs during waiting for container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" to be killed: wait container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": context canceled"
Dec 27 15:03:30 fancy-machine containerd[602]: time="2023-12-27T15:03:30.315478522Z" level=info msg="Kill container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\""
Dec 27 15:05:23 fancy-machine containerd[602]: time="2023-12-27T15:05:23.321157730Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 15:05:23 fancy-machine containerd[602]: time="2023-12-27T15:05:23.350507841Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 15:05:25 fancy-machine containerd[602]: time="2023-12-27T15:05:25.891997844Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 15:05:25 fancy-machine containerd[602]: time="2023-12-27T15:05:25.892299576Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"
Dec 27 15:10:26 fancy-machine containerd[602]: time="2023-12-27T15:10:26.320983632Z" level=info msg="TaskExit event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856}"
Dec 27 15:10:26 fancy-machine containerd[602]: time="2023-12-27T15:10:26.352448171Z" level=warning msg="Ignoring error killing container \"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Dec 27 15:10:28 fancy-machine containerd[602]: time="2023-12-27T15:10:28.897571677Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Dec 27 15:10:28 fancy-machine containerd[602]: time="2023-12-27T15:10:28.897920677Z" level=error msg="Failed to handle backOff event container_id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" id:\"dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a\" pid:29098 exit_status:143 exited_at:{seconds:1703673835 nanos:457869856} for dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/dd57742f947c373043ac39c78d0ee7a1d01f02a92c1df4b22bff704dc9cc754a/rootfs: device or resource busy: unknown"

from gvisor.

markusthoemmes commented on August 28, 2024

Thanks @ayushr2. Here we go, I've gotten it to reproduce with the PodSpec below. It's been a little finicky, so I left seemingly unrelated parts in as well. FWIW, I've tested this on the latest gVisor release too, just in case. Same behavior there.

So as a reminder:

K8s version: 1.28.2
containerd version: 1.7.8
gvisor version: release-20240109.0
apiVersion: v1
kind: Pod
metadata:
  name: repro
spec:
  restartPolicy: Always
  containers:
  - name: repro
    image: shlinkio/shlink@sha256:c70cf1b37087581cfcb7963d74d6c13fbee8555a7b10aa4af0493e70ade41202
    env:
    - name: INITIAL_API_KEY
      value: foobar
    - name: DEFAULT_DOMAIN
      value: foo.bar
    resources:
      limits:
        cpu: "1"
        ephemeral-storage: 4G
        memory: 2Gi
      requests:
        cpu: 200m
        ephemeral-storage: 400M
        memory: "858993459"

You can trip the issue by running:

$ sleep 60 # Not sure if this is really doing anything, but it seems to have improved the likelihood of it happening
$ kubectl exec repro -- dd if=/dev/zero of=./big_file bs=4k iflag=fullblock,count_bytes count=10G
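As a side note on the `dd` flags above (my reading, not from the thread): `iflag=count_bytes` makes `count` a byte total rather than a block count, so this writes 10G of zeros in 4 KiB blocks, well past the pod's 4G ephemeral-storage limit. A harmless local sketch of the same invocation, scaled down (assumes GNU coreutils; the `/tmp` path is arbitrary):

```shell
# Same flags as the repro, scaled down: count_bytes makes count=1M mean
# 1 MiB total instead of 1M blocks; fullblock forces full input reads.
dd if=/dev/zero of=/tmp/small_file bs=4k iflag=fullblock,count_bytes count=1M 2>/dev/null
stat -c '%s' /tmp/small_file   # 1048576
rm -f /tmp/small_file
```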

Sometimes this works fine, leaving the pod in an Error state. Sometimes the pod doesn't hang but goes into a ContainerStatusUnknown state (which feels wrong too), and sometimes the container just stays Running (even though it can no longer be execed into). If you then try to delete it, the same hang as described above occurs.

For some reason that is still beyond me, it happens much more consistently when doing the same thing to the pods of the following deployment:

apiVersion: v1
data:
  DEFAULT_DOMAIN: Zm9vLmJhcg==
  GEOLITE_LICENSE_KEY: ""
  INITIAL_API_KEY: Zm9vYmFy
kind: Secret
metadata:
  name: shlinkio-shlink-secrets
type: Opaque
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: shlinkio-shlink
spec:
  minReadySeconds: 10
  progressDeadlineSeconds: 730
  replicas: 1
  revisionHistoryLimit: 10
  selector:
    matchLabels:
      app-component: shlinkio-shlink
  strategy:
    rollingUpdate:
      maxSurge: 25%
      maxUnavailable: 25%
    type: RollingUpdate
  template:
    metadata:
      annotations:
        env-var-hash: 1c3af241bb8862b3fc57e28a31797c39
      creationTimestamp: null
      labels:
        app-component: shlinkio-shlink
    spec:
      automountServiceAccountToken: false
      containers:
      - env:
        - name: CNB_PROCESS_TYPE
          value: web
        - name: KUBERNETES_PORT
        - name: KUBERNETES_PORT_443_TCP
        - name: KUBERNETES_PORT_443_TCP_ADDR
        - name: KUBERNETES_PORT_443_TCP_PORT
        - name: KUBERNETES_PORT_443_TCP_PROTO
        - name: KUBERNETES_SERVICE_HOST
        - name: KUBERNETES_SERVICE_PORT
        - name: KUBERNETES_SERVICE_PORT_HTTPS
        - name: PORT
          value: "8080"
        envFrom:
        - secretRef:
            name: shlinkio-shlink-secrets
        image: shlinkio/shlink@sha256:c70cf1b37087581cfcb7963d74d6c13fbee8555a7b10aa4af0493e70ade41202
        imagePullPolicy: IfNotPresent
        name: shlinkio-shlink
        ports:
        - containerPort: 8080
          name: http-8080
          protocol: TCP
        readinessProbe:
          failureThreshold: 9
          initialDelaySeconds: 30
          periodSeconds: 10
          successThreshold: 1
          tcpSocket:
            port: 8080
          timeoutSeconds: 5
        resources:
          limits:
            cpu: "1"
            ephemeral-storage: 4G
            memory: 2Gi
          requests:
            cpu: 200m
            ephemeral-storage: 400M
            memory: "858993459"
        terminationMessagePath: /dev/termination-log
        terminationMessagePolicy: File
      enableServiceLinks: false
      restartPolicy: Always
      schedulerName: default-scheduler
      securityContext: {}
      terminationGracePeriodSeconds: 30
      topologySpreadConstraints:
      - labelSelector:
          matchLabels:
            app-component: shlinkio-shlink
        maxSkew: 1
        topologyKey: kubernetes.io/hostname
        whenUnsatisfiable: ScheduleAnyway


markusthoemmes commented on August 28, 2024

We're in the process of rolling out overlay2=off now to see if this improves the behavior at scale. I'll report back eventually. @zpavlinovic maybe that's something you could try out as well, if you're seeing the same symptoms.


ayushr2 commented on August 28, 2024

overlay2=none should be the right setting. Thanks, let me know.
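For reference (my addition, and treating the exact file path as an assumption for your setup), the flag can be set via the shim's configuration file — the `runsc.toml` that the `ConfigPath` option in your containerd runtime config points at, per gVisor's containerd setup docs:

```toml
# /etc/containerd/runsc.toml -- hypothetical path; use whatever ConfigPath
# references in your containerd runtime options for runsc.
[runsc_config]
  overlay2 = "none"
```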


markusthoemmes commented on August 28, 2024

From the logs and from following a bit more of what the shim does, it looks like the shim thought it had actually killed the correct PID, and it even prints an exit code of 143, which would've been a SIGTERM exit. Buuuut, the respective process was definitely still running. I'm puzzled.

In another instance of the same issue, containerd did not print an exit code, but it again printed the correct PID and an exited_at timestamp.
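As a sanity check on the 143 value itself (my note, not from the thread): shells and runtimes conventionally report a death-by-signal as 128 + the signal number, so SIGTERM (15) yields 143. A minimal local demo:

```shell
# A process killed by SIGTERM reports exit status 128 + 15 = 143.
sleep 30 &
pid=$!
kill -TERM "$pid"
status=0; wait "$pid" || status=$?
echo "exit status: $status"   # prints "exit status: 143"
```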


markusthoemmes commented on August 28, 2024

Here's some more information. This time we didn't get the umount issues (which I believe are only a follow-up problem to the kill not being effective anyway), but I got the chance to manually run runsc kill with debug logging enabled. It doesn't really tell me anything useful, though. In this case, containerd did not think that the process was stopped. Both runsc kill invocations (first against the container, then against the sandbox directly) did nothing to actually stop either the container or the sandbox. It looked like the runsc-gofer went away, though.
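A quick way to double-check whether a PID from a TaskExit event is actually gone is `kill -0`, which probes for process existence without delivering any signal. A sketch (my addition), using the current shell's PID as a stand-in for the sandbox PID:

```shell
# kill -0 sends no signal; it only checks that the PID exists and that
# we are allowed to signal it. Substitute the PID from the TaskExit event.
pid=$$
if kill -0 "$pid" 2>/dev/null; then
  echo "process $pid is still alive"
else
  echo "process $pid is gone"
fi
```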

$ journalctl -u containerd | grep ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478
Dec 20 11:32:21 fancy-machine containerd[599]: time="2023-12-20T11:32:21.640759516Z" level=info msg="CreateContainer within sandbox \"52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707\" for &ContainerMetadata{Name:service,Attempt:0,} returns container id \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\""
Dec 20 11:32:21 fancy-machine containerd[599]: time="2023-12-20T11:32:21.641417411Z" level=info msg="StartContainer for \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\""
Dec 20 11:32:21 fancy-machine containerd[599]: time="2023-12-20T11:32:21.736554591Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478 pid=884345
Dec 20 11:32:22 fancy-machine containerd[599]: time="2023-12-20T11:32:22.050765688Z" level=info msg="StartContainer for \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\" returns successfully"
Dec 22 09:20:41 fancy-machine containerd[599]: time="2023-12-22T09:20:41.057453474Z" level=info msg="StopContainer for \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\" with timeout 30 (s)"
Dec 22 09:20:41 fancy-machine containerd[599]: time="2023-12-22T09:20:41.102373039Z" level=info msg="Stop container \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\" with signal terminated"
Dec 22 09:21:11 fancy-machine containerd[599]: time="2023-12-22T09:21:11.182849104Z" level=info msg="Kill container \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\""
Dec 22 09:36:11 fancy-machine containerd[599]: time="2023-12-22T09:36:11.057405548Z" level=error msg="StopContainer for \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\" failed" error="rpc error: code = Canceled desc = an error occurs during waiting for container \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\" to be killed: wait container \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\": context canceled"
Dec 22 09:36:11 fancy-machine containerd[599]: time="2023-12-22T09:36:11.204889744Z" level=info msg="Kill container \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\""
Dec 22 09:51:11 fancy-machine containerd[599]: time="2023-12-22T09:51:11.057962413Z" level=error msg="StopPodSandbox for \"52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707\" failed" error="rpc error: code = Canceled desc = failed to stop container \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\": an error occurs during waiting for container \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\" to be killed: wait container \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\": context canceled"
Dec 22 09:51:11 fancy-machine containerd[599]: time="2023-12-22T09:51:11.739212354Z" level=info msg="StopContainer for \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\" with timeout 30 (s)"
Dec 22 09:51:11 fancy-machine containerd[599]: time="2023-12-22T09:51:11.794469456Z" level=info msg="Skipping the sending of signal terminated to container \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\" because a prior stop with timeout>0 request already sent the signal"
Dec 22 09:51:41 fancy-machine containerd[599]: time="2023-12-22T09:51:41.795213941Z" level=info msg="Kill container \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\""
Dec 22 10:06:41 fancy-machine containerd[599]: time="2023-12-22T10:06:41.739067786Z" level=error msg="StopContainer for \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\" failed" error="rpc error: code = Canceled desc = an error occurs during waiting for container \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\" to be killed: wait container \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\": context canceled"
Dec 22 10:06:41 fancy-machine containerd[599]: time="2023-12-22T10:06:41.792177189Z" level=info msg="Kill container \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\""
$ journalctl -u containerd | grep 52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707
Dec 20 11:32:21 fancy-machine containerd[599]: time="2023-12-20T11:32:21.105809959Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707 pid=884178
Dec 20 11:32:21 fancy-machine containerd[599]: time="2023-12-20T11:32:21.555317685Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:service-c99ddb4d9-nsjt6,Uid:246d774a-ddc2-42db-9243-399e6089a049,Namespace:app-b077b29d-1ed1-4cd4-a85a-50dda755e4b1,Attempt:0,} returns sandbox id \"52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707\""
Dec 20 11:32:21 fancy-machine containerd[599]: time="2023-12-20T11:32:21.568671853Z" level=info msg="CreateContainer within sandbox \"52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707\" for container &ContainerMetadata{Name:service,Attempt:0,}"
Dec 20 11:32:21 fancy-machine containerd[599]: time="2023-12-20T11:32:21.640759516Z" level=info msg="CreateContainer within sandbox \"52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707\" for &ContainerMetadata{Name:service,Attempt:0,} returns container id \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\""
Dec 22 09:36:11 fancy-machine containerd[599]: time="2023-12-22T09:36:11.057573038Z" level=info msg="StopPodSandbox for \"52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707\""
Dec 22 09:51:11 fancy-machine containerd[599]: time="2023-12-22T09:51:11.057962413Z" level=error msg="StopPodSandbox for \"52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707\" failed" error="rpc error: code = Canceled desc = failed to stop container \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\": an error occurs during waiting for container \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\" to be killed: wait container \"ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478\": context canceled"
Dec 22 10:06:41 fancy-machine containerd[599]: time="2023-12-22T10:06:41.739454662Z" level=info msg="StopPodSandbox for \"52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707\""
I1222 10:05:24.744173  2930321 main.go:189] ***************************
I1222 10:05:24.744217  2930321 main.go:190] Args: [runsc --debug --debug-log=./ --root=/run/containerd/runsc/k8s.io kill ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478]
I1222 10:05:24.744230  2930321 main.go:191] Version release-20231113.0-41-g9c6f50d59e02
I1222 10:05:24.744236  2930321 main.go:192] GOOS: linux
I1222 10:05:24.744241  2930321 main.go:193] GOARCH: amd64
I1222 10:05:24.744247  2930321 main.go:194] PID: 2930321
I1222 10:05:24.744253  2930321 main.go:195] UID: 0, GID: 0
I1222 10:05:24.744258  2930321 main.go:196] Configuration:
I1222 10:05:24.744264  2930321 main.go:197] 		RootDir: /run/containerd/runsc/k8s.io
I1222 10:05:24.744269  2930321 main.go:198] 		Platform: systrap
I1222 10:05:24.744274  2930321 main.go:199] 		FileAccess: exclusive
I1222 10:05:24.744281  2930321 main.go:200] 		Directfs: true
I1222 10:05:24.744286  2930321 main.go:201] 		Overlay: root:self
I1222 10:05:24.744293  2930321 main.go:202] 		Network: sandbox, logging: false
I1222 10:05:24.744306  2930321 main.go:203] 		Strace: false, max size: 1024, syscalls:
I1222 10:05:24.744318  2930321 main.go:204] 		IOURING: false
I1222 10:05:24.744328  2930321 main.go:205] 		Debug: true
I1222 10:05:24.744336  2930321 main.go:206] 		Systemd: false
I1222 10:05:24.744341  2930321 main.go:207] ***************************
D1222 10:05:24.744370  2930321 state_file.go:78] Load container, rootDir: "/run/containerd/runsc/k8s.io", id: {SandboxID: ContainerID:ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478}, opts: {Exact:false SkipCheck:false TryLock:false RootContainer:false}
D1222 10:05:24.746033  2930321 container.go:673] Signal container, cid: ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478, signal: signal 0 (0)
D1222 10:05:24.746061  2930321 sandbox.go:1211] Signal sandbox "52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707"
D1222 10:05:24.746068  2930321 sandbox.go:613] Connecting to sandbox "52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707"
D1222 10:05:24.746161  2930321 urpc.go:568] urpc: successfully marshalled 144 bytes.
D1222 10:05:24.801110  2930321 urpc.go:611] urpc: unmarshal success.
D1222 10:05:24.801174  2930321 container.go:673] Signal container, cid: ae3a2430d446f8ca4bdaee18e4a29d6bddaff5687a34056e4584ffda62378478, signal: terminated (15)
D1222 10:05:24.801194  2930321 sandbox.go:1211] Signal sandbox "52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707"
D1222 10:05:24.801204  2930321 sandbox.go:613] Connecting to sandbox "52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707"
D1222 10:05:24.801266  2930321 urpc.go:568] urpc: successfully marshalled 145 bytes.
D1222 10:05:24.802085  2930321 urpc.go:611] urpc: unmarshal success.
I1222 10:05:24.802117  2930321 main.go:224] Exiting with status: 0
I1222 10:11:04.736228  2934612 main.go:189] ***************************
I1222 10:11:04.736301  2934612 main.go:190] Args: [runsc --debug --debug-log=./ --root=/run/containerd/runsc/k8s.io kill 52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707]
I1222 10:11:04.736318  2934612 main.go:191] Version release-20231113.0-41-g9c6f50d59e02
I1222 10:11:04.736326  2934612 main.go:192] GOOS: linux
I1222 10:11:04.736333  2934612 main.go:193] GOARCH: amd64
I1222 10:11:04.736341  2934612 main.go:194] PID: 2934612
I1222 10:11:04.736350  2934612 main.go:195] UID: 0, GID: 0
I1222 10:11:04.736359  2934612 main.go:196] Configuration:
I1222 10:11:04.736366  2934612 main.go:197] 		RootDir: /run/containerd/runsc/k8s.io
I1222 10:11:04.736374  2934612 main.go:198] 		Platform: systrap
I1222 10:11:04.736382  2934612 main.go:199] 		FileAccess: exclusive
I1222 10:11:04.736391  2934612 main.go:200] 		Directfs: true
I1222 10:11:04.736399  2934612 main.go:201] 		Overlay: root:self
I1222 10:11:04.736409  2934612 main.go:202] 		Network: sandbox, logging: false
I1222 10:11:04.736426  2934612 main.go:203] 		Strace: false, max size: 1024, syscalls:
I1222 10:11:04.736436  2934612 main.go:204] 		IOURING: false
I1222 10:11:04.736444  2934612 main.go:205] 		Debug: true
I1222 10:11:04.736452  2934612 main.go:206] 		Systemd: false
I1222 10:11:04.736460  2934612 main.go:207] ***************************
D1222 10:11:04.736503  2934612 state_file.go:78] Load container, rootDir: "/run/containerd/runsc/k8s.io", id: {SandboxID: ContainerID:52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707}, opts: {Exact:false SkipCheck:false TryLock:false RootContainer:false}
D1222 10:11:04.738498  2934612 container.go:673] Signal container, cid: 52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707, signal: signal 0 (0)
D1222 10:11:04.738533  2934612 sandbox.go:1211] Signal sandbox "52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707"
D1222 10:11:04.738544  2934612 sandbox.go:613] Connecting to sandbox "52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707"
D1222 10:11:04.738647  2934612 urpc.go:568] urpc: successfully marshalled 144 bytes.
D1222 10:11:04.797023  2934612 urpc.go:611] urpc: unmarshal success.
D1222 10:11:04.797094  2934612 container.go:673] Signal container, cid: 52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707, signal: terminated (15)
D1222 10:11:04.797115  2934612 sandbox.go:1211] Signal sandbox "52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707"
D1222 10:11:04.797126  2934612 sandbox.go:613] Connecting to sandbox "52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707"
D1222 10:11:04.797191  2934612 urpc.go:568] urpc: successfully marshalled 145 bytes.
D1222 10:11:04.874234  2934612 urpc.go:611] urpc: unmarshal success.
I1222 10:11:04.874304  2934612 main.go:224] Exiting with status: 0


markusthoemmes commented on August 28, 2024

Here's the stack trace of the sandbox in such a case. The goroutines blocked for 68 minutes date from when the kill signal was sent. Superficially, this looks to me as if the termination signal was indeed received, but the sandbox is hung while shutting down.
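For reference, a dump like this can be pulled from a live sandbox with runsc's `debug` subcommand; a rough sketch, assuming the same containerd state directory as in the logs above (adjust `--root` and the sandbox ID for your setup):

```shell
# --root must match the state directory the shim used when creating the sandbox
# (here: containerd's default runsc root for the k8s.io namespace).
# <sandbox-id> is the pod sandbox container ID, e.g. from `runsc --root=... list`.
runsc --root=/run/containerd/runsc/k8s.io debug --stacks <sandbox-id>
```

This only requires the sandbox process to still be responsive on its control socket, which it evidently was here even though the kill had no effect.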

Found sandbox "7aa7bce92e1e8639b62029ac4ce95b835cbb00b62cd99477ad7121e40c767fa3", PID: 14257
Retrieving sandbox stacks
     *** Stack dump ***
goroutine 2060216 [running]:
gvisor.dev/gvisor/pkg/log.Stacks(0x0?)
	pkg/log/log.go:319 +0x67
gvisor.dev/gvisor/runsc/boot.(*debug).Stacks(0x2, 0xc0006e4530?, 0xc00052a070)
	runsc/boot/debug.go:26 +0x1d
reflect.Value.call({0xc000615380?, 0xc0000e4538?, 0xc000163c40?}, {0x1225b9b, 0x4}, {0xc000163e90, 0x3, 0xc000163c70?})
	GOROOT/src/reflect/value.go:596 +0xce7
reflect.Value.Call({0xc000615380?, 0xc0000e4538?, 0x1eb8b40?}, {0xc000163e90?, 0x1eb8b40?, 0x16?})
	GOROOT/src/reflect/value.go:380 +0xb9
gvisor.dev/gvisor/pkg/urpc.(*Server).handleOne(0xc0000e21e0, 0xc000e18180)
	pkg/urpc/urpc.go:338 +0x4b9
gvisor.dev/gvisor/pkg/urpc.(*Server).handleRegistered(...)
	pkg/urpc/urpc.go:433
gvisor.dev/gvisor/pkg/urpc.(*Server).StartHandling.func1()
	pkg/urpc/urpc.go:453 +0x76
created by gvisor.dev/gvisor/pkg/urpc.(*Server).StartHandling in goroutine 39
	pkg/urpc/urpc.go:451 +0x6b

goroutine 1 [semacquire, 21680 minutes]:
sync.runtime_Semacquire(0xc000107608?)
	GOROOT/src/runtime/sema.go:62 +0x25
sync.(*WaitGroup).Wait(0x0?)
	GOROOT/src/sync/waitgroup.go:116 +0x48
gvisor.dev/gvisor/pkg/sentry/kernel.(*Kernel).WaitExited(...)
	pkg/sentry/kernel/kernel.go:1178
gvisor.dev/gvisor/runsc/boot.(*Loader).WaitExit(0xc0001ca400)
	runsc/boot/loader.go:1276 +0x28
gvisor.dev/gvisor/runsc/cmd.(*Boot).Execute(0xc0002e2000, {0xc0001220b0?, 0xc00015bac0?}, 0xc0001f1b90, {0xc00015bac0, 0x2, 0x1b?})
	runsc/cmd/boot.go:497 +0x18e5
github.com/google/subcommands.(*Commander).Execute(0xc00014c000, {0x144e880, 0x1eb8b40}, {0xc00015bac0, 0x2, 0x2})
	external/com_github_google_subcommands/subcommands.go:200 +0x38c
github.com/google/subcommands.Execute(...)
	external/com_github_google_subcommands/subcommands.go:481
gvisor.dev/gvisor/runsc/cli.Main()
	runsc/cli/main.go:219 +0x14b0
main.main()
	runsc/main.go:31 +0xf

goroutine 9 [sync.Cond.Wait, 21680 minutes]:
sync.runtime_notifyListWait(0xc0000aa6c8, 0x0)
	GOROOT/src/runtime/sema.go:527 +0x159
sync.(*Cond).Wait(0xc0000aa000?)
	GOROOT/src/sync/cond.go:70 +0x85
gvisor.dev/gvisor/pkg/sentry/pgalloc.(*MemoryFile).findReclaimable(0xc0000aa000)
	pkg/sentry/pgalloc/pgalloc.go:1436 +0xbd
gvisor.dev/gvisor/pkg/sentry/pgalloc.(*MemoryFile).runReclaim(0xc0000aa000)
	pkg/sentry/pgalloc/pgalloc.go:1345 +0x78
created by gvisor.dev/gvisor/pkg/sentry/pgalloc.NewMemoryFile in goroutine 1
	pkg/sentry/pgalloc/pgalloc.go:368 +0x27b

goroutine 10 [chan receive, 21441 minutes]:
gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess.func1()
	pkg/sentry/platform/systrap/subprocess.go:316 +0x54
created by gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess in goroutine 1
	pkg/sentry/platform/systrap/subprocess.go:313 +0x265

goroutine 11 [sleep]:
time.Sleep(0x61a80)
	GOROOT/src/runtime/time.go:195 +0x125
gvisor.dev/gvisor/pkg/sentry/platform/systrap.controlFastPath()
	pkg/sentry/platform/systrap/metrics.go:263 +0x18
created by gvisor.dev/gvisor/pkg/sentry/platform/systrap.New.func2 in goroutine 1
	pkg/sentry/platform/systrap/systrap.go:345 +0x1a

goroutine 12 [sync.Cond.Wait, 68 minutes]:
sync.runtime_notifyListWait(0xc0000aaec8, 0x3f1eb4)
	GOROOT/src/runtime/sema.go:527 +0x159
sync.(*Cond).Wait(0xc0000aa800?)
	GOROOT/src/sync/cond.go:70 +0x85
gvisor.dev/gvisor/pkg/sentry/pgalloc.(*MemoryFile).findReclaimable(0xc0000aa800)
	pkg/sentry/pgalloc/pgalloc.go:1436 +0xbd
gvisor.dev/gvisor/pkg/sentry/pgalloc.(*MemoryFile).runReclaim(0xc0000aa800)
	pkg/sentry/pgalloc/pgalloc.go:1345 +0x78
created by gvisor.dev/gvisor/pkg/sentry/pgalloc.NewMemoryFile in goroutine 1
	pkg/sentry/pgalloc/pgalloc.go:368 +0x27b

goroutine 13 [select]:
gvisor.dev/gvisor/pkg/sentry/kernel.(*Timekeeper).startUpdater.func1()
	pkg/sentry/kernel/timekeeper.go:254 +0x159
created by gvisor.dev/gvisor/pkg/sentry/kernel.(*Timekeeper).startUpdater in goroutine 1
	pkg/sentry/kernel/timekeeper.go:224 +0xd3

goroutine 14 [select, 68 minutes]:
gvisor.dev/gvisor/pkg/sync.Gopark(...)
	pkg/sync/runtime_unsafe.go:33
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).nextWaker(0xc00013ad98, 0x1, 0x1?)
	pkg/sleep/sleep_unsafe.go:209 +0x7a
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).fetch(0xc00013ad80?, 0x0?, 0x87?)
	pkg/sleep/sleep_unsafe.go:256 +0x2b
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).Fetch(...)
	pkg/sleep/sleep_unsafe.go:279
gvisor.dev/gvisor/pkg/tcpip/transport/tcp.(*processor).start(0xc00013ad80, 0x0?)
	pkg/tcpip/transport/tcp/dispatcher.go:287 +0xb5
created by gvisor.dev/gvisor/pkg/tcpip/transport/tcp.(*dispatcher).init in goroutine 1
	pkg/tcpip/transport/tcp/dispatcher.go:391 +0x13d

goroutine 15 [select, 68 minutes]:
gvisor.dev/gvisor/pkg/sync.Gopark(...)
	pkg/sync/runtime_unsafe.go:33
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).nextWaker(0xc00013ae28, 0x1, 0x1?)
	pkg/sleep/sleep_unsafe.go:209 +0x7a
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).fetch(0xc00013ae10?, 0x0?, 0xce?)
	pkg/sleep/sleep_unsafe.go:256 +0x2b
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).Fetch(...)
	pkg/sleep/sleep_unsafe.go:279
gvisor.dev/gvisor/pkg/tcpip/transport/tcp.(*processor).start(0xc00013ae10, 0x0?)
	pkg/tcpip/transport/tcp/dispatcher.go:287 +0xb5
created by gvisor.dev/gvisor/pkg/tcpip/transport/tcp.(*dispatcher).init in goroutine 1
	pkg/tcpip/transport/tcp/dispatcher.go:391 +0x13d

goroutine 16 [select, 70 minutes]:
gvisor.dev/gvisor/pkg/sync.Gopark(...)
	pkg/sync/runtime_unsafe.go:33
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).nextWaker(0xc00013aeb8, 0x1, 0x1?)
	pkg/sleep/sleep_unsafe.go:209 +0x7a
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).fetch(0xc00013aea0?, 0x0?, 0xbc?)
	pkg/sleep/sleep_unsafe.go:256 +0x2b
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).Fetch(...)
	pkg/sleep/sleep_unsafe.go:279
gvisor.dev/gvisor/pkg/tcpip/transport/tcp.(*processor).start(0xc00013aea0, 0x0?)
	pkg/tcpip/transport/tcp/dispatcher.go:287 +0xb5
created by gvisor.dev/gvisor/pkg/tcpip/transport/tcp.(*dispatcher).init in goroutine 1
	pkg/tcpip/transport/tcp/dispatcher.go:391 +0x13d

goroutine 33 [select, 68 minutes]:
gvisor.dev/gvisor/pkg/sync.Gopark(...)
	pkg/sync/runtime_unsafe.go:33
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).nextWaker(0xc00013af48, 0x1, 0x1?)
	pkg/sleep/sleep_unsafe.go:209 +0x7a
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).fetch(0xc00013af30?, 0x0?, 0xa0?)
	pkg/sleep/sleep_unsafe.go:256 +0x2b
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).Fetch(...)
	pkg/sleep/sleep_unsafe.go:279
gvisor.dev/gvisor/pkg/tcpip/transport/tcp.(*processor).start(0xc00013af30, 0x0?)
	pkg/tcpip/transport/tcp/dispatcher.go:287 +0xb5
created by gvisor.dev/gvisor/pkg/tcpip/transport/tcp.(*dispatcher).init in goroutine 1
	pkg/tcpip/transport/tcp/dispatcher.go:391 +0x13d

goroutine 34 [select, 68 minutes]:
gvisor.dev/gvisor/pkg/sync.Gopark(...)
	pkg/sync/runtime_unsafe.go:33
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).nextWaker(0xc00013afd8, 0x1, 0x1?)
	pkg/sleep/sleep_unsafe.go:209 +0x7a
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).fetch(0xc00013afc0?, 0x0?, 0x9c?)
	pkg/sleep/sleep_unsafe.go:256 +0x2b
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).Fetch(...)
	pkg/sleep/sleep_unsafe.go:279
gvisor.dev/gvisor/pkg/tcpip/transport/tcp.(*processor).start(0xc00013afc0, 0x0?)
	pkg/tcpip/transport/tcp/dispatcher.go:287 +0xb5
created by gvisor.dev/gvisor/pkg/tcpip/transport/tcp.(*dispatcher).init in goroutine 1
	pkg/tcpip/transport/tcp/dispatcher.go:391 +0x13d

goroutine 35 [select, 68 minutes]:
gvisor.dev/gvisor/pkg/sync.Gopark(...)
	pkg/sync/runtime_unsafe.go:33
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).nextWaker(0xc00013b068, 0x1, 0x1?)
	pkg/sleep/sleep_unsafe.go:209 +0x7a
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).fetch(0xc00013b050?, 0x0?, 0xc7?)
	pkg/sleep/sleep_unsafe.go:256 +0x2b
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).Fetch(...)
	pkg/sleep/sleep_unsafe.go:279
gvisor.dev/gvisor/pkg/tcpip/transport/tcp.(*processor).start(0xc00013b050, 0x0?)
	pkg/tcpip/transport/tcp/dispatcher.go:287 +0xb5
created by gvisor.dev/gvisor/pkg/tcpip/transport/tcp.(*dispatcher).init in goroutine 1
	pkg/tcpip/transport/tcp/dispatcher.go:391 +0x13d

goroutine 36 [select, 68 minutes]:
gvisor.dev/gvisor/pkg/sync.Gopark(...)
	pkg/sync/runtime_unsafe.go:33
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).nextWaker(0xc00013b0f8, 0x1, 0x1?)
	pkg/sleep/sleep_unsafe.go:209 +0x7a
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).fetch(0xc00013b0e0?, 0x0?, 0x5c?)
	pkg/sleep/sleep_unsafe.go:256 +0x2b
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).Fetch(...)
	pkg/sleep/sleep_unsafe.go:279
gvisor.dev/gvisor/pkg/tcpip/transport/tcp.(*processor).start(0xc00013b0e0, 0x0?)
	pkg/tcpip/transport/tcp/dispatcher.go:287 +0xb5
created by gvisor.dev/gvisor/pkg/tcpip/transport/tcp.(*dispatcher).init in goroutine 1
	pkg/tcpip/transport/tcp/dispatcher.go:391 +0x13d

goroutine 37 [select, 68 minutes]:
gvisor.dev/gvisor/pkg/sync.Gopark(...)
	pkg/sync/runtime_unsafe.go:33
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).nextWaker(0xc00013b188, 0x1, 0x1?)
	pkg/sleep/sleep_unsafe.go:209 +0x7a
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).fetch(0xc00013b170?, 0x0?, 0x63?)
	pkg/sleep/sleep_unsafe.go:256 +0x2b
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).Fetch(...)
	pkg/sleep/sleep_unsafe.go:279
gvisor.dev/gvisor/pkg/tcpip/transport/tcp.(*processor).start(0xc00013b170, 0x0?)
	pkg/tcpip/transport/tcp/dispatcher.go:287 +0xb5
created by gvisor.dev/gvisor/pkg/tcpip/transport/tcp.(*dispatcher).init in goroutine 1
	pkg/tcpip/transport/tcp/dispatcher.go:391 +0x13d

goroutine 39 [syscall]:
syscall.Syscall6(0x0?, 0x0?, 0xffffffffffffffff?, 0x0?, 0xb?, 0xffffffffffffffff?, 0xc00071a6cc?)
	GOROOT/src/syscall/syscall_linux.go:91 +0x30
gvisor.dev/gvisor/pkg/unet.(*Socket).wait(0xc0000007b0, 0x0)
	pkg/unet/unet_unsafe.go:53 +0x9b
gvisor.dev/gvisor/pkg/unet.(*ServerSocket).Accept(0xc0000e4050)
	pkg/unet/unet.go:517 +0x125
gvisor.dev/gvisor/pkg/control/server.(*Server).serve(0xc0000c9620)
	pkg/control/server/server.go:104 +0x39
gvisor.dev/gvisor/pkg/control/server.(*Server).StartServing.func1()
	pkg/control/server/server.go:92 +0x1c
created by gvisor.dev/gvisor/pkg/control/server.(*Server).StartServing in goroutine 1
	pkg/control/server/server.go:91 +0x85

goroutine 49 [select]:
gvisor.dev/gvisor/pkg/sync.Gopark(...)
	pkg/sync/runtime_unsafe.go:33
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).nextWaker(0xc0000c96e0, 0x1, 0x0?)
	pkg/sleep/sleep_unsafe.go:209 +0x7a
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).fetch(0xc000816f78?, 0x0?, 0xe3?)
	pkg/sleep/sleep_unsafe.go:256 +0x2b
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).Fetch(...)
	pkg/sleep/sleep_unsafe.go:279
gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.(*queueDispatcher).dispatchLoop(0xc000268000)
	pkg/tcpip/link/qdisc/fifo/fifo.go:96 +0xcf
gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.New.func1()
	pkg/tcpip/link/qdisc/fifo/fifo.go:82 +0x4f
created by gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.New in goroutine 40
	pkg/tcpip/link/qdisc/fifo/fifo.go:80 +0x9f

goroutine 50 [select, 68 minutes]:
gvisor.dev/gvisor/pkg/sync.Gopark(...)
	pkg/sync/runtime_unsafe.go:33
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).nextWaker(0xc0000c9700, 0x1, 0x0?)
	pkg/sleep/sleep_unsafe.go:209 +0x7a
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).fetch(0xc0000dbf78?, 0xe0?, 0x24?)
	pkg/sleep/sleep_unsafe.go:256 +0x2b
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).Fetch(...)
	pkg/sleep/sleep_unsafe.go:279
gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.(*queueDispatcher).dispatchLoop(0xc000268070)
	pkg/tcpip/link/qdisc/fifo/fifo.go:96 +0xcf
gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.New.func1()
	pkg/tcpip/link/qdisc/fifo/fifo.go:82 +0x4f
created by gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.New in goroutine 40
	pkg/tcpip/link/qdisc/fifo/fifo.go:80 +0x9f

goroutine 51 [select, 68 minutes]:
gvisor.dev/gvisor/pkg/sync.Gopark(...)
	pkg/sync/runtime_unsafe.go:33
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).nextWaker(0xc0000c9720, 0x1, 0x0?)
	pkg/sleep/sleep_unsafe.go:209 +0x7a
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).fetch(0xc000c06f78?, 0x20?, 0xa3?)
	pkg/sleep/sleep_unsafe.go:256 +0x2b
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).Fetch(...)
	pkg/sleep/sleep_unsafe.go:279
gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.(*queueDispatcher).dispatchLoop(0xc0002680e0)
	pkg/tcpip/link/qdisc/fifo/fifo.go:96 +0xcf
gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.New.func1()
	pkg/tcpip/link/qdisc/fifo/fifo.go:82 +0x4f
created by gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.New in goroutine 40
	pkg/tcpip/link/qdisc/fifo/fifo.go:80 +0x9f

goroutine 52 [select, 68 minutes]:
gvisor.dev/gvisor/pkg/sync.Gopark(...)
	pkg/sync/runtime_unsafe.go:33
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).nextWaker(0xc0000c96c0, 0x1, 0x0?)
	pkg/sleep/sleep_unsafe.go:209 +0x7a
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).fetch(0xc000810f78?, 0x10?, 0xe3?)
	pkg/sleep/sleep_unsafe.go:256 +0x2b
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).Fetch(...)
	pkg/sleep/sleep_unsafe.go:279
gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.(*queueDispatcher).dispatchLoop(0xc000268150)
	pkg/tcpip/link/qdisc/fifo/fifo.go:96 +0xcf
gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.New.func1()
	pkg/tcpip/link/qdisc/fifo/fifo.go:82 +0x4f
created by gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.New in goroutine 40
	pkg/tcpip/link/qdisc/fifo/fifo.go:80 +0x9f

goroutine 53 [select, 68 minutes]:
gvisor.dev/gvisor/pkg/sync.Gopark(...)
	pkg/sync/runtime_unsafe.go:33
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).nextWaker(0xc0000c9760, 0x1, 0x0?)
	pkg/sleep/sleep_unsafe.go:209 +0x7a
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).fetch(0xc000c03f78?, 0x60?, 0x47?)
	pkg/sleep/sleep_unsafe.go:256 +0x2b
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).Fetch(...)
	pkg/sleep/sleep_unsafe.go:279
gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.(*queueDispatcher).dispatchLoop(0xc0002681c0)
	pkg/tcpip/link/qdisc/fifo/fifo.go:96 +0xcf
gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.New.func1()
	pkg/tcpip/link/qdisc/fifo/fifo.go:82 +0x4f
created by gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.New in goroutine 40
	pkg/tcpip/link/qdisc/fifo/fifo.go:80 +0x9f

goroutine 54 [select, 68 minutes]:
gvisor.dev/gvisor/pkg/sync.Gopark(...)
	pkg/sync/runtime_unsafe.go:33
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).nextWaker(0xc0000c9740, 0x1, 0x0?)
	pkg/sleep/sleep_unsafe.go:209 +0x7a
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).fetch(0xc000b2cf78?, 0x70?, 0x7?)
	pkg/sleep/sleep_unsafe.go:256 +0x2b
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).Fetch(...)
	pkg/sleep/sleep_unsafe.go:279
gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.(*queueDispatcher).dispatchLoop(0xc000268230)
	pkg/tcpip/link/qdisc/fifo/fifo.go:96 +0xcf
gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.New.func1()
	pkg/tcpip/link/qdisc/fifo/fifo.go:82 +0x4f
created by gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.New in goroutine 40
	pkg/tcpip/link/qdisc/fifo/fifo.go:80 +0x9f

goroutine 55 [select, 68 minutes]:
gvisor.dev/gvisor/pkg/sync.Gopark(...)
	pkg/sync/runtime_unsafe.go:33
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).nextWaker(0xc0000c9780, 0x1, 0x0?)
	pkg/sleep/sleep_unsafe.go:209 +0x7a
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).fetch(0xc000b29f78?, 0xc0?, 0xa7?)
	pkg/sleep/sleep_unsafe.go:256 +0x2b
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).Fetch(...)
	pkg/sleep/sleep_unsafe.go:279
gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.(*queueDispatcher).dispatchLoop(0xc0002682a0)
	pkg/tcpip/link/qdisc/fifo/fifo.go:96 +0xcf
gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.New.func1()
	pkg/tcpip/link/qdisc/fifo/fifo.go:82 +0x4f
created by gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.New in goroutine 40
	pkg/tcpip/link/qdisc/fifo/fifo.go:80 +0x9f

goroutine 56 [select, 68 minutes]:
gvisor.dev/gvisor/pkg/sync.Gopark(...)
	pkg/sync/runtime_unsafe.go:33
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).nextWaker(0xc0000c97a0, 0x1, 0x0?)
	pkg/sleep/sleep_unsafe.go:209 +0x7a
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).fetch(0xc00040df78?, 0xb0?, 0xa5?)
	pkg/sleep/sleep_unsafe.go:256 +0x2b
gvisor.dev/gvisor/pkg/sleep.(*Sleeper).Fetch(...)
	pkg/sleep/sleep_unsafe.go:279
gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.(*queueDispatcher).dispatchLoop(0xc000268310)
	pkg/tcpip/link/qdisc/fifo/fifo.go:96 +0xcf
gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.New.func1()
	pkg/tcpip/link/qdisc/fifo/fifo.go:82 +0x4f
created by gvisor.dev/gvisor/pkg/tcpip/link/qdisc/fifo.New in goroutine 40
	pkg/tcpip/link/qdisc/fifo/fifo.go:80 +0x9f

goroutine 65 [syscall]:
syscall.Syscall6(0xc000310218?, 0xc00056bdb0?, 0x47c94a?, 0x1e619c0?, 0xc000000800?, 0xc0012d8500?, 0xc00056bde0?)
	GOROOT/src/syscall/syscall_linux.go:91 +0x30
gvisor.dev/gvisor/pkg/tcpip/link/rawfile.BlockingPollUntilStopped(0x14453a0?, 0xc00056be78?, 0x7af1?)
	pkg/tcpip/link/rawfile/rawfile_unsafe.go:248 +0x65
gvisor.dev/gvisor/pkg/tcpip/link/fdbased.(*packetMMapDispatcher).readMMappedPacket(0xc0000a03c0)
	pkg/tcpip/link/fdbased/mmap.go:140 +0x8f
gvisor.dev/gvisor/pkg/tcpip/link/fdbased.(*packetMMapDispatcher).dispatch(0xc0000a03c0)
	pkg/tcpip/link/fdbased/mmap.go:172 +0x45
gvisor.dev/gvisor/pkg/tcpip/link/fdbased.(*endpoint).dispatchLoop(0xc0001cc000, {0x144dcb0, 0xc0000a03c0})
	pkg/tcpip/link/fdbased/endpoint.go:755 +0x32
gvisor.dev/gvisor/pkg/tcpip/link/fdbased.(*endpoint).Attach.func1(0xc0000b3f88?)
	pkg/tcpip/link/fdbased/endpoint.go:430 +0x39
created by gvisor.dev/gvisor/pkg/tcpip/link/fdbased.(*endpoint).Attach in goroutine 40
	pkg/tcpip/link/fdbased/endpoint.go:429 +0x11e

goroutine 41 [syscall, 21680 minutes]:
syscall.Syscall6(0xc000271660?, 0x424c9c?, 0x1eba300?, 0x40fc7e?, 0x7fcea3d9e420?, 0x101010101010101?, 0x0?)
	GOROOT/src/syscall/syscall_linux.go:91 +0x30
golang.org/x/sys/unix.ppoll(0x40e125?, 0x10?, 0x1169f20?, 0xc0002e83f0?)
	external/org_golang_x_sys/unix/zsyscall_linux.go:124 +0x57
golang.org/x/sys/unix.Ppoll({0xc000271760?, 0x40?, 0xc0002ee000?}, 0x0?, 0x0?)
	external/org_golang_x_sys/unix/syscall_linux.go:149 +0x38
gvisor.dev/gvisor/runsc/boot.(*Loader).startGoferMonitor.func1.1(...)
	runsc/boot/loader.go:1076
gvisor.dev/gvisor/runsc/specutils.RetryEintr(...)
	runsc/specutils/specutils.go:698
gvisor.dev/gvisor/runsc/boot.(*Loader).startGoferMonitor.func1()
	runsc/boot/loader.go:1074 +0xf3
created by gvisor.dev/gvisor/runsc/boot.(*Loader).startGoferMonitor in goroutine 1
	runsc/boot/loader.go:1066 +0x105

goroutine 42 [syscall, 21680 minutes]:
syscall.Syscall6(0x3?, 0xc00025c504?, 0x4?, 0xc00025c594?, 0x4?, 0xc00025c608?, 0x4?)
	GOROOT/src/syscall/syscall_linux.go:91 +0x30
golang.org/x/sys/unix.ppoll(0x0?, 0x0?, 0x0?, 0x0?)
	external/org_golang_x_sys/unix/zsyscall_linux.go:124 +0x57
golang.org/x/sys/unix.Ppoll({0xc000271f70?, 0x1?, 0x0?}, 0x824?, 0xc00025ce9c?)
	external/org_golang_x_sys/unix/syscall_linux.go:149 +0x38
gvisor.dev/gvisor/pkg/lisafs.(*Client).watchdog(0xc00038c2c0)
	pkg/lisafs/client.go:172 +0x9f
created by gvisor.dev/gvisor/pkg/lisafs.NewClient in goroutine 1
	pkg/lisafs/client.go:84 +0x1c5

goroutine 113 [chan receive, 21680 minutes]:
gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess.func1()
	pkg/sentry/platform/systrap/subprocess.go:316 +0x54
created by gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess in goroutine 100
	pkg/sentry/platform/systrap/subprocess.go:313 +0x265

goroutine 66 [syscall, 21680 minutes]:
syscall.Syscall6(0xc00025c1b3?, 0x3?, 0xc00025c5d8?, 0x4?, 0xc00025cba4?, 0x4?, 0x0?)
	GOROOT/src/syscall/syscall_linux.go:91 +0x30
golang.org/x/sys/unix.ppoll(0xc000272e78?, 0x88d160?, 0xc000272e2f?, 0x0?)
	external/org_golang_x_sys/unix/zsyscall_linux.go:124 +0x57
golang.org/x/sys/unix.Ppoll({0xc000272f70?, 0x0?, 0xc000272fd0?}, 0x865b45?, 0x24b7e1?)
	external/org_golang_x_sys/unix/syscall_linux.go:149 +0x38
gvisor.dev/gvisor/pkg/lisafs.(*Client).watchdog(0xc0001bca50)
	pkg/lisafs/client.go:172 +0x9f
created by gvisor.dev/gvisor/pkg/lisafs.NewClient in goroutine 1
	pkg/lisafs/client.go:84 +0x1c5

goroutine 47 [select, 21680 minutes]:
gvisor.dev/gvisor/pkg/sentry/kernel/time.(*Timer).runGoroutine(0xc00039e200)
	pkg/sentry/kernel/time/time.go:507 +0x6d
created by gvisor.dev/gvisor/pkg/sentry/kernel/time.(*Timer).init in goroutine 1
	pkg/sentry/kernel/time/time.go:487 +0x156

goroutine 97 [select, 21680 minutes]:
reflect.rselect({0xc00013a900, 0x22, 0x2?})
	GOROOT/src/runtime/select.go:589 +0x2c5
reflect.Select({0xc0000fc000?, 0x22, 0x0?})
	GOROOT/src/reflect/value.go:3104 +0x5ea
gvisor.dev/gvisor/pkg/sighandling.handleSignals({0xc0002ec200, 0x21, 0xc0001bca50?}, 0xc0004ea450, 0xc0003f31f0?, 0xc0003fdfb8?)
	pkg/sighandling/sighandling.go:44 +0x306
created by gvisor.dev/gvisor/pkg/sighandling.StartSignalForwarding in goroutine 1
	pkg/sighandling/sighandling.go:107 +0x229

goroutine 98 [select]:
gvisor.dev/gvisor/pkg/sentry/watchdog.(*Watchdog).loop(0xc00009e100)
	pkg/sentry/watchdog/watchdog.go:250 +0x7b
created by gvisor.dev/gvisor/pkg/sentry/watchdog.(*Watchdog).Start in goroutine 1
	pkg/sentry/watchdog/watchdog.go:206 +0x1cc

goroutine 81 [syscall, 21680 minutes]:
os/signal.signal_recv()
	GOROOT/src/runtime/sigqueue.go:152 +0x29
os/signal.loop()
	GOROOT/src/os/signal/signal_unix.go:23 +0x13
created by os/signal.Notify.func1.1 in goroutine 1
	GOROOT/src/os/signal/signal.go:151 +0x1f

goroutine 99 [sync.Cond.Wait, 68 minutes]:
sync.runtime_notifyListWait(0xc00033dcf8, 0x761d7b)
	GOROOT/src/runtime/sema.go:527 +0x159
sync.(*Cond).Wait(0xc000308000?)
	GOROOT/src/sync/cond.go:70 +0x85
gvisor.dev/gvisor/pkg/sentry/kernel.(*Kernel).runCPUClockTicker(0xc00033dc00)
	pkg/sentry/kernel/task_sched.go:349 +0x170
created by gvisor.dev/gvisor/pkg/sentry/kernel.(*Kernel).Start in goroutine 1
	pkg/sentry/kernel/kernel.go:1009 +0x179

goroutine 100 [select, 68 minutes]:
gvisor.dev/gvisor/pkg/sentry/kernel.(*Task).block(0xc000352a80, 0x0, 0x0)
	pkg/sentry/kernel/task_block.go:164 +0x14b
gvisor.dev/gvisor/pkg/sentry/kernel.(*Task).Block(...)
	pkg/sentry/kernel/task_block.go:119
gvisor.dev/gvisor/pkg/sentry/syscalls/linux.Pause(0xc00051b990?, 0x6d74bd?, {{0x11}, {0x7eea20872b20}, {0x0}, {0x8}, {0x0}, {0x0}})
	pkg/sentry/syscalls/linux/sys_signal.go:333 +0x18
gvisor.dev/gvisor/pkg/sentry/kernel.(*Task).executeSyscall(0xc000352a80, 0x22, {{0x11}, {0x7eea20872b20}, {0x0}, {0x8}, {0x0}, {0x0}})
	pkg/sentry/kernel/task_syscall.go:142 +0x673
gvisor.dev/gvisor/pkg/sentry/kernel.(*Task).doSyscallInvoke(0xc000352a80, 0xc0001edc80?, {{0x11}, {0x7eea20872b20}, {0x0}, {0x8}, {0x0}, {0x0}})
	pkg/sentry/kernel/task_syscall.go:322 +0x45
gvisor.dev/gvisor/pkg/sentry/kernel.(*Task).doSyscallEnter(0xc00051be00?, 0xcf4907?, {{0x11}, {0x7eea20872b20}, {0x0}, {0x8}, {0x0}, {0x0}})
	pkg/sentry/kernel/task_syscall.go:282 +0x59
gvisor.dev/gvisor/pkg/sentry/kernel.(*Task).doSyscall(0xc0001c6340?)
	pkg/sentry/kernel/task_syscall.go:257 +0x2d5
gvisor.dev/gvisor/pkg/sentry/kernel.(*runApp).execute(0xc000352a80?, 0xc000352a80)
	pkg/sentry/kernel/task_run.go:269 +0xfb7
gvisor.dev/gvisor/pkg/sentry/kernel.(*Task).run(0xc000352a80, 0x1)
	pkg/sentry/kernel/task_run.go:98 +0x1ef
created by gvisor.dev/gvisor/pkg/sentry/kernel.(*Task).Start in goroutine 1
	pkg/sentry/kernel/task_start.go:391 +0xe5

goroutine 82 [select, 21680 minutes]:
gvisor.dev/gvisor/pkg/sentry/kernel/time.(*Timer).runGoroutine(0xc00014c100)
	pkg/sentry/kernel/time/time.go:507 +0x6d
created by gvisor.dev/gvisor/pkg/sentry/kernel/time.(*Timer).init in goroutine 100
	pkg/sentry/kernel/time/time.go:487 +0x156

goroutine 59 [semacquire, 21680 minutes]:
sync.runtime_Semacquire(0x0?)
	GOROOT/src/runtime/sema.go:62 +0x25
sync.(*WaitGroup).Wait(0xc000619490?)
	GOROOT/src/sync/waitgroup.go:116 +0x48
gvisor.dev/gvisor/pkg/sentry/kernel.(*ThreadGroup).WaitExited(...)
	pkg/sentry/kernel/task_run.go:388
gvisor.dev/gvisor/runsc/boot.(*Loader).wait(0xc000619520?, 0xc0000f4000)
	runsc/boot/loader.go:1264 +0x25
gvisor.dev/gvisor/runsc/boot.(*Loader).waitContainer(0xc0001ca400, {0xc000154400, 0x40}, 0xc0003f3480)
	runsc/boot/loader.go:1210 +0xbe
gvisor.dev/gvisor/runsc/boot.(*containerManager).Wait(0xc000011a58, 0xc00037a130, 0xc0003f3480)
	runsc/boot/controller.go:585 +0xaf
reflect.Value.call({0xc000614b40?, 0xc0000e4290?, 0xc000619c40?}, {0x1225b9b, 0x4}, {0xc000619e90, 0x3, 0xc000619c70?})
	GOROOT/src/reflect/value.go:596 +0xce7
reflect.Value.Call({0xc000614b40?, 0xc0000e4290?, 0xc00037a130?}, {0xc000619e90?, 0xc00037a130?, 0x16?})
	GOROOT/src/reflect/value.go:380 +0xb9
gvisor.dev/gvisor/pkg/urpc.(*Server).handleOne(0xc0000e21e0, 0xc0001e6ba0)
	pkg/urpc/urpc.go:338 +0x4b9
gvisor.dev/gvisor/pkg/urpc.(*Server).handleRegistered(...)
	pkg/urpc/urpc.go:433
gvisor.dev/gvisor/pkg/urpc.(*Server).StartHandling.func1()
	pkg/urpc/urpc.go:453 +0x76
created by gvisor.dev/gvisor/pkg/urpc.(*Server).StartHandling in goroutine 39
	pkg/urpc/urpc.go:451 +0x6b

goroutine 2060231 [select]:
gvisor.dev/gvisor/pkg/sentry/kernel/time.(*Timer).runGoroutine(0xc0014a0500)
	pkg/sentry/kernel/time/time.go:507 +0x6d
created by gvisor.dev/gvisor/pkg/sentry/kernel/time.(*Timer).init in goroutine 65
	pkg/sentry/kernel/time/time.go:487 +0x156

goroutine 60 [syscall, 68 minutes]:
syscall.Syscall6(0xc000731aa0?, 0xc000a31f78?, 0x1e8f7d0?, 0x408e80?, 0xc000a31f20?, 0xc000204a50?, 0x409f1a?)
	GOROOT/src/syscall/syscall_linux.go:91 +0x30
gvisor.dev/gvisor/pkg/fdnotifier.epollWait(0xc000bdc520?, {0xc000204af0?, 0x275?, 0x1e875e0?}, 0xc0001e6cc0?)
	pkg/fdnotifier/poll_unsafe.go:77 +0x4d
gvisor.dev/gvisor/pkg/fdnotifier.(*notifier).waitAndNotify(0xc00038a3f0)
	pkg/fdnotifier/fdnotifier.go:149 +0x58
created by gvisor.dev/gvisor/pkg/fdnotifier.newNotifier in goroutine 115
	pkg/fdnotifier/fdnotifier.go:64 +0xb6

goroutine 75 [sync.Cond.Wait, 68 minutes]:
sync.runtime_notifyListWait(0xc00001f6c8, 0x1)
	GOROOT/src/runtime/sema.go:527 +0x159
sync.(*Cond).Wait(0xc00001f000?)
	GOROOT/src/sync/cond.go:70 +0x85
gvisor.dev/gvisor/pkg/sentry/pgalloc.(*MemoryFile).findReclaimable(0xc00001f000)
	pkg/sentry/pgalloc/pgalloc.go:1436 +0xbd
gvisor.dev/gvisor/pkg/sentry/pgalloc.(*MemoryFile).runReclaim(0xc00001f000)
	pkg/sentry/pgalloc/pgalloc.go:1345 +0x78
created by gvisor.dev/gvisor/pkg/sentry/pgalloc.NewMemoryFile in goroutine 115
	pkg/sentry/pgalloc/pgalloc.go:368 +0x27b

goroutine 145 [chan receive, 21679 minutes]:
gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess.func1()
	pkg/sentry/platform/systrap/subprocess.go:316 +0x54
created by gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess in goroutine 122
	pkg/sentry/platform/systrap/subprocess.go:313 +0x265

goroutine 31 [chan receive, 21679 minutes]:
gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess.func1()
	pkg/sentry/platform/systrap/subprocess.go:316 +0x54
created by gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess in goroutine 210
	pkg/sentry/platform/systrap/subprocess.go:313 +0x265

goroutine 147 [chan receive, 21679 minutes]:
gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess.func1()
	pkg/sentry/platform/systrap/subprocess.go:316 +0x54
created by gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess in goroutine 181
	pkg/sentry/platform/systrap/subprocess.go:313 +0x265

goroutine 126 [chan receive, 21679 minutes]:
gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess.func1()
	pkg/sentry/platform/systrap/subprocess.go:316 +0x54
created by gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess in goroutine 124
	pkg/sentry/platform/systrap/subprocess.go:313 +0x265

goroutine 30 [chan receive, 21679 minutes]:
gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess.func1()
	pkg/sentry/platform/systrap/subprocess.go:316 +0x54
created by gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess in goroutine 28
	pkg/sentry/platform/systrap/subprocess.go:313 +0x265

goroutine 162 [chan receive, 21679 minutes]:
gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess.func1()
	pkg/sentry/platform/systrap/subprocess.go:316 +0x54
created by gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess in goroutine 128
	pkg/sentry/platform/systrap/subprocess.go:313 +0x265

goroutine 146 [chan receive, 21679 minutes]:
gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess.func1()
	pkg/sentry/platform/systrap/subprocess.go:316 +0x54
created by gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess in goroutine 124
	pkg/sentry/platform/systrap/subprocess.go:313 +0x265

goroutine 195 [chan receive, 21679 minutes]:
gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess.func1()
	pkg/sentry/platform/systrap/subprocess.go:316 +0x54
created by gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess in goroutine 193
	pkg/sentry/platform/systrap/subprocess.go:313 +0x265

goroutine 138 [chan receive, 21679 minutes]:
gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess.func1()
	pkg/sentry/platform/systrap/subprocess.go:316 +0x54
created by gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess in goroutine 137
	pkg/sentry/platform/systrap/subprocess.go:313 +0x265

goroutine 13731 [chan receive, 21445 minutes]:
gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess.func1()
	pkg/sentry/platform/systrap/subprocess.go:316 +0x54
created by gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess in goroutine 13664
	pkg/sentry/platform/systrap/subprocess.go:313 +0x265

goroutine 14033 [chan receive, 21441 minutes]:
gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess.func1()
	pkg/sentry/platform/systrap/subprocess.go:316 +0x54
created by gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess in goroutine 13966
	pkg/sentry/platform/systrap/subprocess.go:313 +0x265

goroutine 13730 [chan receive, 21441 minutes]:
gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess.func1()
	pkg/sentry/platform/systrap/subprocess.go:316 +0x54
created by gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess in goroutine 13664
	pkg/sentry/platform/systrap/subprocess.go:313 +0x265

goroutine 13921 [chan receive, 17118 minutes]:
gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess.func1()
	pkg/sentry/platform/systrap/subprocess.go:316 +0x54
created by gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess in goroutine 13882
	pkg/sentry/platform/systrap/subprocess.go:313 +0x265

goroutine 14003 [chan receive, 21441 minutes]:
gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess.func1()
	pkg/sentry/platform/systrap/subprocess.go:316 +0x54
created by gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess in goroutine 13983
	pkg/sentry/platform/systrap/subprocess.go:313 +0x265

goroutine 13646 [chan receive, 21445 minutes]:
gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess.func1()
	pkg/sentry/platform/systrap/subprocess.go:316 +0x54
created by gvisor.dev/gvisor/pkg/sentry/platform/systrap.newSubprocess in goroutine 13713
	pkg/sentry/platform/systrap/subprocess.go:313 +0x265

from gvisor.

avagin commented on August 28, 2024

The sandbox has one alive process:

goroutine 100 [select, 68 minutes]:
gvisor.dev/gvisor/pkg/sentry/kernel.(*Task).block(0xc000352a80, 0x0, 0x0)
        pkg/sentry/kernel/task_block.go:164 +0x14b
gvisor.dev/gvisor/pkg/sentry/kernel.(*Task).Block(...)
        pkg/sentry/kernel/task_block.go:119
gvisor.dev/gvisor/pkg/sentry/syscalls/linux.Pause(0xc00051b990?, 0x6d74bd?, {{0x11}, {0x7eea20872b20}, {0x0}, {0x8}, {0x0}, {0x0}})
        pkg/sentry/syscalls/linux/sys_signal.go:333 +0x18
gvisor.dev/gvisor/pkg/sentry/kernel.(*Task).executeSyscall(0xc000352a80, 0x22, {{0x11}, {0x7eea20872b20}, {0x0}, {0x8}, {0x0}, {0x0}})
        pkg/sentry/kernel/task_syscall.go:142 +0x673
gvisor.dev/gvisor/pkg/sentry/kernel.(*Task).doSyscallInvoke(0xc000352a80, 0xc0001edc80?, {{0x11}, {0x7eea20872b20}, {0x0}, {0x8}, {0x0}, {0x0}})
        pkg/sentry/kernel/task_syscall.go:322 +0x45
gvisor.dev/gvisor/pkg/sentry/kernel.(*Task).doSyscallEnter(0xc00051be00?, 0xcf4907?, {{0x11}, {0x7eea20872b20}, {0x0}, {0x8}, {0x0}, {0x0}})
        pkg/sentry/kernel/task_syscall.go:282 +0x59
gvisor.dev/gvisor/pkg/sentry/kernel.(*Task).doSyscall(0xc0001c6340?)
        pkg/sentry/kernel/task_syscall.go:257 +0x2d5
gvisor.dev/gvisor/pkg/sentry/kernel.(*runApp).execute(0xc000352a80?, 0xc000352a80)
        pkg/sentry/kernel/task_run.go:269 +0xfb7
gvisor.dev/gvisor/pkg/sentry/kernel.(*Task).run(0xc000352a80, 0x1)
        pkg/sentry/kernel/task_run.go:98 +0x1ef
created by gvisor.dev/gvisor/pkg/sentry/kernel.(*Task).Start in goroutine 1
        pkg/sentry/kernel/task_start.go:391 +0xe5

avagin commented on August 28, 2024

Both runsc kill (first to the container, then to the sandbox directly) did nothing to actually stop either container or sandbox.

D1222 10:11:04.797094  2934612 container.go:673] Signal container, cid: 52c47b6ff1867b52eca5babb8920c390577d96bb287d3c2170fc37a37415c707, signal: terminated (15)

runsc kill sends SIGTERM by default. This signal can be handled by a process. You can try sending SIGKILL instead.

markusthoemmes commented on August 28, 2024

@avagin thanks for chiming in.

After your comments, I ran a few more tests around SIGTERM handling to see if there's a regression here, but no: even if I handle SIGTERM and then ignore it in the respective container, Kubernetes/containerd eventually sends SIGKILL (after the terminationGracePeriod). However, that SIGKILL seems to never reach the process in question, so it stays stuck.

markusthoemmes commented on August 28, 2024

With a bit more digging, I think what I'm looking at is this: the container itself has been successfully removed (its processes and its state file are gone, as the logs above mention), but the container's actual processes are still running inside the sandbox. Those processes are holding onto the container's rootfs, and since nothing, seemingly, ever tried to remove the sandbox (because the final removal of the container already failed), we're deadlocked.

containerd then retries the delete over and over; we keep alerting on a missing state file because we've already successfully deleted the container, and the only contention left is that the processes running under the sandbox keep holding the rootfs.

Sorry for the probably very imprecise explanation. I'm building a mental model as I go here 😅

markusthoemmes commented on August 28, 2024

@manninglucas any chance that the two rather recent commits 3ab01ae and 6a112c6 have anything to do with this?

avagin commented on August 28, 2024

@markusthoemmes could you reproduce the issue and show the output of 'ls -l /proc/PID/fd' for the runsc processes?

markusthoemmes commented on August 28, 2024

@avagin here we go. Sadly, as noted above, the "inner" container stops correctly, so there are no processes left that I could check there. All the Python fds on the sandbox are part of the rootfs of the container image.

FWIW, both /run/containerd/io.containerd.runtime.v2.task/k8s.io/5b6ae772c71b57b0a00297b775a1900bcd30fcd9a6c8ccca2e49573829ae636f/rootfs and /run/containerd/io.containerd.runtime.v2.task/k8s.io/90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8/rootfs are empty according to ls -l


sandbox: 90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8
container: 5b6ae772c71b57b0a00297b775a1900bcd30fcd9a6c8ccca2e49573829ae636f

$ ps aux | grep 5b6ae772c71b57b0a00297b775a1900bcd30fcd9a6c8ccca2e49573829ae636f
# Already gone as noted above.
$ ps aux | grep 90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8
root     2392488  0.0  0.1 1261928 21736 ?       Ssl  Jan02   0:00 runsc-gofer --systemd-cgroup=true --directfs=false --root=/run/containerd/runsc/k8s.io --log=/run/containerd/io.containerd.runtime.v2.task/k8s.io/90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8/log.json --log-format=json --panic-log=/var/log/pods/<redacted>/gvisor_panic.log --platform=systrap --log-fd=3 gofer --bundle=/run/containerd/io.containerd.runtime.v2.task/k8s.io/90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8 --gofer-mount-confs=lisafs:none,lisafs:none --io-fds=6,7 --mounts-fd=5 --spec-fd=4 --sync-nvproxy-fd=-1 --sync-userns-fd=-1 --proc-mount-sync-fd=14 --apply-caps=false --setup-root=false
nobody   2392492  2.1  0.4 3547332 77984 ?       Ssl  Jan02  67:32 runsc-sandbox --platform=systrap --systemd-cgroup=true --directfs=false --root=/run/containerd/runsc/k8s.io --log=/run/containerd/io.containerd.runtime.v2.task/k8s.io/90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8/log.json --log-format=json --panic-log=/var/log/pods/<redacted>/gvisor_panic.log --log-fd=3 --panic-log-fd=4 boot --bundle=/run/containerd/io.containerd.runtime.v2.task/k8s.io/90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8 --controller-fd=10 --cpu-num=8 --dev-io-fd=-1 --gofer-mount-confs=lisafs:none,lisafs:none --io-fds=5,6 --mounts-fd=7 --setup-root=false --spec-fd=11 --start-sync-fd=8 --stdio-fds=12,13,14 --total-host-memory=16768503808 --total-memory=1073741824 --user-log-fd=9 --product-name=<redacted> --proc-mount-sync-fd=21 90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8
root     2392576  0.0  0.1 1253480 20360 ?       Sl   Jan02   0:00 runsc --root=/run/containerd/runsc/k8s.io --log=/run/containerd/io.containerd.runtime.v2.task/k8s.io/90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8/log.json --log-format=json --panic-log=/var/log/pods/<redacted>/gvisor_panic.log --systemd-cgroup=true --platform=systrap --directfs=false wait 90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8
$ ls -l /proc/2392488/fd 
lr-x------ 1 root root 64 Jan  3 18:23 0 -> /dev/null
l-wx------ 1 root root 64 Jan  3 18:23 1 -> /dev/null
lrwx------ 1 root root 64 Jan  3 18:23 10 -> 'anon_inode:[eventpoll]'
lr-x------ 1 root root 64 Jan  3 18:23 11 -> 'pipe:[1725863547]'
l-wx------ 1 root root 64 Jan  3 18:23 12 -> 'pipe:[1725864402]'
l-wx------ 1 root root 64 Jan  2 17:19 13 -> 'pipe:[1725863547]'
lrwx------ 1 root root 64 Jan  3 18:23 14 -> 'anon_inode:[eventfd]'
l-wx------ 1 root root 64 Jan  3 18:23 15 -> 'pipe:[1725863550]'
lr-x------ 1 root root 64 Jan  3 18:23 16 -> /
lrwx------ 1 root root 64 Jan  3 18:23 17 -> '/memfd:flipcall_packet_windows (deleted)'
lrwx------ 1 root root 64 Jan  3 18:23 18 -> '/memfd:flipcall_packet_windows (deleted)'
lr-x------ 1 root root 64 Jan  3 18:23 19 -> /root
l-wx------ 1 root root 64 Jan  3 18:23 2 -> /dev/null
lrwx------ 1 root root 64 Jan  3 18:23 20 -> 'socket:[1725866577]'
lrwx------ 1 root root 64 Jan  3 18:23 21 -> 'socket:[1725866579]'
lr-x------ 1 root root 64 Jan  3 18:23 22 -> /root/etc
lr-x------ 1 root root 64 Jan  3 18:23 23 -> /root/etc/resolv.conf
lr-x------ 1 root root 64 Jan  3 18:23 24 -> /root/etc/resolv.conf
lrwx------ 1 root root 64 Jan  3 18:23 25 -> 'socket:[1725863574]'
lrwx------ 1 root root 64 Jan  3 18:23 26 -> 'socket:[1725863579]'
lr-x------ 1 root root 64 Jan  3 18:23 27 -> /root/pause
lr-x------ 1 root root 64 Jan  3 18:23 28 -> /root/pause
l-wx------ 1 root root 64 Jan  3 18:23 3 -> /run/containerd/io.containerd.runtime.v2.task/k8s.io/90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8/log.json
lr-x------ 1 root root 64 Jan  3 18:23 4 -> /run/containerd/io.containerd.runtime.v2.task/k8s.io/90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8/config.json
lrwx------ 1 root root 64 Jan  3 18:23 5 -> 'anon_inode:[eventfd]'
lrwx------ 1 root root 64 Jan  3 18:23 6 -> 'socket:[1725867294]'
lrwx------ 1 root root 64 Jan  3 18:23 7 -> 'socket:[1725867296]'
lr-x------ 1 root root 64 Jan  3 18:23 8 -> 'pipe:[1725864402]'
lr-x------ 1 root root 64 Jan  3 18:23 9 -> 'pipe:[1725863550]'
$ ls -l /proc/2392492/fd
total 0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 0 -> /dev/null
l-wx------ 1 nobody nogroup 64 Jan  3 18:23 1 -> /dev/null
lrwx------ 1 nobody nogroup 64 Jan  3 18:23 10 -> 'socket:[1725867313]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 100 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/dumper.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 101 -> /usr/local/lib/python3.8/site-packages/drf_yasg/__pycache__/openapi.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 102 -> /usr/local/lib/python3.8/site-packages/inflection/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 103 -> /usr/local/lib/python3.8/site-packages/_cffi_backend.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 104 -> /usr/local/lib/python3.8/site-packages/rest_framework/templatetags/__pycache__/rest_framework.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 105 -> /usr/local/lib/python3.8/site-packages/drf_yasg/inspectors/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 106 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/versioning.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 107 -> /usr/local/lib/python3.8/lib-dynload/_posixsubprocess.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 108 -> /usr/local/lib/python3.8/lib-dynload/select.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 109 -> /usr/local/lib/python3.8/lib-dynload/_ctypes.cpython-38-x86_64-linux-gnu.so
lrwx------ 1 nobody nogroup 64 Jan  3 18:23 11 -> '/memfd:systrap-memory (deleted)'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 110 -> /usr/local/lib/python3.8/site-packages/drf_yasg/inspectors/__pycache__/base.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 111 -> /usr/local/lib/python3.8/site-packages/drf_yasg/__pycache__/errors.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 112 -> /usr/local/lib/python3.8/site-packages/urllib3/__pycache__/exceptions.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 113 -> /usr/local/lib/python3.8/site-packages/django/contrib/staticfiles/__pycache__/views.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 114 -> /usr/local/lib/python3.8/site-packages/drf_yasg/__pycache__/generators.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 115 -> /usr/local/lib/python3.8/lib-dynload/pyexpat.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 116 -> /usr/lib/libexpat.so.1.6.11
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 117 -> /usr/local/lib/python3.8/site-packages/packaging/__pycache__/version.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 118 -> /usr/local/lib/python3.8/site-packages/packaging/__pycache__/_structures.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 119 -> /usr/local/lib/python3.8/site-packages/urllib3/util/__pycache__/__init__.cpython-38.pyc
lrwx------ 1 nobody nogroup 64 Jan  3 18:23 12 -> '/memfd:runsc-memory (deleted)'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 120 -> /usr/local/lib/python3.8/site-packages/urllib3/__pycache__/fields.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 121 -> /usr/local/lib/python3.8/site-packages/urllib3/util/__pycache__/connection.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 122 -> /usr/local/lib/python3.8/site-packages/dal_select2/__pycache__/fields.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 124 -> /usr/local/lib/python3.8/site-packages/urllib3/__pycache__/_collections.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 125 -> /usr/local/lib/python3.8/site-packages/drf_yasg/inspectors/__pycache__/query.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 126 -> /usr/local/lib/python3.8/site-packages/drf_yasg/inspectors/__pycache__/view.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 127 -> /usr/local/lib/python3.8/site-packages/django/contrib/contenttypes/__pycache__/views.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 128 -> /usr/local/lib/python3.8/site-packages/jet/__pycache__/urls.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 129 -> /usr/local/lib/python3.8/lib-dynload/zlib.cpython-38-x86_64-linux-gnu.so
lrwx------ 1 nobody nogroup 64 Jan  3 18:23 13 -> 'anon_inode:[eventfd]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 130 -> /lib/libz.so.1.2.11
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 131 -> /usr/local/lib/python3.8/site-packages/coreapi/transports/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 132 -> /usr/local/lib/python3.8/site-packages/jet/__pycache__/views.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 133 -> /usr/local/lib/python3.8/lib-dynload/_bz2.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 134 -> /usr/lib/libbz2.so.1.0.8
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 135 -> /usr/local/lib/python3.8/site-packages/django/middleware/__pycache__/http.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 136 -> /usr/local/lib/python3.8/lib-dynload/_lzma.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 137 -> /usr/lib/liblzma.so.5.2.4
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 138 -> /usr/local/lib/python3.8/lib-dynload/grp.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 139 -> /usr/local/lib/python3.8/site-packages/django_extensions/templatetags/__pycache__/syntax_color.cpython-38.pyc
lrwx------ 1 nobody nogroup 64 Jan  2 17:19 14 -> 'anon_inode:[eventfd]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 140 -> /usr/lib/libpoppler.so.92.0.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 141 -> /usr/lib/libjson-c.so.4.0.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 142 -> /usr/lib/libgeos_c.so.1.13.1
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 143 -> /usr/local/lib/python3.8/site-packages/coreapi/transports/__pycache__/base.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 144 -> /usr/lib/libodbcinst.so.2.0.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 145 -> /usr/lib/libtiff.so.5.5.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 147 -> /usr/local/lib/python3.8/lib-dynload/_opcode.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 148 -> /usr/lib/libxml2.so.2.9.12
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 149 -> /usr/lib/libstdc++.so.6.0.28
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 15 -> 'pipe:[1725867330]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 150 -> /usr/lib/libfontconfig.so.1.12.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 151 -> /usr/lib/libicuuc.so.64.2
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 152 -> /usr/local/lib/python3.8/site-packages/django/templatetags/__pycache__/cache.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 153 -> /usr/local/lib/python3.8/site-packages/django/templatetags/__pycache__/l10n.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 154 -> /usr/local/lib/python3.8/site-packages/jet/dashboard/templatetags/__pycache__/jet_dashboard_tags.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 155 -> /usr/local/lib/python3.8/site-packages/pdfkit/__pycache__/api.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 156 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/urls.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 157 -> /usr/local/lib/python3.8/site-packages/uritemplate/__pycache__/api.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 158 -> /usr/local/lib/python3.8/site-packages/uritemplate/__pycache__/orderedset.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 159 -> /usr/local/lib/python3.8/site-packages/jet/templatetags/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 16 -> 'pipe:[1725867342]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 160 -> /usr/local/lib/python3.8/site-packages/jet/templatetags/__pycache__/jet_tags.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 161 -> /usr/local/lib/python3.8/site-packages/braces/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 162 -> /usr/local/lib/python3.8/site-packages/braces/views/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 163 -> /usr/local/lib/python3.8/site-packages/django/contrib/admin/templatetags/__pycache__/admin_list.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 164 -> /usr/local/lib/python3.8/lib-dynload/_socket.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 165 -> /usr/local/lib/python3.8/site-packages/uritemplate/__pycache__/template.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 166 -> /usr/local/lib/python3.8/lib-dynload/_ssl.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 167 -> /usr/local/lib/python3.8/site-packages/braces/views/__pycache__/_ajax.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 168 -> /usr/local/lib/python3.8/site-packages/django/contrib/admin/views/__pycache__/main.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 169 -> /usr/local/lib/python3.8/lib-dynload/_struct.cpython-38-x86_64-linux-gnu.so
lrwx------ 1 nobody nogroup 64 Jan  3 18:23 17 -> 'anon_inode:[eventpoll]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 170 -> /usr/local/lib/python3.8/lib-dynload/binascii.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 171 -> /usr/local/lib/python3.8/site-packages/coreapi/__pycache__/client.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 172 -> /usr/local/lib/python3.8/site-packages/markdown/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 173 -> /usr/local/lib/python3.8/site-packages/markdown/__pycache__/core.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 174 -> /usr/local/lib/python3.8/site-packages/markdown/__pycache__/util.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 175 -> /usr/local/lib/python3.8/site-packages/markdown/__pycache__/preprocessors.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 176 -> /usr/local/lib/python3.8/site-packages/markdown/__pycache__/inlinepatterns.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 177 -> /usr/local/lib/python3.8/site-packages/braces/views/__pycache__/_other.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 178 -> /usr/local/lib/python3.8/lib-dynload/_contextvars.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 179 -> /usr/local/lib/python3.8/site-packages/braces/views/__pycache__/_queries.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 18 -> 'pipe:[1725867337]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 180 -> /usr/local/lib/python3.8/lib-dynload/_asyncio.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 181 -> /usr/local/lib/python3.8/site-packages/django/contrib/admin/templatetags/__pycache__/base.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 182 -> /usr/local/lib/python3.8/site-packages/django/contrib/admin/templatetags/__pycache__/admin_modify.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 183 -> /usr/local/lib/python3.8/site-packages/requests/__pycache__/_internal_utils.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 184 -> /usr/local/lib/python3.8/site-packages/django/contrib/admin/templatetags/__pycache__/log.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 185 -> /usr/local/lib/python3.8/site-packages/django/core/cache/__pycache__/utils.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 186 -> /usr/local/lib/python3.8/site-packages/django/templatetags/__pycache__/tz.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 187 -> /usr/local/lib/python3.8/site-packages/rest_framework/utils/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 188 -> /usr/local/lib/python3.8/site-packages/jet/dashboard/templatetags/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 189 -> /usr/local/lib/python3.8/site-packages/rest_framework/utils/__pycache__/serializer_helpers.cpython-38.pyc
l-wx------ 1 nobody nogroup 64 Jan  3 18:23 19 -> 'pipe:[1725867330]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 190 -> /usr/local/lib/python3.8/site-packages/rangefilter/templatetags/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 191 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/status.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 192 -> /usr/local/lib/python3.8/site-packages/rest_framework/utils/__pycache__/json.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 193 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/fields.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 194 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/exceptions.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 195 -> /usr/local/lib/python3.8/site-packages/markdown/__pycache__/treeprocessors.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 196 -> /usr/lib/libgobject-2.0.so.0.6200.6
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 197 -> /usr/local/lib/python3.8/site-packages/rest_framework/utils/__pycache__/html.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 198 -> /usr/local/lib/python3.8/site-packages/rest_framework/utils/__pycache__/humanize_datetime.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 199 -> /usr/lib/libglib-2.0.so.0.6200.6
l-wx------ 1 nobody nogroup 64 Jan  3 18:23 2 -> /var/log/pods/<redacted>/gvisor_panic.log
l-wx------ 1 nobody nogroup 64 Jan  2 17:19 20 -> 'pipe:[1725867337]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 200 -> /usr/lib/libpango-1.0.so.0.4400.7
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 201 -> /usr/lib/libfribidi.so.0.4.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 202 -> /usr/local/lib/python3.8/lib-dynload/_queue.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 203 -> /usr/lib/libharfbuzz.so.0.20600.4
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 204 -> /usr/lib/libgraphite2.so.3.2.1
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 205 -> /usr/local/lib/python3.8/site-packages/urllib3/__pycache__/poolmanager.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 206 -> /usr/local/lib/python3.8/site-packages/rest_framework/utils/__pycache__/formatting.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 207 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/validators.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 208 -> /usr/local/lib/python3.8/lib-dynload/_bisect.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 209 -> /usr/local/lib/python3.8/lib-dynload/_sha512.cpython-38-x86_64-linux-gnu.so
lrwx------ 1 nobody nogroup 64 Jan  3 18:23 21 -> 'anon_inode:[eventfd]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 210 -> /usr/local/lib/python3.8/lib-dynload/_random.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 211 -> /usr/local/lib/python3.8/site-packages/rest_framework/utils/__pycache__/field_mapping.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 212 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/relations.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 213 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/reverse.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 214 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/pagination.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 215 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/response.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 216 -> /usr/local/lib/python3.8/site-packages/jet/__pycache__/admin.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 217 -> /usr/local/lib/python3.8/site-packages/django/contrib/gis/admin/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 218 -> /usr/local/lib/python3.8/site-packages/django/contrib/gis/admin/__pycache__/options.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 219 -> /usr/local/lib/python3.8/site-packages/django/contrib/gis/admin/__pycache__/widgets.cpython-38.pyc
lrwx------ 1 nobody nogroup 64 Jan  3 18:23 22 -> 'socket:[1725867313]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 220 -> /usr/local/lib/python3.8/site-packages/django/contrib/auth/__pycache__/admin.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 221 -> /usr/local/lib/python3.8/site-packages/django/contrib/auth/__pycache__/forms.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 222 -> /usr/local/lib/python3.8/site-packages/rangefilter/templatetags/__pycache__/rangefilter_compat.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 223 -> /usr/local/lib/python3.8/site-packages/django_extensions/templatetags/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 224 -> /usr/local/lib/python3.8/site-packages/django/contrib/contenttypes/__pycache__/admin.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 225 -> /usr/local/lib/python3.8/site-packages/django_extensions/templatetags/__pycache__/debugger_tags.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 226 -> /usr/local/lib/python3.8/site-packages/django_extensions/templatetags/__pycache__/highlighting.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 227 -> /usr/local/lib/python3.8/site-packages/django_extensions/templatetags/__pycache__/indent_text.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 228 -> /usr/local/lib/python3.8/site-packages/django_extensions/templatetags/__pycache__/widont.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 229 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/templatetags/__pycache__/__init__.cpython-38.pyc
l-wx------ 1 nobody nogroup 64 Jan  3 18:23 23 -> 'pipe:[1725867342]'
lr-x------ 1 nobody nogroup 64 Jan  3 21:43 232 -> /etc/passwd
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 237 -> /usr/local/lib/python3.8/site-packages/idna/__pycache__/intranges.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 238 -> /usr/local/lib/python3.8/lib-dynload/_hashlib.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 239 -> /usr/local/lib/python3.8/lib-dynload/_blake2.cpython-38-x86_64-linux-gnu.so
lrwx------ 1 nobody nogroup 64 Jan  3 18:23 24 -> 'anon_inode:[eventfd]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 240 -> /usr/local/lib/python3.8/lib-dynload/_sha3.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 241 -> /usr/local/lib/python3.8/site-packages/idna/__pycache__/idnadata.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 242 -> /usr/local/lib/python3.8/site-packages/charset_normalizer/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 243 -> /usr/local/lib/python3.8/lib-dynload/_pickle.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 244 -> /usr/local/lib/python3.8/site-packages/certifi/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 245 -> /usr/local/lib/python3.8/site-packages/requests/__pycache__/certs.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 249 -> /usr/local/lib/python3.8/lib-dynload/fcntl.cpython-38-x86_64-linux-gnu.so
lrwx------ 1 nobody nogroup 64 Jan  2 17:19 25 -> 'socket:[1725869537]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 256 -> /dev/null
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 257 -> /dev/null
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 258 -> /dev/null
lrwx------ 1 nobody nogroup 64 Jan  2 17:19 26 -> 'anon_inode:[eventfd]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 261 -> /usr/local/lib/python3.8/site-packages/django/contrib/contenttypes/__pycache__/fields.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 262 -> /usr/local/lib/python3.8/site-packages/django/contrib/contenttypes/__pycache__/forms.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 266 -> /usr/local/lib/python3.8/lib-dynload/_decimal.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 268 -> /usr/local/lib/python3.8/site-packages/django_extensions/admin/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 269 -> /usr/local/lib/python3.8/lib-dynload/termios.cpython-38-x86_64-linux-gnu.so
lrwx------ 1 nobody nogroup 64 Jan  2 17:19 27 -> 'anon_inode:[eventfd]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 270 -> /usr/local/lib/python3.8/site-packages/django_extensions/admin/__pycache__/widgets.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 271 -> /usr/local/lib/python3.8/site-packages/constance/__pycache__/admin.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 273 -> /usr/local/lib/python3.8/site-packages/django_celery_beat/__pycache__/admin.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 274 -> /usr/local/lib/python3.8/site-packages/django_celery_beat/__pycache__/signals.cpython-38.pyc
lrwx------ 1 nobody nogroup 64 Jan  3 18:23 28 -> 'socket:[1725866578]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 280 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/__pycache__/toolbar.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 281 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 282 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/__pycache__/utils.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 283 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/history/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 284 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/history/__pycache__/panel.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 285 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/history/__pycache__/views.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 286 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/__pycache__/decorators.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 287 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/history/__pycache__/forms.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 288 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/__pycache__/versions.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 289 -> /usr/local/lib/python3.8/lib-dynload/_csv.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 29 -> /root/pause
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 293 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/__pycache__/timer.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 297 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/__pycache__/settings.cpython-38.pyc
l-wx------ 1 nobody nogroup 64 Jan  3 18:23 3 -> /run/containerd/io.containerd.runtime.v2.task/k8s.io/90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8/log.json
lrwx------ 1 nobody nogroup 64 Jan  3 18:23 30 -> 'socket:[1725866580]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 304 -> /usr/local/lib/python3.8/site-packages/certifi/__pycache__/core.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 305 -> /usr/local/lib/python3.8/site-packages/requests/__pycache__/cookies.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 308 -> /usr/local/lib/python3.8/site-packages/backports/zoneinfo/_czoneinfo.cpython-38-x86_64-linux-gnu.so
lrwx------ 1 nobody nogroup 64 Jan  3 18:23 31 -> 'socket:[1725863575]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 311 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/__pycache__/headers.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 313 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/__pycache__/request.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 314 -> /usr/local/lib/python3.8/lib-dynload/_uuid.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 315 -> /lib/libuuid.so.1.3.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 316 -> /usr/local/lib/python3.8/lib-dynload/array.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 318 -> /usr/local/lib/python3.8/site-packages/fontTools/misc/bezierTools.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 319 -> /usr/local/lib/python3.8/site-packages/qrcode/__pycache__/main.cpython-38.pyc
lrwx------ 1 nobody nogroup 64 Jan  3 18:23 32 -> 'socket:[1725863580]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 320 -> /usr/local/lib/python3.8/site-packages/requests/__pycache__/structures.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 321 -> /usr/local/lib/python3.8/site-packages/requests/__pycache__/api.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 322 -> /usr/local/lib/python3.8/site-packages/requests/__pycache__/sessions.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 323 -> /usr/local/lib/python3.8/site-packages/requests/__pycache__/adapters.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 324 -> /usr/local/lib/python3.8/site-packages/requests/__pycache__/auth.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 325 -> /usr/local/lib/python3.8/site-packages/requests/__pycache__/models.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 326 -> /usr/local/lib/python3.8/encodings/idna.py
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 327 -> /usr/local/lib/python3.8/site-packages/requests/__pycache__/hooks.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 328 -> /usr/local/lib/python3.8/site-packages/urllib3/__pycache__/filepost.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 329 -> /usr/local/lib/python3.8/site-packages/requests/__pycache__/status_codes.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 33 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/views.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 330 -> /usr/local/lib/python3.8/site-packages/urllib3/contrib/__pycache__/socks.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 331 -> /usr/local/lib/python3.8/site-packages/qrcode/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 332 -> /usr/local/lib/python3.8/site-packages/idna/__pycache__/package_data.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 333 -> /usr/local/lib/python3.8/site-packages/qrcode/__pycache__/constants.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 334 -> /usr/local/lib/python3.8/site-packages/qrcode/__pycache__/exceptions.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 335 -> /usr/local/lib/python3.8/site-packages/qrcode/__pycache__/util.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 336 -> /usr/local/lib/python3.8/site-packages/qrcode/__pycache__/LUT.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 337 -> /usr/local/lib/python3.8/site-packages/urllib3/util/__pycache__/url.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 338 -> /usr/local/lib/python3.8/lib-dynload/resource.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 339 -> /usr/local/lib/python3.8/site-packages/qrcode/__pycache__/base.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 34 -> /lib/libssl.so.1.1
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 340 -> /usr/local/lib/python3.8/site-packages/charset_normalizer/__pycache__/api.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 341 -> /usr/local/lib/python3.8/site-packages/urllib3/__pycache__/connection.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 342 -> /usr/local/lib/python3.8/site-packages/urllib3/__pycache__/response.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 343 -> /usr/local/lib/python3.8/site-packages/urllib3/__pycache__/_request_methods.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 344 -> /usr/local/lib/python3.8/site-packages/qrcode/image/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 345 -> /usr/local/lib/python3.8/site-packages/urllib3/util/__pycache__/ssltransport.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 346 -> /usr/local/lib/python3.8/site-packages/urllib3/__pycache__/connectionpool.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 347 -> /usr/local/lib/python3.8/site-packages/qrcode/image/__pycache__/base.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 348 -> /usr/local/lib/python3.8/site-packages/urllib3/util/__pycache__/ssl_match_hostname.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 349 -> /usr/local/lib/python3.8/site-packages/requests/__pycache__/exceptions.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 35 -> /lib/libcrypto.so.1.1
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 350 -> /usr/local/lib/python3.8/site-packages/qrcode/image/styles/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 351 -> /usr/local/lib/python3.8/site-packages/urllib3/util/__pycache__/retry.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 352 -> /usr/local/lib/python3.8/site-packages/urllib3/util/__pycache__/ssl_.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 353 -> /usr/local/lib/python3.8/site-packages/requests/__pycache__/compat.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 354 -> /usr/local/lib/python3.8/site-packages/urllib3/util/__pycache__/wait.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 355 -> /usr/local/lib/python3.8/site-packages/qrcode/image/styles/moduledrawers/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 356 -> /usr/local/lib/python3.8/site-packages/qrcode/image/styles/moduledrawers/__pycache__/pil.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 357 -> /usr/local/lib/python3.8/site-packages/urllib3/__pycache__/_version.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 358 -> /usr/local/lib/python3.8/site-packages/qrcode/compat/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 359 -> /usr/local/lib/python3.8/site-packages/urllib3/util/__pycache__/util.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 360 -> /usr/local/lib/python3.8/site-packages/qrcode/compat/__pycache__/pil.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 361 -> /usr/local/lib/python3.8/site-packages/urllib3/util/__pycache__/request.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 362 -> /usr/local/lib/python3.8/site-packages/PIL/__pycache__/ImageDraw.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 363 -> /usr/local/lib/python3.8/site-packages/qrcode/image/styles/moduledrawers/__pycache__/base.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 364 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/sql/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 365 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/sql/__pycache__/panel.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 366 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/__pycache__/forms.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 37 -> /usr/local/lib/python3.8/site-packages/rest_framework/schemas/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 370 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/sql/__pycache__/views.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 372 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/sql/__pycache__/forms.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 374 -> /usr/local/lib/python3.8/site-packages/django/contrib/sessions/__pycache__/serializers.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 375 -> /usr/local/lib/python3.8/site-packages/django/contrib/messages/storage/__pycache__/base.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 376 -> /usr/local/lib/python3.8/lib-dynload/_json.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 377 -> /usr/local/lib/python3.8/site-packages/django/contrib/messages/__pycache__/utils.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 387 -> /usr/local/lib/python3.8/site-packages/requests/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 388 -> /usr/local/lib/python3.8/site-packages/urllib3/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 389 -> /usr/local/lib/python3.8/site-packages/urllib3/util/__pycache__/timeout.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 39 -> /usr/local/lib/python3.8/site-packages/rest_framework/schemas/__pycache__/coreapi.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 390 -> /usr/local/lib/python3.8/site-packages/urllib3/util/__pycache__/response.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 391 -> /usr/local/lib/python3.8/site-packages/urllib3/util/__pycache__/proxy.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 392 -> /usr/local/lib/python3.8/lib-dynload/_elementtree.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 393 -> /usr/local/lib/python3.8/site-packages/charset_normalizer/__pycache__/cd.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 394 -> /usr/local/lib/python3.8/site-packages/qrcode/image/__pycache__/pure.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 395 -> /usr/local/lib/python3.8/site-packages/PIL/_imaging.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 396 -> /usr/local/lib/python3.8/site-packages/Pillow.libs/libjpeg-6aa56261.so.62.3.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 397 -> /usr/local/lib/python3.8/site-packages/Pillow.libs/libopenjp2-91bd27bd.so.2.5.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 398 -> /usr/local/lib/python3.8/site-packages/Pillow.libs/libtiff-9c086da0.so.6.0.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 399 -> /usr/local/lib/python3.8/site-packages/Pillow.libs/libxcb-7e4cbcc5.so.1.1.0
l-wx------ 1 nobody nogroup 64 Jan  3 18:23 4 -> /var/log/pods/<redacted>/gvisor_panic.log
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 40 -> /usr/local/lib/python3.8/site-packages/corsheaders/__pycache__/signals.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 400 -> /usr/local/lib/python3.8/site-packages/__pycache__/png.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 401 -> /
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 402 -> /
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 403 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/serializers.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 404 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/compat.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 405 -> /usr/local/lib/python3.8/site-packages/django/contrib/postgres/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 406 -> /usr/local/lib/python3.8/site-packages/_brotli.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 407 -> /usr/local/lib/python3.8/site-packages/django/contrib/postgres/fields/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 408 -> /usr/local/lib/python3.8/site-packages/django/contrib/postgres/fields/__pycache__/array.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 409 -> /usr/local/lib/python3.8/site-packages/django/contrib/postgres/__pycache__/lookups.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 41 -> /usr/local/lib/python3.8/site-packages/django/contrib/sessions/__pycache__/middleware.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 410 -> /usr/local/lib/python3.8/site-packages/django/contrib/postgres/__pycache__/search.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 411 -> /usr/local/lib/python3.8/site-packages/django/contrib/postgres/forms/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 412 -> /usr/local/lib/python3.8/site-packages/django/contrib/postgres/forms/__pycache__/array.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 413 -> /usr/local/lib/python3.8/site-packages/django/contrib/postgres/__pycache__/validators.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 414 -> /usr/local/lib/python3.8/site-packages/django/contrib/postgres/__pycache__/utils.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 415 -> /usr/local/lib/python3.8/site-packages/django/contrib/postgres/forms/__pycache__/hstore.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 416 -> /usr/local/lib/python3.8/site-packages/django/contrib/postgres/forms/__pycache__/jsonb.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 417 -> /usr/local/lib/python3.8/site-packages/django/contrib/postgres/forms/__pycache__/ranges.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 418 -> /usr/local/lib/python3.8/site-packages/django/contrib/postgres/fields/__pycache__/utils.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 419 -> /usr/local/lib/python3.8/site-packages/django/contrib/postgres/fields/__pycache__/citext.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 42 -> /usr/local/bin/python3.8
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 420 -> /usr/local/lib/python3.8/site-packages/charset_normalizer/__pycache__/constant.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 421 -> /usr/local/lib/python3.8/site-packages/django/contrib/postgres/fields/__pycache__/hstore.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 422 -> /usr/local/lib/python3.8/site-packages/django/contrib/postgres/fields/__pycache__/jsonb.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 423 -> /usr/local/lib/python3.8/site-packages/django/contrib/postgres/fields/__pycache__/ranges.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 424 -> /usr/local/lib/python3.8/site-packages/coreapi/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 425 -> /usr/local/lib/python3.8/site-packages/coreapi/__pycache__/auth.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 426 -> /usr/local/lib/python3.8/site-packages/coreapi/__pycache__/utils.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 427 -> /usr/local/lib/python3.8/site-packages/coreapi/__pycache__/exceptions.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 428 -> /usr/share/zoneinfo/America/Argentina/Buenos_Aires
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 429 -> /usr/local/lib/python3.8/site-packages/coreapi/__pycache__/compat.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 43 -> /usr/local/lib/libpython3.8.so.1.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 430 -> /usr/local/lib/python3.8/site-packages/coreapi/codecs/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 431 -> /usr/local/lib/python3.8/site-packages/coreapi/codecs/__pycache__/base.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 432 -> /usr/local/lib/python3.8/site-packages/__pycache__/itypes.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 433 -> /usr/local/lib/python3.8/site-packages/coreapi/codecs/__pycache__/corejson.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 434 -> /usr/local/lib/python3.8/site-packages/coreapi/__pycache__/document.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 435 -> /usr/local/lib/python3.8/site-packages/coreschema/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 436 -> /usr/local/lib/python3.8/site-packages/coreschema/__pycache__/schemas.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 437 -> /usr/local/lib/python3.8/site-packages/coreschema/__pycache__/compat.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 438 -> /usr/local/lib/python3.8/site-packages/coreschema/__pycache__/formats.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 439 -> /usr/local/lib/python3.8/site-packages/coreschema/__pycache__/utils.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 440 -> /usr/local/lib/python3.8/site-packages/coreschema/encodings/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 441 -> /usr/local/lib/python3.8/site-packages/coreschema/encodings/__pycache__/html.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 442 -> /usr/local/lib/python3.8/site-packages/coreapi/codecs/__pycache__/display.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 443 -> /usr/local/lib/python3.8/site-packages/coreapi/codecs/__pycache__/download.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 444 -> /usr/local/lib/python3.8/site-packages/coreapi/codecs/__pycache__/python.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 445 -> /usr/local/lib/python3.8/site-packages/coreapi/codecs/__pycache__/text.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 446 -> /usr/local/lib/python3.8/site-packages/coreapi/transports/__pycache__/http.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 447 -> /usr/local/lib/python3.8/site-packages/uritemplate/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 448 -> /usr/local/lib/python3.8/site-packages/uritemplate/__pycache__/variable.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 449 -> /usr/local/lib/python3.8/site-packages/markdown/__pycache__/htmlparser.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 450 -> /usr/local/lib/python3.8/html/parser.py
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 451 -> /usr/local/lib/python3.8/site-packages/markdown/__pycache__/blockprocessors.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 452 -> /usr/local/lib/python3.8/site-packages/markdown/__pycache__/postprocessors.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 453 -> /usr/local/lib/python3.8/site-packages/markdown/__pycache__/__meta__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 454 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/sql/__pycache__/utils.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 455 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/sql/__pycache__/tracking.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 456 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/__pycache__/staticfiles.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 457 -> /usr/local/lib/python3.8/site-packages/django/contrib/staticfiles/__pycache__/storage.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 458 -> /usr/local/lib/python3.8/site-packages/django/contrib/messages/storage/__pycache__/cookie.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 459 -> /usr/local/lib/python3.8/site-packages/django/contrib/messages/storage/__pycache__/session.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 460 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/templates/__pycache__/views.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 461 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/__pycache__/cache.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 462 -> /usr/local/lib/python3.8/site-packages/rest_framework_simplejwt/__pycache__/views.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 463 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/generics.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 464 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/__pycache__/profiling.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 465 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/mixins.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 466 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/request.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 467 -> /usr/local/lib/python3.8/site-packages/rest_framework/utils/__pycache__/encoders.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 468 -> /usr/local/lib/python3.8/site-packages/rest_framework/utils/__pycache__/breadcrumbs.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 469 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/authentication.cpython-38.pyc
lrwx------ 1 nobody nogroup 64 Jan  3 18:23 47 -> '/run/containerd/io.containerd.runtime.v2.task/k8s.io/5b6ae772c71b57b0a00297b775a1900bcd30fcd9a6c8ccca2e49573829ae636f/rootfs/.gvisor.filestore.90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8 (deleted)'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 470 -> /usr/local/lib/python3.8/site-packages/rest_framework_simplejwt/__pycache__/authentication.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 471 -> /usr/local/lib/python3.8/site-packages/rest_framework_simplejwt/__pycache__/exceptions.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 472 -> /usr/local/lib/python3.8/site-packages/django/contrib/messages/__pycache__/middleware.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 473 -> /usr/local/lib/python3.8/site-packages/django/contrib/auth/__pycache__/backends.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 474 -> /usr/local/lib/python3.8/lib-dynload/unicodedata.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 475 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/permissions.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 476 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/parsers.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 477 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/negotiation.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 478 -> /usr/local/lib/python3.8/site-packages/markupsafe/_speedups.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 479 -> /usr/local/lib/python3.8/site-packages/rest_framework/utils/__pycache__/mediatypes.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 480 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/metadata.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 481 -> /usr/local/lib/python3.8/site-packages/rest_framework_simplejwt/__pycache__/serializers.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 482 -> /usr/local/lib/python3.8/site-packages/rest_framework_simplejwt/__pycache__/tokens.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 483 -> /usr/local/lib/python3.8/site-packages/rest_framework_simplejwt/token_blacklist/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 484 -> /usr/local/lib/python3.8/site-packages/rest_framework_simplejwt/token_blacklist/__pycache__/models.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 485 -> /usr/local/lib/python3.8/site-packages/drf_yasg/__pycache__/views.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 486 -> /usr/local/lib/python3.8/site-packages/django/views/decorators/__pycache__/vary.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 487 -> /usr/local/lib/python3.8/site-packages/drf_yasg/__pycache__/app_settings.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 488 -> /usr/local/lib/python3.8/site-packages/drf_yasg/__pycache__/renderers.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 489 -> /usr/local/lib/python3.8/site-packages/drf_yasg/__pycache__/codecs.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 490 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 491 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/cyaml.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 492 -> /usr/local/lib/python3.8/site-packages/_ruamel_yaml.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 493 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/error.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 494 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/reader.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 495 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/util.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 496 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/compat.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 497 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/docinfo.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 498 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/scanner.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 499 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/tokens.cpython-38.pyc
lrwx------ 1 nobody nogroup 64 Jan  3 18:23 5 -> 'socket:[1725867293]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 500 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/parser.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 501 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/events.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 502 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/tag.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 503 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/comments.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 504 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/scalarstring.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 505 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/anchor.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 506 -> /usr/local/lib/python3.8/site-packages/psycopg2/_psycopg.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 507 -> /usr/lib/libpq.so.5.12
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 508 -> /usr/lib/libldap_r-2.4.so.2.10.11
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 509 -> /usr/lib/liblber-2.4.so.2.10.11
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 510 -> /usr/lib/libsasl2.so.3.0.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 511 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/composer.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 512 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/nodes.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 513 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/constructor.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 514 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/scalarint.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 515 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/main.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 516 -> /usr/local/lib/python3.8/site-packages/drf_yasg/__pycache__/utils.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 517 -> /usr/local/lib/python3.8/site-packages/django/contrib/staticfiles/__pycache__/urls.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 518 -> /usr/local/lib/python3.8/site-packages/rest_framework/templatetags/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 519 -> /usr/local/lib/python3.8/site-packages/drf_yasg/inspectors/__pycache__/field.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 520 -> /usr/local/lib/python3.8/site-packages/packaging/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 521 -> /usr/local/lib/python3.8/site-packages/django/views/decorators/__pycache__/http.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 522 -> /usr/local/lib/python3.8/site-packages/jet/__pycache__/forms.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 523 -> /usr/local/lib/python3.8/site-packages/jet/dashboard/__pycache__/urls.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 524 -> /usr/local/lib/python3.8/site-packages/jet/dashboard/__pycache__/dashboard.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 525 -> /usr/local/lib/python3.8/site-packages/jet/dashboard/__pycache__/modules.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 526 -> /usr/local/lib/python3.8/site-packages/jet/__pycache__/ordered_set.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 527 -> /usr/local/lib/python3.8/site-packages/django/template/__pycache__/context_processors.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 528 -> /usr/local/lib/python3.8/site-packages/charset_normalizer/md.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 529 -> /usr/local/lib/python3.8/site-packages/jet/dashboard/__pycache__/views.cpython-38.pyc
lrwx------ 1 nobody nogroup 64 Jan  3 18:23 53 -> 'anon_inode:[eventpoll]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 530 -> /usr/local/lib/python3.8/site-packages/jet/dashboard/__pycache__/forms.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 531 -> /usr/local/lib/python3.8/site-packages/jet/dashboard/__pycache__/utils.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 532 -> /usr/local/lib/python3.8/site-packages/jet/dashboard/__pycache__/settings.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 533 -> /usr/local/lib/python3.8/site-packages/pdfkit/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 534 -> /usr/local/lib/python3.8/site-packages/pdfkit/__pycache__/pdfkit.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 535 -> /usr/local/lib/python3.8/site-packages/pdfkit/__pycache__/source.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 536 -> /usr/local/lib/python3.8/site-packages/pdfkit/__pycache__/configuration.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 537 -> /usr/local/lib/python3.8/site-packages/django/contrib/auth/__pycache__/views.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 538 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/routers.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 539 -> /usr/local/lib/python3.8/site-packages/rest_framework/schemas/__pycache__/views.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 540 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/urlpatterns.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 541 -> /usr/local/lib/python3.8/site-packages/braces/views/__pycache__/_access.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 542 -> /usr/lib/libffi.so.6.0.4
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 543 -> /usr/local/lib/python3.8/site-packages/braces/views/__pycache__/_forms.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 544 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/viewsets.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 545 -> /usr/local/lib/python3.8/site-packages/django/template/loaders/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 546 -> /usr/local/lib/python3.8/site-packages/django/template/loaders/__pycache__/filesystem.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 547 -> /usr/local/lib/python3.8/site-packages/django/template/loaders/__pycache__/app_directories.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 548 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/decorators.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 549 -> /usr/local/lib/python3.8/site-packages/django/template/loaders/__pycache__/base.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 552 -> /usr/local/lib/python3.8/site-packages/django/template/loaders/__pycache__/cached.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 554 -> /usr/local/lib/python3.8/site-packages/fontTools/varLib/iup.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 21:43 555 -> /bin/busybox
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 556 -> /usr/lib/libgdal.so.26.0.3
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 557 -> /usr/local/lib/python3.8/site-packages/charset_normalizer/md__mypyc.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 558 -> /usr/local/lib/python3.8/site-packages/charset_normalizer/__pycache__/utils.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 559 -> /usr/local/lib/python3.8/lib-dynload/_multibytecodec.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 560 -> /usr/lib/libgcc_s.so.1
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 564 -> /usr/local/lib/python3.8/site-packages/charset_normalizer/__pycache__/models.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 565 -> /usr/local/lib/python3.8/site-packages/charset_normalizer/__pycache__/legacy.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 566 -> /usr/lib/libwebp.so.7.0.5
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 567 -> /usr/lib/libsqlite3.so.0.8.6
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 568 -> /usr/lib/libodbc.so.2.0.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 569 -> /usr/lib/libxerces-c-3.2.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 570 -> /usr/lib/libopenjp2.so.2.4.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 571 -> /usr/lib/libgif.so.7.2.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 572 -> /usr/lib/libjpeg.so.8.2.2
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 573 -> /usr/lib/libpng16.so.16.37.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 574 -> /usr/lib/libproj.so.15.2.1
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 575 -> /usr/lib/libpcre.so.1.2.11
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 576 -> /usr/lib/libcurl.so.4.7.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 577 -> /usr/lib/libmariadb.so.3
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 578 -> /usr/lib/libfreetype.so.6.17.1
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 579 -> /usr/lib/liblcms2.so.2.0.8
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 580 -> /usr/lib/libgeos-3.8.0.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 581 -> /usr/lib/libnghttp2.so.14.19.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 582 -> /usr/lib/libicudata.so.64.2
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 591 -> /usr/local/lib/python3.8/site-packages/charset_normalizer/__pycache__/version.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 592 -> /usr/local/lib/python3.8/http/cookiejar.py
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 593 -> /usr/local/lib/python3.8/site-packages/requests/__pycache__/packages.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 594 -> /usr/local/lib/python3.8/site-packages/idna/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 595 -> /usr/local/lib/python3.8/site-packages/idna/__pycache__/core.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 596 -> /usr/local/lib/python3.8/site-packages/requests/__pycache__/utils.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 597 -> /usr/local/lib/python3.8/stringprep.py
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 598 -> /usr/local/lib/python3.8/site-packages/urllib3/contrib/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 599 -> /usr/local/lib/python3.8/site-packages/markdown/__pycache__/blockparser.cpython-38.pyc
lrwx------ 1 nobody nogroup 64 Jan  3 18:23 6 -> 'socket:[1725867295]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 600 -> /usr/local/lib/python3.8/site-packages/markdown/extensions/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 601 -> /usr/local/lib/python3.8/site-packages/markdown/__pycache__/serializers.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 602 -> /usr/local/lib/python3.8/site-packages/rest_framework/utils/__pycache__/representation.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 603 -> /usr/local/lib/python3.8/site-packages/rest_framework/utils/__pycache__/model_meta.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 604 -> /usr/local/lib/python3.8/site-packages/rest_framework/utils/__pycache__/urls.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 605 -> /usr/local/lib/python3.8/site-packages/django/contrib/auth/__pycache__/tokens.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 606 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/templates/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 607 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/templates/__pycache__/panel.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 608 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/__pycache__/signals.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 609 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/__pycache__/logging.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 610 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/panels/__pycache__/redirects.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 611 -> /usr/local/lib/python3.8/cProfile.py
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 612 -> /usr/local/lib/python3.8/lib-dynload/_lsprof.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 613 -> /usr/local/lib/python3.8/profile.py
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 614 -> /usr/local/lib/python3.8/pstats.py
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 615 -> /usr/local/lib/python3.8/site-packages/debug_toolbar/__pycache__/middleware.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 616 -> /usr/local/lib/python3.8/site-packages/Pillow.libs/liblzma-78232b57.so.5.4.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 617 -> /usr/local/lib/python3.8/site-packages/Pillow.libs/libXau-52a9ca8d.so.6.0.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 618 -> /usr/local/lib/python3.8/site-packages/Pillow.libs/libXdmcp-4a845f28.so.6.0.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 619 -> /usr/local/lib/python3.8/site-packages/Pillow.libs/libbsd-ac21d37f.so.0.10.0
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 620 -> /usr/local/lib/python3.8/site-packages/django/middleware/__pycache__/clickjacking.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 663 -> /usr/local/lib/python3.8/site-packages/django/contrib/auth/__pycache__/middleware.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 664 -> /usr/local/lib/python3.8/site-packages/django/middleware/__pycache__/common.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 665 -> /usr/local/lib/python3.8/site-packages/corsheaders/__pycache__/middleware.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 67 -> /usr/local/lib/python3.8/site-packages/requests/__pycache__/__version__.cpython-38.pyc
lrwx------ 1 nobody nogroup 64 Jan  3 18:23 7 -> '/memfd:memory-usage (deleted)'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 70 -> /usr/lib/libpangoft2-1.0.so.0.4400.7
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 71 -> /lib/ld-musl-x86_64.so.1
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 72 -> /usr/local/lib/python3.8/site-packages/django/contrib/sessions/backends/__pycache__/__init__.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 73 -> /usr/local/lib/python3.8/site-packages/django/contrib/sessions/backends/__pycache__/base.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 74 -> /usr/local/lib/python3.8/site-packages/django/contrib/sessions/__pycache__/exceptions.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 75 -> /usr/local/lib/python3.8/site-packages/django/contrib/sessions/backends/__pycache__/db.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 76 -> /usr/local/lib/python3.8/site-packages/django/middleware/__pycache__/security.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 77 -> /usr/local/lib/python3.8/site-packages/django/contrib/messages/storage/__pycache__/fallback.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 78 -> /usr/local/lib/python3.8/site-packages/rest_framework/schemas/__pycache__/generators.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 79 -> /usr/local/lib/python3.8/site-packages/django/contrib/admindocs/__pycache__/__init__.cpython-38.pyc
lrwx------ 1 nobody nogroup 64 Jan  2 17:19 8 -> 'socket:[1725867362]'
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 80 -> /usr/local/lib/python3.8/site-packages/django/contrib/admindocs/__pycache__/views.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 81 -> /usr/local/lib/python3.8/site-packages/django/contrib/admin/views/__pycache__/decorators.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 82 -> /usr/local/lib/python3.8/site-packages/django/contrib/admindocs/__pycache__/utils.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 83 -> /usr/local/lib/python3.8/site-packages/rest_framework/schemas/__pycache__/inspectors.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 84 -> /usr/local/lib/python3.8/site-packages/rest_framework/schemas/__pycache__/utils.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 85 -> /usr/local/lib/python3.8/site-packages/rest_framework/schemas/__pycache__/openapi.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 86 -> /usr/local/lib/python3.8/site-packages/rest_framework/__pycache__/renderers.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 87 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/scalarfloat.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 88 -> /usr/local/lib/python3.8/site-packages/urllib3/__pycache__/_base_connection.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 89 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/scalarbool.cpython-38.pyc
l-wx------ 1 nobody nogroup 64 Jan  3 18:23 9 -> /var/log/pods/<redacted>/gvisor.log
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 90 -> /usr/local/lib/python3.8/lib-dynload/math.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 904 -> /usr/lib/libintl.so.8.1.6
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 91 -> /usr/local/lib/python3.8/lib-dynload/_datetime.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 92 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/timestamp.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 93 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/emitter.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 94 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/serializer.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 95 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/representer.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 96 -> /usr/local/lib/python3.8/site-packages/coreapi/codecs/__pycache__/jsondata.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 97 -> /usr/local/lib/python3.8/lib-dynload/_heapq.cpython-38-x86_64-linux-gnu.so
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 98 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/resolver.cpython-38.pyc
lr-x------ 1 nobody nogroup 64 Jan  3 18:23 99 -> /usr/local/lib/python3.8/site-packages/ruamel/yaml/__pycache__/loader.cpython-38.pyc
$ ls -l /proc/2392576/fd
total 0
lr-x------ 1 root root 64 Jan  4 22:22 0 -> /dev/null
l-wx------ 1 root root 64 Jan  4 22:22 1 -> 'pipe:[1725864422]'
lrwx------ 1 root root 64 Jan  4 22:22 10 -> 'socket:[1725869557]'
lrwx------ 1 root root 64 Jan  4 22:22 11 -> 'anon_inode:[eventfd]'
l-wx------ 1 root root 64 Jan  4 22:22 2 -> 'pipe:[1725864423]'
lr-x------ 1 root root 64 Jan  4 22:22 3 -> anon_inode:inotify
lr-x------ 1 root root 64 Jan  4 22:22 4 -> 'pipe:[1725869555]'
lrwx------ 1 root root 64 Jan  4 22:22 5 -> 'anon_inode:[eventpoll]'
lr-x------ 1 root root 64 Jan  4 22:22 6 -> 'pipe:[1725869547]'
l-wx------ 1 root root 64 Jan  4 22:22 7 -> 'pipe:[1725869547]'
l-wx------ 1 root root 64 Jan  4 22:22 8 -> 'pipe:[1725869555]'
l-wx------ 1 root root 64 Jan  4 22:22 9 -> /run/containerd/io.containerd.runtime.v2.task/k8s.io/90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8/log.json

from gvisor.

ayushr2 commented on August 28, 2024

Thanks for the debugging output. The sandbox process is holding FDs to the destroyed container's rootfs.

@avagin pointed out that the gofer client (pkg/sentry/fsimpl/gofer, which is part of the sandbox process) is not cleaning up the dentry tree when the filesystem is unmounted.

Historically, all file descriptors were owned by the gofer server (part of the gofer process), and when a client disconnected, the server would clean up all file descriptors and resources for that client. Cleaning up the dentry tree on the client side was therefore unnecessary, so we skipped it. However, with directfs the gofer client owns the required file descriptors for each dentry, so failing to clean up cached dentries in the tree causes this issue. I can send a fix.

ayushr2 commented on August 28, 2024

To confirm this theory, could you configure runsc with --directfs=false and see if this issue reproduces?

markusthoemmes commented on August 28, 2024

@ayushr2 this is running with directfs=false already.

ayushr2 commented on August 28, 2024

Ah, my bad, I missed that.

Even with directfs=false, the gofer donates a host file descriptor for regular files to the sandbox (so that the sandbox can directly read/write to the FD without making RPCs).

I suspect that's what is going on; all the open FDs shown above appear to be regular files.

ayushr2 commented on August 28, 2024

Can you check if this issue reproduces with --ref-leak-mode=panic (which should have the effect of forcing dentry cache cleanup)? Alternatively, you could also patch #9867 and try to repro.

markusthoemmes commented on August 28, 2024

Sadly, I don't have a good reproducer for this, so I can't easily confirm. I'll have to get a new version into the pipeline and let it sit for a while to see if the issue recurs.

What doesn't quite check out to me (though that might be down to my relative freshness with gVisor in Kubernetes) is that under the runsc-sandbox process, the actual processes of the container we've killed (i.e. the Python process) are still running, so intuitively the FDs seem correct to be there.

Does that still sound consistent with your theory?

ayushr2 commented on August 28, 2024

under the runsc-sandbox process, the actual processes of the container we've killed (i.e. the Python process) is still running

Sorry if I am reiterating what has already been discussed above; how are you observing that the containerized Python process is still running? AFAIK, all processes of all containers in a sandbox run in the sentry context. The sentry allocates a "task goroutine" for each application task, so observing these containerized processes from the host makes it look like the sandbox process itself is running the application code. The sentry does not create host sub-processes for application processes.

So how do you observe from the host that a certain Python process in the to-be-deleted container is still running? Or are you using application logs to confirm?

intuitively, I feel like the FDs are correct to be there.

The sentry maintains a "dentry cache", so the sentry can be holding these FDs even after the Python application has closed them. To confirm whether the Python process is really the one holding these FDs, you can run ls -l /proc/{python-pid}/fd from within the said container to see which files it actually has open. I doubt the Python application is concurrently holding all these *.pyc files open; this looks more like the doing of our dentry cache (which can hold up to 1000 leaf dentries).

markusthoemmes commented on August 28, 2024

Here's some more info on the container above. Note how the container is in the STOPPED state (and its runsc and runsc-gofer processes are gone) while the sandbox with all its subprocesses is still running. Anecdotally (not sure if this helps narrow things down or is something else entirely), I've also seen hangs where the container's runsc and runsc-gofer processes were not gone and containerd wasn't hanging on the unmount; instead it was hanging while sending KILL signals, with the processes apparently not budging.

The mix of these is what led me to pick the title of the issue 😅


Sandbox: 90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8
Container: 5b6ae772c71b57b0a00297b775a1900bcd30fcd9a6c8ccca2e49573829ae636f

$ ctr -n k8s.io t ls | grep 5b6ae772c71b57b0a00297b775a1900bcd30fcd9a6c8ccca2e49573829ae636f
5b6ae772c71b57b0a00297b775a1900bcd30fcd9a6c8ccca2e49573829ae636f    2392492    STOPPED
$ ctr -n k8s.io t ls | grep 90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8
90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8    2392492    RUNNING
root     2392474  0.0  0.1 1244020 17396 ?       Sl   Jan02   0:12 /usr/bin/containerd-shim-runsc-v1 -namespace k8s.io -address /run/containerd/containerd.sock -publish-binary /usr/bin/containerd
root     2392488  0.0  0.1 1261928 21736 ?       Ssl  Jan02   0:00  \_ runsc-gofer --systemd-cgroup=true --directfs=false --root=/run/containerd/runsc/k8s.io --log=/run/containerd/io.containerd.runtime.v2.task/k8s.io/90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8/log.json --log-format=json --panic-log=/var/log/pods/<redacted>/gvisor_panic.log --platform=systrap --log-fd=3 gofer --bundle=/run/containerd/io.containerd.runtime.v2.task/k8s.io/90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8 --gofer-mount-confs=lisafs:none,lisafs:none --io-fds=6,7 --mounts-fd=5 --spec-fd=4 --sync-nvproxy-fd=-1 --sync-userns-fd=-1 --proc-mount-sync-fd=14 --apply-caps=false --setup-root=false
nobody   2392492  2.1  0.4 3547332 77984 ?       Ssl  Jan02  90:55  \_ runsc-sandbox --platform=systrap --systemd-cgroup=true --directfs=false --root=/run/containerd/runsc/k8s.io --log=/run/containerd/io.containerd.runtime.v2.task/k8s.io/90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8/log.json --log-format=json --panic-log=/var/log/pods/<redacted>/gvisor_panic.log --log-fd=3 --panic-log-fd=4 boot --bundle=/run/containerd/io.containerd.runtime.v2.task/k8s.io/90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8 --controller-fd=10 --cpu-num=8 --dev-io-fd=-1 --gofer-mount-confs=lisafs:none,lisafs:none --io-fds=5,6 --mounts-fd=7 --setup-root=false --spec-fd=11 --start-sync-fd=8 --stdio-fds=12,13,14 --total-host-memory=16768503808 --total-memory=1073741824 --user-log-fd=9 --product-name=<redacted> --proc-mount-sync-fd=21 90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8
nobody   2392527  0.0  0.0  16520    56 ?        Ss   Jan02   0:00  |   \_ [exe]
nobody   2392528  0.0  0.0  16520    56 ?        S    Jan02   0:00  |       \_ [exe]
nobody   2392571  0.0  0.0  18316   140 ?        S    Jan02   0:00  |       \_ [exe]
nobody   2392572  0.0  0.0  18316   140 ?        SN   Jan02   0:00  |       |   \_ [exe]
nobody   2393703  0.0  0.0  16520   104 ?        S    Jan02   0:00  |       \_ [exe]
nobody   2393704  0.0  0.0  16520   104 ?        SN   Jan02   0:13  |       |   \_ [exe]
nobody   2393714  0.0  0.0  16520    60 ?        S    Jan02   0:00  |       \_ [exe]
nobody   2393715  0.0  0.0  16520    60 ?        SN   Jan02   0:46  |       |   \_ [exe]
nobody   2393716  0.0  0.0  16520    76 ?        S    Jan02   0:00  |       \_ [exe]
nobody   2393717  0.0  0.0  16520    76 ?        SN   Jan02   0:14  |       |   \_ [exe]
nobody   2393832  0.0  0.0  16520    60 ?        S    Jan02   0:00  |       \_ [exe]
nobody   2393834  0.0  0.0  16520    60 ?        SN   Jan02   0:00  |       |   \_ [exe]
nobody   2394049  0.0  0.0  16520    60 ?        S    Jan02   0:00  |       \_ [exe]
nobody   2394050  0.0  0.0  16520    60 ?        SN   Jan02   0:00  |       |   \_ [exe]
nobody   2394183  0.0  0.0  16520    60 ?        S    Jan02   0:00  |       \_ [exe]
nobody   2394184  0.0  0.0  16520    60 ?        SN   Jan02   0:00  |       |   \_ [exe]
nobody   2394185  0.0  0.0  16520    60 ?        S    Jan02   0:00  |       \_ [exe]
nobody   2394186  0.0  0.0  16520    60 ?        SN   Jan02   0:00  |       |   \_ [exe]
nobody   2394187  0.0  0.0  16520    60 ?        S    Jan02   0:00  |       \_ [exe]
nobody   2394188  0.0  0.0  16520    60 ?        SN   Jan02   0:00  |       |   \_ [exe]
nobody   2394189  0.0  0.0  16520    60 ?        S    Jan02   0:00  |       \_ [exe]
nobody   2394190  0.0  0.0  16520    60 ?        SN   Jan02   0:00  |           \_ [exe]
root     2392576  0.0  0.1 1253480 20360 ?       Sl   Jan02   0:00  \_ runsc --root=/run/containerd/runsc/k8s.io --log=/run/containerd/io.containerd.runtime.v2.task/k8s.io/90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8/log.json --log-format=json --panic-log=/var/log/pods/<redacted>/gvisor_panic.log --systemd-cgroup=true --platform=systrap --directfs=false wait 90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8

from gvisor.

markusthoemmes commented on August 28, 2024

To avoid cluttering this thread too much, here's the state of the "anecdotal" container I mentioned above: https://gist.github.com/markusthoemmes/7ebf064b44b1a182f552fbe1cc1b9150. I've tried to gather as much info as I could think of on that one as well.


ayushr2 commented on August 28, 2024

but the sandbox with all the subprocesses is still running

Just to clarify, the [exe] subprocesses you see under runsc-sandbox are not application processes; they are systrap stub processes. The application processes do not run as separate host processes, they run in the sandbox process context. As mentioned earlier, the sandbox allocates a goroutine for each application task. That goroutine keeps running until something interesting happens (the application makes a syscall, gets interrupted, page faults, etc.), at which point control transfers to the sentry, which handles the event and then hands control back to the application task goroutine so it can continue executing from the interrupted IP.
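As a rough mental model of that execution flow (this is not gVisor code; `event`, `runTasks`, and the "sentry" loop here are invented names purely for illustration): each task goroutine runs until it produces an event, blocks while a central handler deals with it, and then resumes.

```go
package main

import (
	"fmt"
	"sort"
	"sync"
)

// event stands in for "something interesting happened" in a task:
// a syscall, a page fault, an interrupt, etc.
type event struct {
	taskID int
	kind   string
	done   chan struct{}
}

// runTasks models numTasks application-task goroutines plus a single
// "sentry" loop that handles their events and resumes them. It returns
// the handled events in sorted order for easy inspection.
func runTasks(numTasks int) []string {
	events := make(chan event)
	var wg sync.WaitGroup
	for id := 1; id <= numTasks; id++ {
		wg.Add(1)
		go func(id int) {
			defer wg.Done()
			for _, kind := range []string{"syscall", "page-fault"} {
				ev := event{taskID: id, kind: kind, done: make(chan struct{})}
				events <- ev // control transfers to the "sentry"
				<-ev.done    // task resumes from the interrupted point
			}
		}(id)
	}
	go func() { wg.Wait(); close(events) }()

	var handled []string
	for ev := range events { // the "sentry": handle, then hand control back
		handled = append(handled, fmt.Sprintf("task %d: %s", ev.taskID, ev.kind))
		close(ev.done)
	}
	sort.Strings(handled)
	return handled
}

func main() {
	for _, h := range runTasks(2) {
		fmt.Println(h)
	}
}
```

The point being: the application "processes" are just goroutines inside one host process, which is why you only ever see stubs on the host side.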

@avagin and I had a chat, some updates:

  • --ref-leak-mode=panic may lead to some random, unrelated panics due to ref leaks that Andrei is fixing.
  • This is likely not a dentry cache issue (as I hypothesized earlier). That dentry cache issue only exists with directfs. When directfs is turned off, all the donated FDs are released here (without destroying the dentry tree), so --directfs=false should not be leaking FDs via the dentry cache.
  • We suspect that the filesystem Release() itself is not being called due to a leaked reference on the filesystem instance. Given that this bug is not consistently reproducible, we suspect the leak stems from a race condition.
  • We should add some logs around filesystem release / mount namespace destruction to confirm whether the filesystem is being released or not.
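To illustrate the kind of instrumentation the last point means, here's a toy sketch (loggedRefs, IncRef, DecRef, and releaseFn are invented names, not gVisor's actual refcount API): a refcount that logs when the last reference drops, so a missing log line would tell us Release() is never being reached.

```go
package main

import (
	"fmt"
	"sync/atomic"
)

// loggedRefs is a toy refcount that logs on release. All names here are
// illustrative, not gVisor's real types.
type loggedRefs struct {
	refs      int64
	name      string
	releaseFn func()
}

// newLoggedRefs starts with a single held reference, like most refcounted
// objects do at construction time.
func newLoggedRefs(name string, releaseFn func()) *loggedRefs {
	return &loggedRefs{refs: 1, name: name, releaseFn: releaseFn}
}

func (r *loggedRefs) IncRef() { atomic.AddInt64(&r.refs, 1) }

// DecRef logs and runs the release callback only when the last
// reference is dropped; a leaked reference means this log never appears.
func (r *loggedRefs) DecRef() {
	if atomic.AddInt64(&r.refs, -1) == 0 {
		fmt.Printf("releasing %s\n", r.name) // the log line we'd want to see
		r.releaseFn()
	}
}

func main() {
	r := newLoggedRefs("filesystem", func() { fmt.Println("Release() ran") })
	r.IncRef()
	r.DecRef() // one reference still held, nothing logged
	r.DecRef() // last reference dropped: logs, then calls releaseFn
}
```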


markusthoemmes commented on August 28, 2024

Thanks @ayushr2 and @avagin for taking a look, it's greatly appreciated! Would the new hypothesis also explain the pod getting stuck as described in #9834 (comment)? It seems like we're seeing both of these symptoms at the same time, so they might be correlated or might be completely unrelated 😅 .


markusthoemmes commented on August 28, 2024

Today I compared the shutdown behavior of "normally behaving" pods against hanging pods to poke at this some more. Interestingly, the line missing in the hanging case is "shim disconnected", and sure enough, there's one leaked shim on this respective node! Does this support the cause you've been speculating about above, or does it potentially point at a containerd issue?

EDIT: Killing the shim in this case does not unwedge the situation.

normal pod
Jan 10 13:25:46 time="2024-01-10T13:25:46.990027836Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297 pid=76821
Jan 10 13:25:47 time="2024-01-10T13:25:47.336796129Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:web-6467d599b5-bhfhg,Uid:3fbebc13-0b01-4c12-8572-27e50732610c,Namespace:app-c144ed07-7744-41b0-a252-7ecbfee6c762,Attempt:0,} returns sandbox id \"2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297\""
Jan 10 13:25:47 time="2024-01-10T13:25:47.343437719Z" level=info msg="CreateContainer within sandbox \"2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297\" for container &ContainerMetadata{Name:tcp-sack-disable,Attempt:0,}"
Jan 10 13:25:47 time="2024-01-10T13:25:47.443995134Z" level=info msg="CreateContainer within sandbox \"2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297\" for &ContainerMetadata{Name:tcp-sack-disable,Attempt:0,} returns container id \"2935d81e5623df612628e910824ce6221bb6fd12e0a44d76ff8691fa41e6329d\""
Jan 10 13:25:48 time="2024-01-10T13:25:48.328420249Z" level=info msg="CreateContainer within sandbox \"2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297\" for container &ContainerMetadata{Name:web,Attempt:0,}"
Jan 10 13:25:48 time="2024-01-10T13:25:48.416263536Z" level=info msg="CreateContainer within sandbox \"2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297\" for &ContainerMetadata{Name:web,Attempt:0,} returns container id \"20c6f962ad6bfd62eea81aa2536845631260d37ea2e999e21af0c00813a96485\""
Jan 10 13:25:48 time="2024-01-10T13:25:48.417194448Z" level=info msg="StartContainer for \"20c6f962ad6bfd62eea81aa2536845631260d37ea2e999e21af0c00813a96485\""
Jan 10 13:25:48 time="2024-01-10T13:25:48.486346165Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/20c6f962ad6bfd62eea81aa2536845631260d37ea2e999e21af0c00813a96485 pid=77015
Jan 10 13:25:48 time="2024-01-10T13:25:48.734819784Z" level=info msg="StartContainer for \"20c6f962ad6bfd62eea81aa2536845631260d37ea2e999e21af0c00813a96485\" returns successfully"
Jan 10 13:26:07 time="2024-01-10T13:26:07.727660023Z" level=info msg="StopContainer for \"20c6f962ad6bfd62eea81aa2536845631260d37ea2e999e21af0c00813a96485\" with timeout 30 (s)"
Jan 10 13:26:07 time="2024-01-10T13:26:07.754940615Z" level=info msg="Stop container \"20c6f962ad6bfd62eea81aa2536845631260d37ea2e999e21af0c00813a96485\" with signal terminated"
Jan 10 13:26:08 time="2024-01-10T13:26:08.074980044Z" level=info msg="shim disconnected" id=20c6f962ad6bfd62eea81aa2536845631260d37ea2e999e21af0c00813a96485 namespace=k8s.io
Jan 10 13:26:08 time="2024-01-10T13:26:08.075087976Z" level=warning msg="cleaning up after shim disconnected" id=20c6f962ad6bfd62eea81aa2536845631260d37ea2e999e21af0c00813a96485 namespace=k8s.io
Jan 10 13:26:08 time="2024-01-10T13:26:08.132746796Z" level=info msg="StopContainer for \"20c6f962ad6bfd62eea81aa2536845631260d37ea2e999e21af0c00813a96485\" returns successfully"
Jan 10 13:26:08 time="2024-01-10T13:26:08.133860348Z" level=info msg="StopPodSandbox for \"2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297\""
Jan 10 13:26:08 time="2024-01-10T13:26:08.134509591Z" level=info msg="Container to stop \"20c6f962ad6bfd62eea81aa2536845631260d37ea2e999e21af0c00813a96485\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Jan 10 13:26:08 time="2024-01-10T13:26:08.385464748Z" level=warning msg="Ignoring error killing container \"2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297\": \"runsc\" did not terminate successfully: sandbox is not running\n"
Jan 10 13:26:08 time="2024-01-10T13:26:08.429921787Z" level=warning msg="Ignoring error killing container \"2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297\": \"runsc\" did not terminate successfully: sandbox is not running\n"
Jan 10 13:26:08 time="2024-01-10T13:26:08.491732736Z" level=info msg="shim disconnected" id=2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297 namespace=k8s.io
Jan 10 13:26:08 time="2024-01-10T13:26:08.491844148Z" level=warning msg="cleaning up after shim disconnected" id=2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297 namespace=k8s.io
Jan 10 13:26:08 time="2024-01-10T13:26:08.716084407Z" level=info msg="TearDown network for sandbox \"2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297\" successfully"
Jan 10 13:26:08 time="2024-01-10T13:26:08.716163793Z" level=info msg="StopPodSandbox for \"2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297\" returns successfully"
Jan 10 13:26:09 time="2024-01-10T13:26:09.493819752Z" level=info msg="RemoveContainer for \"20c6f962ad6bfd62eea81aa2536845631260d37ea2e999e21af0c00813a96485\""
Jan 10 13:26:09 time="2024-01-10T13:26:09.507483329Z" level=info msg="RemoveContainer for \"20c6f962ad6bfd62eea81aa2536845631260d37ea2e999e21af0c00813a96485\" returns successfully"
Jan 10 13:26:35 time="2024-01-10T13:26:35.902586939Z" level=info msg="StopPodSandbox for \"2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297\""
Jan 10 13:26:35 time="2024-01-10T13:26:35.985525705Z" level=info msg="TearDown network for sandbox \"2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297\" successfully"
Jan 10 13:26:35 time="2024-01-10T13:26:35.985689080Z" level=info msg="StopPodSandbox for \"2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297\" returns successfully"
Jan 10 13:26:35 time="2024-01-10T13:26:35.986557729Z" level=info msg="RemovePodSandbox for \"2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297\""
Jan 10 13:26:35 time="2024-01-10T13:26:35.986805068Z" level=info msg="Forcibly stopping sandbox \"2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297\""
Jan 10 13:26:36 time="2024-01-10T13:26:36.083872934Z" level=info msg="TearDown network for sandbox \"2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297\" successfully"
Jan 10 13:26:36 time="2024-01-10T13:26:36.095668006Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 10 13:26:36 time="2024-01-10T13:26:36.095893143Z" level=info msg="RemovePodSandbox \"2b3441530df5090a30594459102fdaacd11c23030f10de327e3286fbe2bc9297\" returns successfully"
hanging pod
Dec 16 22:06:31 time="2023-12-16T22:06:31.694963790Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/1a2642c3a4ffb0f8e24d51979e1ddc8ebb6109d8c48b952242ec86583c8bd99e pid=3673804
Dec 16 22:06:32 time="2023-12-16T22:06:32.471636062Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:<redacted>-785b769988-dxxz5,Uid:aaea1146-81d8-4be2-a607-5eaf664f5d23,Namespace:app-b0b43bde-860d-404b-81ad-91c8048ad2d4,Attempt:0,} returns sandbox id \"1a2642c3a4ffb0f8e24d51979e1ddc8ebb6109d8c48b952242ec86583c8bd99e\""
Dec 16 22:06:46 time="2023-12-16T22:06:46.326258542Z" level=info msg="CreateContainer within sandbox \"1a2642c3a4ffb0f8e24d51979e1ddc8ebb6109d8c48b952242ec86583c8bd99e\" for container &ContainerMetadata{Name:<redacted>,Attempt:0,}"
Dec 16 22:06:46 time="2023-12-16T22:06:46.382285753Z" level=info msg="CreateContainer within sandbox \"1a2642c3a4ffb0f8e24d51979e1ddc8ebb6109d8c48b952242ec86583c8bd99e\" for &ContainerMetadata{Name:<redacted>,Attempt:0,} returns container id \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\""
Dec 16 22:06:46 time="2023-12-16T22:06:46.383382537Z" level=info msg="StartContainer for \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\""
Dec 16 22:06:46 time="2023-12-16T22:06:46.450940918Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654 pid=3674058
Dec 16 22:06:46 time="2023-12-16T22:06:46.769695454Z" level=info msg="StartContainer for \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" returns successfully"
Jan 05 17:32:52 time="2024-01-05T17:32:52.252409222Z" level=info msg="StopContainer for \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" with timeout 30 (s)"
Jan 05 17:32:52 time="2024-01-05T17:32:52.295471813Z" level=info msg="Stop container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" with signal terminated"
Jan 05 17:32:55 time="2024-01-05T17:32:55.160512115Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 17:32:55 time="2024-01-05T17:32:55.160856356Z" level=error msg="failed to handle container TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}" error="failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 17:32:56 time="2024-01-05T17:32:56.729668286Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 17:32:56 time="2024-01-05T17:32:56.770274706Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 17:32:59 time="2024-01-05T17:32:59.310886184Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 17:32:59 time="2024-01-05T17:32:59.311230290Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 17:33:01 time="2024-01-05T17:33:01.730574024Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 17:33:01 time="2024-01-05T17:33:01.774301890Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 17:33:04 time="2024-01-05T17:33:04.332275380Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 17:33:04 time="2024-01-05T17:33:04.332605875Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 17:33:08 time="2024-01-05T17:33:08.730629387Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 17:33:08 time="2024-01-05T17:33:08.766274282Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 17:33:11 time="2024-01-05T17:33:11.299057549Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 17:33:11 time="2024-01-05T17:33:11.299326517Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 17:33:19 time="2024-01-05T17:33:19.729993251Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 17:33:19 time="2024-01-05T17:33:19.757108867Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 17:33:22 time="2024-01-05T17:33:22.299294820Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 17:33:22 time="2024-01-05T17:33:22.299593708Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 17:33:22 time="2024-01-05T17:33:22.378464616Z" level=info msg="Kill container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\""
Jan 05 17:33:38 time="2024-01-05T17:33:38.730174869Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 17:33:38 time="2024-01-05T17:33:38.773274415Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 17:33:41 time="2024-01-05T17:33:41.323214832Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 17:33:41 time="2024-01-05T17:33:41.323537227Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 17:34:13 time="2024-01-05T17:34:13.730166629Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 17:34:13 time="2024-01-05T17:34:13.785287279Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 17:34:16 time="2024-01-05T17:34:16.343265907Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 17:34:16 time="2024-01-05T17:34:16.343540874Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 17:35:20 time="2024-01-05T17:35:20.730572316Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 17:35:20 time="2024-01-05T17:35:20.766965653Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 17:35:23 time="2024-01-05T17:35:23.311337226Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 17:35:23 time="2024-01-05T17:35:23.311654494Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 17:37:31 time="2024-01-05T17:37:31.730522120Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 17:37:31 time="2024-01-05T17:37:31.765935172Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 17:37:34 time="2024-01-05T17:37:34.307214039Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 17:37:34 time="2024-01-05T17:37:34.307557371Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 17:41:50 time="2024-01-05T17:41:50.730533261Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 17:41:50 time="2024-01-05T17:41:50.777031189Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 17:41:53 time="2024-01-05T17:41:53.323929164Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 17:41:53 time="2024-01-05T17:41:53.324245891Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 17:46:53 time="2024-01-05T17:46:53.729665501Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 17:46:53 time="2024-01-05T17:46:53.761647256Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 17:46:56 time="2024-01-05T17:46:56.310856527Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 17:46:56 time="2024-01-05T17:46:56.311164365Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 17:48:22 time="2024-01-05T17:48:22.252997700Z" level=error msg="StopContainer for \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" failed" error="rpc error: code = Canceled desc = an error occurs during waiting for container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" to be killed: wait container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": context canceled"
Jan 05 17:48:22 time="2024-01-05T17:48:22.253433053Z" level=info msg="StopPodSandbox for \"1a2642c3a4ffb0f8e24d51979e1ddc8ebb6109d8c48b952242ec86583c8bd99e\""
Jan 05 17:48:22 time="2024-01-05T17:48:22.253919924Z" level=info msg="Kill container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\""
Jan 05 17:51:56 time="2024-01-05T17:51:56.730445189Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 17:51:56 time="2024-01-05T17:51:56.762112322Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 17:51:59 time="2024-01-05T17:51:59.311656664Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 17:51:59 time="2024-01-05T17:51:59.311992214Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 17:56:59 time="2024-01-05T17:56:59.729840744Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 17:56:59 time="2024-01-05T17:56:59.770248899Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 17:57:02 time="2024-01-05T17:57:02.322940190Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 17:57:02 time="2024-01-05T17:57:02.323243004Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 18:02:02 time="2024-01-05T18:02:02.733415051Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 18:02:02 time="2024-01-05T18:02:02.769677406Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 18:02:05 time="2024-01-05T18:02:05.306154539Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 18:02:05 time="2024-01-05T18:02:05.306455341Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 18:03:22 time="2024-01-05T18:03:22.253450510Z" level=error msg="StopPodSandbox for \"1a2642c3a4ffb0f8e24d51979e1ddc8ebb6109d8c48b952242ec86583c8bd99e\" failed" error="rpc error: code = DeadlineExceeded desc = failed to stop container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": an error occurs during waiting for container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" to be killed: wait container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": context deadline exceeded"
Jan 05 18:03:22 time="2024-01-05T18:03:22.421075937Z" level=info msg="StopContainer for \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" with timeout 30 (s)"
Jan 05 18:03:22 time="2024-01-05T18:03:22.421568699Z" level=info msg="Skipping the sending of signal terminated to container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" because a prior stop with timeout>0 request already sent the signal"
Jan 05 18:03:52 time="2024-01-05T18:03:52.422075472Z" level=info msg="Kill container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\""
Jan 05 18:07:05 time="2024-01-05T18:07:05.729853592Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 18:07:05 time="2024-01-05T18:07:05.765661493Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 18:07:08 time="2024-01-05T18:07:08.309677808Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 18:07:08 time="2024-01-05T18:07:08.310468093Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 18:12:08 time="2024-01-05T18:12:08.729708388Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 18:12:08 time="2024-01-05T18:12:08.765210416Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 18:12:11 time="2024-01-05T18:12:11.307836905Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 18:12:11 time="2024-01-05T18:12:11.308110309Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 18:17:11 time="2024-01-05T18:17:11.730439246Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 18:17:11 time="2024-01-05T18:17:11.761119608Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 18:17:14 time="2024-01-05T18:17:14.314829398Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 18:17:14 time="2024-01-05T18:17:14.315176288Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 18:18:52 time="2024-01-05T18:18:52.421382140Z" level=error msg="StopContainer for \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" failed" error="rpc error: code = Canceled desc = an error occurs during waiting for container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" to be killed: wait container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": context canceled"
Jan 05 18:18:52 time="2024-01-05T18:18:52.421634607Z" level=info msg="StopPodSandbox for \"1a2642c3a4ffb0f8e24d51979e1ddc8ebb6109d8c48b952242ec86583c8bd99e\""
Jan 05 18:18:52 time="2024-01-05T18:18:52.422122527Z" level=info msg="Kill container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\""
Jan 05 18:22:14 time="2024-01-05T18:22:14.729702990Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 18:22:14 time="2024-01-05T18:22:14.771433591Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 18:22:17 time="2024-01-05T18:22:17.304346336Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 18:22:17 time="2024-01-05T18:22:17.304887376Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 18:27:17 time="2024-01-05T18:27:17.734263260Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 18:27:17 time="2024-01-05T18:27:17.769513050Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 18:27:20 time="2024-01-05T18:27:20.306682270Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 18:27:20 time="2024-01-05T18:27:20.307404351Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 18:32:20 time="2024-01-05T18:32:20.730165337Z" level=info msg="TaskExit event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248}"
Jan 05 18:32:20 time="2024-01-05T18:32:20.785354426Z" level=warning msg="Ignoring error killing container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": \"runsc\" did not terminate successfully: loading container: file does not exist\n"
Jan 05 18:32:23 time="2024-01-05T18:32:23.344511776Z" level=warning msg="failed to cleanup rootfs mount" error="failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy" runtime=io.containerd.runsc.v1
Jan 05 18:32:23 time="2024-01-05T18:32:23.344861924Z" level=error msg="Failed to handle backOff event container_id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" id:\"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" pid:3673819 exit_status:143 exited_at:{seconds:1704475972 nanos:453458248} for b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654/rootfs: device or resource busy: unknown"
Jan 05 18:33:52 time="2024-01-05T18:33:52.421774753Z" level=error msg="StopPodSandbox for \"1a2642c3a4ffb0f8e24d51979e1ddc8ebb6109d8c48b952242ec86583c8bd99e\" failed" error="rpc error: code = DeadlineExceeded desc = failed to stop container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": an error occurs during waiting for container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" to be killed: wait container \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\": context deadline exceeded"
Jan 05 18:33:52 time="2024-01-05T18:33:52.806993016Z" level=info msg="StopContainer for \"b8206ab1e1639d030523c7057b398796cc6b9e4c16ba514157fd8fa228160654\" with timeout 30 (s)"
...
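For anyone else chasing these, the `device or resource busy` unmount failures can be triaged with something like the following. This is a sketch, not anything from containerd itself: `ROOTFS` is a placeholder for the `/run/containerd/io.containerd.runtime.v2.task/k8s.io/<id>/rootfs` path from the error message.

```shell
# Sketch: triage a "failed to unmount ... device or resource busy" error.
# ROOTFS is illustrative; on an affected node, substitute the rootfs path
# from the "failed to cleanup rootfs mount" log line.
ROOTFS="${ROOTFS:-/tmp/rootfs-triage-demo}"
mkdir -p "$ROOTFS"

# 1. Is the kernel still tracking the path as a mount in this namespace?
grep -F "$ROOTFS" /proc/self/mountinfo || echo "not mounted here"

# 2. Do any processes hold files open beneath it? (fuser may not be
#    installed everywhere; `lsof +D "$ROOTFS"` is an alternative.)
if command -v fuser >/dev/null; then
  fuser -vm "$ROOTFS" 2>&1 || true
fi

echo "triage done"
```

One caveat: a mount that is only pinned inside another process's mount namespace (e.g. the shim's own) won't show up via `fuser` in the host namespace, which would be consistent with the shim itself keeping the rootfs busy here.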



markusthoemmes commented on August 28, 2024

Aaaaand one more datapoint in my quest to find the nugget of info that I'm looking for: The goroutine dump of the "leaked" shim:
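For anyone wanting to grab the same data from their own nodes: a dump like this is just the Go runtime's default SIGQUIT stack dump, so it can be captured from a live shim without restarting anything. Roughly (the binary name `containerd-shim-runsc-v1` is an assumption; adjust for your runtime):

```shell
# Sketch: send SIGQUIT to the running shim. The Go runtime then prints
# "SIGQUIT: quit" followed by one stack per goroutine to the shim's
# stderr, which ends up in the containerd unit's journal.
PID="$( (command -v pidof >/dev/null && pidof -s containerd-shim-runsc-v1) || true )"
if [ -n "$PID" ]; then
  kill -QUIT "$PID"
else
  echo "no running shim found"
fi
```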

goroutine dump of leaked shim
SIGQUIT: quit
PC=0x46df01 m=0 sigcode=0

goroutine 0 [idle]:
runtime.futex()
        src/runtime/sys_linux_amd64.s:557 +0x21 fp=0x7fff6bfe6770 sp=0x7fff6bfe6768 pc=0x46df01
runtime.futexsleep(0x7fff6bfe67e8?, 0x440396?, 0x7fff6bfe67e8?)
        GOROOT/src/runtime/os_linux.go:69 +0x30 fp=0x7fff6bfe67c0 sp=0x7fff6bfe6770 pc=0x4355d0
runtime.notesleep(0x158b9c8)
        GOROOT/src/runtime/lock_futex.go:160 +0x87 fp=0x7fff6bfe67f8 sp=0x7fff6bfe67c0 pc=0x40ea47
runtime.mPark(...)
        GOROOT/src/runtime/proc.go:1632
runtime.stoplockedm()
        GOROOT/src/runtime/proc.go:2780 +0x73 fp=0x7fff6bfe6850 sp=0x7fff6bfe67f8 pc=0x440573
runtime.schedule()
        GOROOT/src/runtime/proc.go:3561 +0x3a fp=0x7fff6bfe6888 sp=0x7fff6bfe6850 pc=0x4423ba
runtime.park_m(0xc000007d40?)
        GOROOT/src/runtime/proc.go:3745 +0x11f fp=0x7fff6bfe68d0 sp=0x7fff6bfe6888 pc=0x44293f
runtime.mcall()
        src/runtime/asm_amd64.s:458 +0x4e fp=0x7fff6bfe68e8 sp=0x7fff6bfe68d0 pc=0x46a26e

goroutine 1 [select, 2 minutes]:
runtime.gopark(0xc00006f678?, 0x2?, 0x1?, 0x0?, 0xc00006f61c?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc00006f4b8 sp=0xc00006f498 pc=0x43bf4e
runtime.selectgo(0xc00006f678, 0xc00006f618, 0x0?, 0x0, 0x2?, 0x1)
        GOROOT/src/runtime/select.go:327 +0x725 fp=0xc00006f5d8 sp=0xc00006f4b8 pc=0x44bac5
github.com/containerd/containerd/runtime/v2/shim.handleSignals({0xed8d40, 0xc00008aa00}, 0xdb4926?, 0xc00016c600)
        external/com_github_containerd_containerd/runtime/v2/shim/shim_unix.go:77 +0xf8 fp=0xc00006f7e8 sp=0xc00006f5d8 pc=0x724718
github.com/containerd/containerd/runtime/v2/shim.(*Client).Serve(0xc00006fbb8)
        external/com_github_containerd_containerd/runtime/v2/shim/shim.go:298 +0x356 fp=0xc00006f9b8 sp=0xc00006f7e8 pc=0x723d96
github.com/containerd/containerd/runtime/v2/shim.run({0xdc00db, 0x16}, 0xdeb800, {0xf0?, 0xc6?, 0x5?})
        external/com_github_containerd_containerd/runtime/v2/shim/shim.go:237 +0xb2d fp=0xc00006fe70 sp=0xc00006f9b8 pc=0x72360d
github.com/containerd/containerd/runtime/v2/shim.Run({0xdc00db, 0x16}, 0xc0000400b8?, {0x0, 0x0, 0xc0000061a0?})
        external/com_github_containerd_containerd/runtime/v2/shim/shim.go:153 +0xab fp=0xc00006ff00 sp=0xc00006fe70 pc=0x7229eb
gvisor.dev/gvisor/shim/cli.Main(...)
        shim/cli/cli.go:27
main.main()
        shim/main.go:23 +0x2e fp=0xc00006ff40 sp=0xc00006ff00 pc=0xba34ee
runtime.main()
        GOROOT/src/runtime/proc.go:267 +0x2bb fp=0xc00006ffe0 sp=0xc00006ff40 pc=0x43bafb
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc00006ffe8 sp=0xc00006ffe0 pc=0x46c0e1

goroutine 2 [force gc (idle), 7 minutes]:
runtime.gopark(0x8f49138fbd4de?, 0x0?, 0x0?, 0x0?, 0x0?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc00005cfa8 sp=0xc00005cf88 pc=0x43bf4e
runtime.goparkunlock(...)
        GOROOT/src/runtime/proc.go:404
runtime.forcegchelper()
        GOROOT/src/runtime/proc.go:322 +0xb3 fp=0xc00005cfe0 sp=0xc00005cfa8 pc=0x43bdd3
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc00005cfe8 sp=0xc00005cfe0 pc=0x46c0e1
created by runtime.init.6 in goroutine 1
        GOROOT/src/runtime/proc.go:310 +0x1a

goroutine 3 [GC sweep wait]:
runtime.gopark(0x1?, 0x0?, 0x0?, 0x0?, 0x0?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc00005d778 sp=0xc00005d758 pc=0x43bf4e
runtime.goparkunlock(...)
        GOROOT/src/runtime/proc.go:404
runtime.bgsweep(0x0?)
        GOROOT/src/runtime/mgcsweep.go:321 +0xdf fp=0xc00005d7c8 sp=0xc00005d778 pc=0x427e5f
runtime.gcenable.func1()
        GOROOT/src/runtime/mgc.go:200 +0x25 fp=0xc00005d7e0 sp=0xc00005d7c8 pc=0x41cf05
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc00005d7e8 sp=0xc00005d7e0 pc=0x46c0e1
created by runtime.gcenable in goroutine 1
        GOROOT/src/runtime/mgc.go:200 +0x66

goroutine 4 [GC scavenge wait]:
runtime.gopark(0x158adc0?, 0xec7500?, 0x0?, 0x0?, 0x0?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc00005df70 sp=0xc00005df50 pc=0x43bf4e
runtime.goparkunlock(...)
        GOROOT/src/runtime/proc.go:404
runtime.(*scavengerState).park(0x158adc0)
        GOROOT/src/runtime/mgcscavenge.go:425 +0x49 fp=0xc00005dfa0 sp=0xc00005df70 pc=0x4256a9
runtime.bgscavenge(0x0?)
        GOROOT/src/runtime/mgcscavenge.go:658 +0x59 fp=0xc00005dfc8 sp=0xc00005dfa0 pc=0x425c59
runtime.gcenable.func2()
        GOROOT/src/runtime/mgc.go:201 +0x25 fp=0xc00005dfe0 sp=0xc00005dfc8 pc=0x41cea5
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc00005dfe8 sp=0xc00005dfe0 pc=0x46c0e1
created by runtime.gcenable in goroutine 1
        GOROOT/src/runtime/mgc.go:201 +0xa5

goroutine 5 [finalizer wait, 1 minutes]:
runtime.gopark(0x0?, 0xdeb990?, 0x0?, 0xc0?, 0x2000000020?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc0001aee28 sp=0xc0001aee08 pc=0x43bf4e
runtime.runfinq()
        GOROOT/src/runtime/mfinal.go:193 +0x107 fp=0xc0001aefe0 sp=0xc0001aee28 pc=0x41bf07
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0001aefe8 sp=0xc0001aefe0 pc=0x46c0e1
created by runtime.createfing in goroutine 1
        GOROOT/src/runtime/mfinal.go:163 +0x3d

goroutine 6 [chan receive]:
runtime.gopark(0x15a53e0?, 0x2?, 0x40?, 0xe7?, 0x431ccb?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc00005e700 sp=0xc00005e6e0 pc=0x43bf4e
runtime.chanrecv(0xc000188000, 0xc00005e7b8, 0x1)
        GOROOT/src/runtime/chan.go:583 +0x3cd fp=0xc00005e778 sp=0xc00005e700 pc=0x409a0d
runtime.chanrecv2(0x6fc23ac00?, 0x0?)
        GOROOT/src/runtime/chan.go:447 +0x12 fp=0xc00005e7a0 sp=0xc00005e778 pc=0x409632
github.com/containerd/containerd/runtime/v2/shim.setRuntime.func1()
        external/com_github_containerd_containerd/runtime/v2/shim/shim.go:120 +0x5a fp=0xc00005e7e0 sp=0xc00005e7a0 pc=0x7256da
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc00005e7e8 sp=0xc00005e7e0 pc=0x46c0e1
created by github.com/containerd/containerd/runtime/v2/shim.setRuntime in goroutine 1
        external/com_github_containerd_containerd/runtime/v2/shim/shim.go:119 +0x25

goroutine 7 [select, 33307 minutes, locked to thread]:
runtime.gopark(0xc00005efa8?, 0x2?, 0x60?, 0xee?, 0xc00005efa4?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc00005ee38 sp=0xc00005ee18 pc=0x43bf4e
runtime.selectgo(0xc00005efa8, 0xc00005efa0, 0x0?, 0x0, 0x0?, 0x1)
        GOROOT/src/runtime/select.go:327 +0x725 fp=0xc00005ef58 sp=0xc00005ee38 pc=0x44bac5
runtime.ensureSigM.func1()
        GOROOT/src/runtime/signal_unix.go:1014 +0x19f fp=0xc00005efe0 sp=0xc00005ef58 pc=0x462f9f
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc00005efe8 sp=0xc00005efe0 pc=0x46c0e1
created by runtime.ensureSigM in goroutine 1
        GOROOT/src/runtime/signal_unix.go:997 +0xc8

goroutine 8 [syscall, 1 minutes]:
runtime.notetsleepg(0xffffffffffffffff?, 0xc00005f728?)
        GOROOT/src/runtime/lock_futex.go:236 +0x29 fp=0xc00005f7a0 sp=0xc00005f768 pc=0x40ed29
os/signal.signal_recv()
        GOROOT/src/runtime/sigqueue.go:152 +0x29 fp=0xc00005f7c0 sp=0xc00005f7a0 pc=0x4686e9
os/signal.loop()
        GOROOT/src/os/signal/signal_unix.go:23 +0x13 fp=0xc00005f7e0 sp=0xc00005f7c0 pc=0x719c53
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc00005f7e8 sp=0xc00005f7e0 pc=0x46c0e1
created by os/signal.Notify.func1.1 in goroutine 1
        GOROOT/src/os/signal/signal.go:151 +0x1f

goroutine 9 [chan receive, 33307 minutes]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc00005fea8 sp=0xc00005fe88 pc=0x43bf4e
runtime.chanrecv(0xc00016c720, 0xc00005ff90, 0x1)
        GOROOT/src/runtime/chan.go:583 +0x3cd fp=0xc00005ff20 sp=0xc00005fea8 pc=0x409a0d
runtime.chanrecv2(0x0?, 0x0?)
        GOROOT/src/runtime/chan.go:447 +0x12 fp=0xc00005ff48 sp=0xc00005ff20 pc=0x409632
github.com/containerd/containerd/runtime/v2/shim.(*RemoteEventsPublisher).processQueue(0xc0001701e0)
        external/com_github_containerd_containerd/runtime/v2/shim/publisher.go:80 +0x45 fp=0xc00005ffc8 sp=0xc00005ff48 pc=0x721ce5
github.com/containerd/containerd/runtime/v2/shim.NewPublisher.func2()
        external/com_github_containerd_containerd/runtime/v2/shim/publisher.go:56 +0x25 fp=0xc00005ffe0 sp=0xc00005ffc8 pc=0x721ac5
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc00005ffe8 sp=0xc00005ffe0 pc=0x46c0e1
created by github.com/containerd/containerd/runtime/v2/shim.NewPublisher in goroutine 1
        external/com_github_containerd_containerd/runtime/v2/shim/publisher.go:56 +0x15b

goroutine 10 [select, 33307 minutes]:
runtime.gopark(0xc00008cdf8?, 0x2?, 0x0?, 0x0?, 0xc00008cdac?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc00008cc48 sp=0xc00008cc28 pc=0x43bf4e
runtime.selectgo(0xc00008cdf8, 0xc00008cda8, 0x0?, 0x0, 0x0?, 0x1)
        GOROOT/src/runtime/select.go:327 +0x725 fp=0xc00008cd68 sp=0xc00008cc48 pc=0x44bac5
gvisor.dev/gvisor/pkg/shim.(*watcherV2).run(0xc000011410, {0xed8d40, 0xc00008aa00})
        pkg/shim/oom_v2.go:62 +0x130 fp=0xc00008cfb8 sp=0xc00008cd68 pc=0xb992d0
gvisor.dev/gvisor/pkg/shim.New.func1()
        pkg/shim/service.go:112 +0x2d fp=0xc00008cfe0 sp=0xc00008cfb8 pc=0xb9a12d
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc00008cfe8 sp=0xc00008cfe0 pc=0x46c0e1
created by gvisor.dev/gvisor/pkg/shim.New in goroutine 1
        pkg/shim/service.go:112 +0x1fe

goroutine 11 [chan receive, 66 minutes]:
runtime.gopark(0xc0000a8900?, 0xed8d40?, 0x0?, 0xaa?, 0x0?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc0001afe58 sp=0xc0001afe38 pc=0x43bf4e
runtime.chanrecv(0xc00016c5a0, 0xc000025f78, 0x1)
        GOROOT/src/runtime/chan.go:583 +0x3cd fp=0xc0001afed0 sp=0xc0001afe58 pc=0x409a0d
runtime.chanrecv2(0xc0000a8900?, 0xed8d40?)
        GOROOT/src/runtime/chan.go:447 +0x12 fp=0xc0001afef8 sp=0xc0001afed0 pc=0x409632
gvisor.dev/gvisor/pkg/shim.(*service).processExits(0xc0000a8900, {0xed8d40, 0xc00008aa00})
        pkg/shim/service.go:958 +0xf0 fp=0xc0001affb8 sp=0xc0001afef8 pc=0xba0b10
gvisor.dev/gvisor/pkg/shim.New.func2()
        pkg/shim/service.go:122 +0x28 fp=0xc0001affe0 sp=0xc0001affb8 pc=0xb9a0c8
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0001affe8 sp=0xc0001affe0 pc=0x46c0e1
created by gvisor.dev/gvisor/pkg/shim.New in goroutine 1
        pkg/shim/service.go:122 +0x399

goroutine 12 [syscall, 7 minutes]:
syscall.Syscall6(0x46a2d2?, 0xc00008d8e8?, 0x44c865?, 0xc00008d8d0?, 0x44c8a0?, 0xc000007a00?, 0x4?)
        GOROOT/src/syscall/syscall_linux.go:91 +0x30 fp=0xc00008d8b0 sp=0xc00008d828 pc=0x4837b0
syscall.Syscall6(0xe8, 0x4, 0xc00008d9b0, 0x80, 0xffffffffffffffff, 0x0, 0x0)
        <autogenerated>:1 +0x3d fp=0xc00008d8f8 sp=0xc00008d8b0 pc=0x4841dd
golang.org/x/sys/unix.EpollWait(0xc00024ab70?, {0xc00008d9b0?, 0x0?, 0x0?}, 0x0?)
        external/org_golang_x_sys/unix/zsyscall_linux_amd64.go:56 +0x4f fp=0xc00008d968 sp=0xc00008d8f8 pc=0x66336f
github.com/containerd/console.(*Epoller).Wait(0xc000170420)
        external/com_github_containerd_console/console_linux.go:111 +0x5d fp=0xc00008dfc8 sp=0xc00008d968 pc=0x71341d
gvisor.dev/gvisor/pkg/shim.(*service).initPlatform.func1()
        pkg/shim/service_linux.go:107 +0x25 fp=0xc00008dfe0 sp=0xc00008dfc8 pc=0xba2d25
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc00008dfe8 sp=0xc00008dfe0 pc=0x46c0e1
created by gvisor.dev/gvisor/pkg/shim.(*service).initPlatform in goroutine 1
        pkg/shim/service_linux.go:107 +0x105

goroutine 13 [chan receive, 66 minutes]:
runtime.gopark(0xc00024ad20?, 0xc000215f18?, 0x7c?, 0x21?, 0xc0001701e0?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc000215e88 sp=0xc000215e68 pc=0x43bf4e
runtime.chanrecv(0xc00016c780, 0xc000215f78, 0x1)
        GOROOT/src/runtime/chan.go:583 +0x3cd fp=0xc000215f00 sp=0xc000215e88 pc=0x409a0d
runtime.chanrecv2(0xd54940?, 0xed8d40?)
        GOROOT/src/runtime/chan.go:447 +0x12 fp=0xc000215f28 sp=0xc000215f00 pc=0x409632
gvisor.dev/gvisor/pkg/shim.(*service).forward(0x0?, {0xed8d40, 0xc00008aa00}, {0xecece8, 0xc0001701e0})
        pkg/shim/service.go:1017 +0x52 fp=0xc000215fa8 sp=0xc000215f28 pc=0xba1392
gvisor.dev/gvisor/pkg/shim.New.func3()
        pkg/shim/service.go:128 +0x30 fp=0xc000215fe0 sp=0xc000215fa8 pc=0xb9a070
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000215fe8 sp=0xc000215fe0 pc=0x46c0e1
created by gvisor.dev/gvisor/pkg/shim.New in goroutine 1
        pkg/shim/service.go:128 +0x525

goroutine 18 [select]:
runtime.gopark(0xc0001adf68?, 0x4?, 0x5?, 0xec?, 0xc0001add30?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc0001adbb8 sp=0xc0001adb98 pc=0x43bf4e
runtime.selectgo(0xc0001adf68, 0xc0001add28, 0x1564510?, 0x0, 0x25?, 0x1)
        GOROOT/src/runtime/select.go:327 +0x725 fp=0xc0001adcd8 sp=0xc0001adbb8 pc=0x44bac5
github.com/containerd/ttrpc.(*serverConn).run(0xc00008aa50, {0xed8d40, 0xc00008aa00})
        external/com_github_containerd_ttrpc/server.go:431 +0x51b fp=0xc0001adfb8 sp=0xc0001adcd8 pc=0x6746db
github.com/containerd/ttrpc.(*Server).Serve.func2()
        external/com_github_containerd_ttrpc/server.go:127 +0x28 fp=0xc0001adfe0 sp=0xc0001adfb8 pc=0x6738a8
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0001adfe8 sp=0xc0001adfe0 pc=0x46c0e1
created by github.com/containerd/ttrpc.(*Server).Serve in goroutine 16
        external/com_github_containerd_ttrpc/server.go:127 +0x252

goroutine 16 [IO wait, 33307 minutes]:
runtime.gopark(0xc00017c000?, 0xc000040780?, 0xa8?, 0xc?, 0x4e259d?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc000090c38 sp=0xc000090c18 pc=0x43bf4e
runtime.netpollblock(0x11?, 0x407e46?, 0x0?)
        GOROOT/src/runtime/netpoll.go:564 +0xf7 fp=0xc000090c70 sp=0xc000090c38 pc=0x434997
internal/poll.runtime_pollWait(0x7f6ca5529d88, 0x72)
        GOROOT/src/runtime/netpoll.go:343 +0x85 fp=0xc000090c90 sp=0xc000090c70 pc=0x466705
internal/poll.(*pollDesc).wait(0xc0000e7280?, 0x661352?, 0x0)
        GOROOT/src/internal/poll/fd_poll_runtime.go:84 +0x27 fp=0xc000090cb8 sp=0xc000090c90 pc=0x4db207
internal/poll.(*pollDesc).waitRead(...)
        GOROOT/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0xc0000e7280)
        GOROOT/src/internal/poll/fd_unix.go:611 +0x2ac fp=0xc000090d60 sp=0xc000090cb8 pc=0x4e06ec
net.(*netFD).accept(0xc0000e7280)
        GOROOT/src/net/fd_unix.go:172 +0x29 fp=0xc000090e18 sp=0xc000090d60 pc=0x533b29
net.(*UnixListener).accept(0x4442e0?)
        GOROOT/src/net/unixsock_posix.go:172 +0x16 fp=0xc000090e40 sp=0xc000090e18 pc=0x54d8f6
net.(*UnixListener).Accept(0xc0001705a0)
        GOROOT/src/net/unixsock.go:260 +0x30 fp=0xc000090e70 sp=0xc000090e40 pc=0x54c030
github.com/containerd/ttrpc.(*Server).Serve(0xc000170540, {0xed8d40?, 0xc00008aa00}, {0xed5250, 0xc0001705a0})
        external/com_github_containerd_ttrpc/server.go:87 +0x122 fp=0xc000090f60 sp=0xc000090e70 pc=0x673502
github.com/containerd/containerd/runtime/v2/shim.serve.func1()
        external/com_github_containerd_containerd/runtime/v2/shim/shim.go:310 +0x6d fp=0xc000090fe0 sp=0xc000090f60 pc=0x72400d
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000090fe8 sp=0xc000090fe0 pc=0x46c0e1
created by github.com/containerd/containerd/runtime/v2/shim.serve in goroutine 1
        external/com_github_containerd_containerd/runtime/v2/shim/shim.go:308 +0xcd

goroutine 17 [chan receive, 1 minutes]:
runtime.gopark(0x1?, 0x1?, 0x20?, 0x6e?, 0xc0000485b0?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc0001abf00 sp=0xc0001abee0 pc=0x43bf4e
runtime.chanrecv(0xc00016ca20, 0xc000029fc0, 0x1)
        GOROOT/src/runtime/chan.go:583 +0x3cd fp=0xc0001abf78 sp=0xc0001abf00 pc=0x409a0d
runtime.chanrecv2(0x0?, 0x0?)
        GOROOT/src/runtime/chan.go:447 +0x12 fp=0xc0001abfa0 sp=0xc0001abf78 pc=0x409632
github.com/containerd/containerd/runtime/v2/shim.(*Client).Serve.func1()
        external/com_github_containerd_containerd/runtime/v2/shim/shim.go:294 +0x47 fp=0xc0001abfe0 sp=0xc0001abfa0 pc=0x723e47
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0001abfe8 sp=0xc0001abfe0 pc=0x46c0e1
created by github.com/containerd/containerd/runtime/v2/shim.(*Client).Serve in goroutine 1
        external/com_github_containerd_containerd/runtime/v2/shim/shim.go:293 +0x338

goroutine 19 [IO wait]:
runtime.gopark(0xc000092d20?, 0xb?, 0x0?, 0x0?, 0xb?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc000092b48 sp=0xc000092b28 pc=0x43bf4e
runtime.netpollblock(0x481698?, 0x407e46?, 0x0?)
        GOROOT/src/runtime/netpoll.go:564 +0xf7 fp=0xc000092b80 sp=0xc000092b48 pc=0x434997
internal/poll.runtime_pollWait(0x7f6ca5529c90, 0x72)
        GOROOT/src/runtime/netpoll.go:343 +0x85 fp=0xc000092ba0 sp=0xc000092b80 pc=0x466705
internal/poll.(*pollDesc).wait(0xc0000e7380?, 0xc00020c000?, 0x0)
        GOROOT/src/internal/poll/fd_poll_runtime.go:84 +0x27 fp=0xc000092bc8 sp=0xc000092ba0 pc=0x4db207
internal/poll.(*pollDesc).waitRead(...)
        GOROOT/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc0000e7380, {0xc00020c000, 0x1000, 0x1000})
        GOROOT/src/internal/poll/fd_unix.go:164 +0x27a fp=0xc000092c60 sp=0xc000092bc8 pc=0x4dc4fa
net.(*netFD).Read(0xc0000e7380, {0xc00020c000?, 0x0?, 0xc000092d00?})
        GOROOT/src/net/fd_posix.go:55 +0x25 fp=0xc000092ca8 sp=0xc000092c60 pc=0x531b05
net.(*conn).Read(0xc000060138, {0xc00020c000?, 0xc000092d48?, 0x408ff3?})
        GOROOT/src/net/net.go:179 +0x45 fp=0xc000092cf0 sp=0xc000092ca8 pc=0x53ea45
net.(*UnixConn).Read(0xc000092d48?, {0xc00020c000?, 0x409060?, 0x1?})
        <autogenerated>:1 +0x25 fp=0xc000092d20 sp=0xc000092cf0 pc=0x550da5
bufio.(*Reader).Read(0xc00016cba0, {0xc00009f320, 0xa, 0xc000092ed4?})
        GOROOT/src/bufio/bufio.go:244 +0x197 fp=0xc000092d58 sp=0xc000092d20 pc=0x50af97
io.ReadAtLeast({0xecc720, 0xc00016cba0}, {0xc00009f320, 0xa, 0xa}, 0xa)
        GOROOT/src/io/io.go:335 +0x90 fp=0xc000092da0 sp=0xc000092d58 pc=0x4d59b0
io.ReadFull(...)
        GOROOT/src/io/io.go:354
github.com/containerd/ttrpc.readMessageHeader({0xc00009f320, 0xa, 0x18?}, {0xecc720?, 0xc00016cba0?})
        external/com_github_containerd_ttrpc/channel.go:53 +0x4f fp=0xc000092de0 sp=0xc000092da0 pc=0x66fd4f
github.com/containerd/ttrpc.(*channel).recv(0xc00009f300)
        external/com_github_containerd_ttrpc/channel.go:101 +0x45 fp=0xc000092e78 sp=0xc000092de0 pc=0x670245
github.com/containerd/ttrpc.(*serverConn).run.func1(0xc00016cc00)
        external/com_github_containerd_ttrpc/server.go:362 +0x13b fp=0xc000092fc8 sp=0xc000092e78 pc=0x674f3b
github.com/containerd/ttrpc.(*serverConn).run.func6()
        external/com_github_containerd_ttrpc/server.go:413 +0x27 fp=0xc000092fe0 sp=0xc000092fc8 pc=0x674dc7
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc000092fe8 sp=0xc000092fe0 pc=0x46c0e1
created by github.com/containerd/ttrpc.(*serverConn).run in goroutine 18
        external/com_github_containerd_ttrpc/server.go:332 +0x336

goroutine 21 [GC worker (idle)]:
runtime.gopark(0x8f4d6a15645d9?, 0x2?, 0x10?, 0x2?, 0x0?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc00005a750 sp=0xc00005a730 pc=0x43bf4e
runtime.gcBgMarkWorker()
        GOROOT/src/runtime/mgc.go:1293 +0xe5 fp=0xc00005a7e0 sp=0xc00005a750 pc=0x41ea85
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc00005a7e8 sp=0xc00005a7e0 pc=0x46c0e1
created by runtime.gcBgMarkStartWorkers in goroutine 20
        GOROOT/src/runtime/mgc.go:1217 +0x1c

goroutine 22 [GC worker (idle)]:
runtime.gopark(0x8f4d6a1564790?, 0x2?, 0x55?, 0x1?, 0x0?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc00005af50 sp=0xc00005af30 pc=0x43bf4e
runtime.gcBgMarkWorker()
        GOROOT/src/runtime/mgc.go:1293 +0xe5 fp=0xc00005afe0 sp=0xc00005af50 pc=0x41ea85
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc00005afe8 sp=0xc00005afe0 pc=0x46c0e1
created by runtime.gcBgMarkStartWorkers in goroutine 20
        GOROOT/src/runtime/mgc.go:1217 +0x1c

goroutine 1563765 [select, 66 minutes]:
runtime.gopark(0xc0001acfb0?, 0x2?, 0x50?, 0xce?, 0xc0001acf6c?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc0001ace10 sp=0xc0001acdf0 pc=0x43bf4e
runtime.selectgo(0xc0001acfb0, 0xc0001acf68, 0xc97d00?, 0x0, 0xd54940?, 0x1)
        GOROOT/src/runtime/select.go:327 +0x725 fp=0xc0001acf30 sp=0xc0001ace10 pc=0x44bac5
github.com/containerd/ttrpc.(*Client).run.func1()
        external/com_github_containerd_ttrpc/client.go:265 +0xb6 fp=0xc0001acfe0 sp=0xc0001acf30 pc=0x6721b6
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0001acfe8 sp=0xc0001acfe0 pc=0x46c0e1
created by github.com/containerd/ttrpc.(*Client).run in goroutine 1563764
        external/com_github_containerd_ttrpc/client.go:262 +0xf7

goroutine 44 [syscall, 33307 minutes]:
syscall.Syscall(0x0?, 0x0?, 0xc0000dedd0?, 0x4377db?)
        GOROOT/src/syscall/syscall_linux.go:69 +0x25 fp=0xc00008ec18 sp=0xc00008eba8 pc=0x483725
syscall.read(0xc00005bc80?, {0xc00008ed68?, 0x0?, 0x15bcf00?})
        GOROOT/src/syscall/zsyscall_linux_amd64.go:721 +0x38 fp=0xc00008ec58 sp=0xc00008ec18 pc=0x481698
syscall.Read(...)
        GOROOT/src/syscall/syscall_unix.go:181
github.com/containerd/cgroups/v2.(*Manager).waitForEvents(0xc00007e0e0, 0x70f9a7?, 0x0?)
        external/com_github_containerd_cgroups/v2/manager.go:587 +0x176 fp=0xc00008efb8 sp=0xc00008ec58 pc=0x830f96
github.com/containerd/cgroups/v2.(*Manager).EventChan.func1()
        external/com_github_containerd_cgroups/v2/manager.go:569 +0x28 fp=0xc00008efe0 sp=0xc00008efb8 pc=0x830de8
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc00008efe8 sp=0xc00008efe0 pc=0x46c0e1
created by github.com/containerd/cgroups/v2.(*Manager).EventChan in goroutine 20
        external/com_github_containerd_cgroups/v2/manager.go:569 +0x9d

goroutine 45 [select, 33307 minutes]:
runtime.gopark(0xc000059768?, 0x2?, 0x0?, 0x0?, 0xc0000596f4?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc000059590 sp=0xc000059570 pc=0x43bf4e
runtime.selectgo(0xc000059768, 0xc0000596f0, 0x0?, 0x0, 0x0?, 0x1)
        GOROOT/src/runtime/select.go:327 +0x725 fp=0xc0000596b0 sp=0xc000059590 pc=0x44bac5
gvisor.dev/gvisor/pkg/shim.(*watcherV2).add.func1()
        pkg/shim/oom_v2.go:101 +0x14e fp=0xc0000597e0 sp=0xc0000596b0 pc=0xb998ce
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0000597e8 sp=0xc0000597e0 pc=0x46c0e1
created by gvisor.dev/gvisor/pkg/shim.(*watcherV2).add in goroutine 20
        pkg/shim/oom_v2.go:98 +0xcd

goroutine 1563766 [IO wait, 66 minutes]:
runtime.gopark(0xdebd30?, 0xb?, 0x0?, 0x0?, 0x1c?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc0001a9ba0 sp=0xc0001a9b80 pc=0x43bf4e
runtime.netpollblock(0x481698?, 0x407e46?, 0x0?)
        GOROOT/src/runtime/netpoll.go:564 +0xf7 fp=0xc0001a9bd8 sp=0xc0001a9ba0 pc=0x434997
internal/poll.runtime_pollWait(0x7f6ca55292e0, 0x72)
        GOROOT/src/runtime/netpoll.go:343 +0x85 fp=0xc0001a9bf8 sp=0xc0001a9bd8 pc=0x466705
internal/poll.(*pollDesc).wait(0xc0001d6480?, 0xc0002a7000?, 0x0)
        GOROOT/src/internal/poll/fd_poll_runtime.go:84 +0x27 fp=0xc0001a9c20 sp=0xc0001a9bf8 pc=0x4db207
internal/poll.(*pollDesc).waitRead(...)
        GOROOT/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc0001d6480, {0xc0002a7000, 0x1000, 0x1000})
        GOROOT/src/internal/poll/fd_unix.go:164 +0x27a fp=0xc0001a9cb8 sp=0xc0001a9c20 pc=0x4dc4fa
net.(*netFD).Read(0xc0001d6480, {0xc0002a7000?, 0xc0001a9d78?, 0x410285?})
        GOROOT/src/net/fd_posix.go:55 +0x25 fp=0xc0001a9d00 sp=0xc0001a9cb8 pc=0x531b05
net.(*conn).Read(0xc000060200, {0xc0002a7000?, 0x410285?, 0xc00024c468?})
        GOROOT/src/net/net.go:179 +0x45 fp=0xc0001a9d48 sp=0xc0001a9d00 pc=0x53ea45
net.(*UnixConn).Read(0xc05ce0?, {0xc0002a7000?, 0x7f6ca552ead0?, 0x7f6cec0ba5b8?})
        <autogenerated>:1 +0x25 fp=0xc0001a9d78 sp=0xc0001a9d48 pc=0x550da5
bufio.(*Reader).Read(0xc000189c20, {0xc00009fc20, 0xa, 0x408ebe?})
        GOROOT/src/bufio/bufio.go:244 +0x197 fp=0xc0001a9db0 sp=0xc0001a9d78 pc=0x50af97
io.ReadAtLeast({0xecc720, 0xc000189c20}, {0xc00009fc20, 0xa, 0xa}, 0xa)
        GOROOT/src/io/io.go:335 +0x90 fp=0xc0001a9df8 sp=0xc0001a9db0 pc=0x4d59b0
io.ReadFull(...)
        GOROOT/src/io/io.go:354
github.com/containerd/ttrpc.readMessageHeader({0xc00009fc20, 0xa, 0xc0001a9e80?}, {0xecc720?, 0xc000189c20?})
        external/com_github_containerd_ttrpc/channel.go:53 +0x4f fp=0xc0001a9e38 sp=0xc0001a9df8 pc=0x66fd4f
github.com/containerd/ttrpc.(*channel).recv(0xc00009fc00)
        external/com_github_containerd_ttrpc/channel.go:101 +0x45 fp=0xc0001a9ed0 sp=0xc0001a9e38 pc=0x670245
github.com/containerd/ttrpc.(*Client).run.func2()
        external/com_github_containerd_ttrpc/client.go:294 +0xb5 fp=0xc0001a9fe0 sp=0xc0001a9ed0 pc=0x671db5
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0001a9fe8 sp=0xc0001a9fe0 pc=0x46c0e1
created by github.com/containerd/ttrpc.(*Client).run in goroutine 1563764
        external/com_github_containerd_ttrpc/client.go:286 +0x15f

goroutine 1563764 [select, 76 minutes]:
runtime.gopark(0xc0001faf90?, 0x2?, 0xc0?, 0x63?, 0xc0001faf64?)
        GOROOT/src/runtime/proc.go:398 +0xce fp=0xc0001fae08 sp=0xc0001fade8 pc=0x43bf4e
runtime.selectgo(0xc0001faf90, 0xc0001faf60, 0xc0001fafd0?, 0x0, 0xc00004600e?, 0x1)
        GOROOT/src/runtime/select.go:327 +0x725 fp=0xc0001faf28 sp=0xc0001fae08 pc=0x44bac5
github.com/containerd/ttrpc.(*Client).run(0xc0001d6680)
        external/com_github_containerd_ttrpc/client.go:330 +0x1ee fp=0xc0001fafc8 sp=0xc0001faf28 pc=0x671c2e
github.com/containerd/ttrpc.NewClient.func2()
        external/com_github_containerd_ttrpc/client.go:94 +0x25 fp=0xc0001fafe0 sp=0xc0001fafc8 pc=0x670945
runtime.goexit()
        src/runtime/asm_amd64.s:1650 +0x1 fp=0xc0001fafe8 sp=0xc0001fafe0 pc=0x46c0e1
created by github.com/containerd/ttrpc.NewClient in goroutine 13
        external/com_github_containerd_ttrpc/client.go:94 +0x1d6

rax    0xca
rbx    0x0
rcx    0x46df03
rdx    0x0
rdi    0x158b9c8
rsi    0x80
rbp    0x7fff6bfe67b0
rsp    0x7fff6bfe6768
r8     0x0
r9     0x0
r10    0x0
r11    0x286
r12    0x442820
r13    0xc00005efa4
r14    0x158b0a0
r15    0x0
rip    0x46df01
rflags 0x286
cs     0x33
fs     0x0
gs     0x0

Sadly, I'm not able to see anything fishy in here. I'm calling this shim leaked because it no longer has any child processes. That assumption might itself be wrong.

from gvisor.

markusthoemmes avatar markusthoemmes commented on August 28, 2024

I've made some progress on this: I was able to reproduce this sorta-kinda reliably (not yet reliably on my dev clusters) by forcing the respective pod to exceed its ephemeral-storage limit. The exec session to the pod drops and I can't exec into the pod again, but the pod remains intact because the kubelet can't evict it.

We haven't adjusted the overlay2 setting, so we should be running with overlay2=root:self. Does this ring any bells @ayushr2? I'll share a proper reproducer once I can reliably do it.

Edit: I can cautiously confirm that using overlay2=none fixes the issue. I'll keep poking though.

from gvisor.

ayushr2 avatar ayushr2 commented on August 28, 2024

Interesting, thanks for the investigation!

We have internal tests which test exceeding the ephemeral storage limits in GKE with overlay2=root:self and the pod gets evicted by the kubelet. Happy to look at the reproducer.

from gvisor.

ayushr2 avatar ayushr2 commented on August 28, 2024

I have been trying to use the following GKE Reproducer:

$ gcloud container clusters create repro --num-nodes=1 --location=us-central1-a --cluster-version=1.28.3-gke.1203001

$ gcloud container node-pools create gvisor --cluster=repro --num-nodes=1 --location=us-central1-a --sandbox=type=gvisor --image-type=cos_containerd --machine-type=e2-standard-2
...
NAME    MACHINE_TYPE   DISK_SIZE_GB  NODE_VERSION
gvisor  e2-standard-2  100           1.28.3-gke.1203001

$ kubectl create -f https://raw.githubusercontent.com/GoogleCloudPlatform/k8s-node-tools/master/gvisor/enable-gvisor-flags.yaml
daemonset.apps/enable-gvisor-flags created

$ cat repro.yaml 
apiVersion: v1
kind: Pod
metadata:
  name: repro
  annotations:
    dev.gvisor.flag.platform: "systrap"
    dev.gvisor.flag.directfs: "false"
spec:
  runtimeClassName: gvisor
  restartPolicy: Always
  containers:
  - name: repro
    image: shlinkio/shlink@sha256:c70cf1b37087581cfcb7963d74d6c13fbee8555a7b10aa4af0493e70ade41202
    env:
    - name: INITIAL_API_KEY
      value: foobar
    - name: DEFAULT_DOMAIN
      value: foo.bar
    resources:
      limits:
        cpu: "1"
        ephemeral-storage: 4G
        memory: 2Gi
      requests:
        cpu: 200m
        ephemeral-storage: 400M
        memory: "858993459"

$ kubectl apply -f repro.yaml
pod/repro created

$ sleep 60 && kubectl exec repro -- dd if=/dev/zero of=./big_file bs=4k iflag=fullblock,count_bytes count=10G
command terminated with exit code 137

$ kubectl get pods
NAME    READY   STATUS                   RESTARTS   AGE
repro   0/1     ContainerStatusUnknown   1          2m

I have only been able to repro the ContainerStatusUnknown and Error cases. Across 12 runs, I got ContainerStatusUnknown 8 times and Error 4 times. But I have not been able to get the Running case (which reproduces your issue). Let me know if I am doing something wrong.

$ kubectl version
Client Version: v1.28.5
Kustomize Version: v5.0.4-0.20230601165947-6ce0bf390ce3
Server Version: v1.28.3-gke.1203001

$ containerd --version
containerd github.com/containerd/containerd 1.7.7 8c087663b0233f6e6e2f4515cee61d49f14746a8

$ /home/containerd/usr/local/sbin/runsc --version
runsc version google-573904262    # This maps to 20231009.0_RC01
spec: 1.1.0-rc.2

Let me try the deployment...

from gvisor.

ayushr2 avatar ayushr2 commented on August 28, 2024

No luck with the deployment OR using the latest runsc from master.

FWIW, both /run/containerd/io.containerd.runtime.v2.task/k8s.io/5b6ae772c71b57b0a00297b775a1900bcd30fcd9a6c8ccca2e49573829ae636f/rootfs and /run/containerd/io.containerd.runtime.v2.task/k8s.io/90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8/rootfs are empty on ls -l

@markusthoemmes Could you check the stuck container's rootfs with ls -al? With --overlay2=root:self, we create a hidden file named .gvisor.filestore.{sandbox-ID}, which ls -l by itself will not show.

This file is deleted when the container is destroyed. From your logs above, you can see that the sandbox is still holding an FD to this file:

lrwx------ 1 nobody nogroup 64 Jan  3 18:23 47 -> '/run/containerd/io.containerd.runtime.v2.task/k8s.io/5b6ae772c71b57b0a00297b775a1900bcd30fcd9a6c8ccca2e49573829ae636f/rootfs/.gvisor.filestore.90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8 (deleted)'

It shows that the file is deleted (which means ls -al may also show nothing, but we need to confirm, so we know whether the Container.Destroy() code was executed). However, the file will only be released in the host kernel once all open FDs on it are closed. Since the sandbox is holding an FD on it, the rootfs cannot be unmounted (hence the EBUSY error). So maybe we are leaking this filestore FD somewhere; let me look more closely.
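To pin down which process is keeping a rootfs busy, one can scan /proc for FDs that still point at deleted files under that path. This is a minimal sketch of such a scan; the helper name and output format are my own, not part of runsc or containerd:

```shell
# Sketch: list processes that still hold open FDs on deleted files under a
# given path prefix (e.g. a stuck container's rootfs). Deleted targets of
# /proc/<pid>/fd symlinks end in " (deleted)".
find_deleted_fd_holders() {
  prefix="$1"
  for pid in /proc/[0-9]*; do
    for fd in "$pid"/fd/*; do
      target=$(readlink "$fd" 2>/dev/null) || continue
      case "$target" in
        "$prefix"*" (deleted)") echo "${pid#/proc/} $target" ;;
      esac
    done
  done
}
```

Run it against the stuck container's rootfs, e.g. `find_deleted_fd_holders /run/containerd/io.containerd.runtime.v2.task/k8s.io/$CID/rootfs` (the path is the one from the unmount error); any PID it prints is a candidate for the process pinning the mount.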

Meanwhile, can you try reproducing with --overlay2=none?

from gvisor.

ayushr2 avatar ayushr2 commented on August 28, 2024

Actually, the other FDs shown in this comment are also from the container rootfs. They don't have the /run/containerd/io.containerd.runtime.v2.task/k8s.io/5b6ae772c71b57b0a00297b775a1900bcd30fcd9a6c8ccca2e49573829ae636f/rootfs/ path prefix because they were opened after the rootfs was pivot_root(2)-ed.

The filestore FD is closed when the sentry overlayfs mount (and hence the tmpfs upper layer) are unmounted inside the sentry. Given this FD is still open, it means that the rootfs mount is not being unmounted inside the sentry (hence leaking all these host FDs).

So I don't think --overlay2=none will fix this issue, but could you still please give it a shot.

from gvisor.

ayushr2 avatar ayushr2 commented on August 28, 2024

Edit: I can cautiously confirm that using overlay2=none fixes the issue. I'll keep poking though.

Oh I missed this edit. So overlay2=none does fix the issue. Interesting...

The presence of the .gvisor.filestore.90022bca840ef2fac36200ae8f3511800e883f07558b146290c3b6a9822090b8 FD along with other rootfs file FDs indicates that the overlayfs mount was not unmounted.

The fact that the filestore file is marked (deleted) shows that Container.Destroy() was called and it deleted the filestore file.

But apparently this issue does not happen with overlay2=none. So this is a sentry overlay mount specific problem? Maybe some reference counting issue is preventing a sentry overlay mount from being released?

@markusthoemmes While reproducing this comment, were you running any other containers in the pod (apart from the pause container)? Or was the stuck container the only one? I am trying to understand if the other FDs to files like /usr/local/lib/libpython3.8.so.1.0 are coming from another container, or are they from the rootfs of the impacted container.

from gvisor.

markusthoemmes avatar markusthoemmes commented on August 28, 2024

I can confirm that Destroy must've been called on the container (not the sandbox) because the state file is gone as well (see all the errors about loading the container once more). I can also confirm from another hang that the filestore file is gone from the rootfs (albeit with an FD still held by the sandbox).

Oh I missed this edit. So overlay2=none does fix the issue. Interesting...

This is very anecdotal. The reproducer is so unreliable that I just can't tell for sure, sadly. I've been trying to get it reproduced with full debug logging enabled to get us closer...

@markusthoemmes While reproducing #9834 (comment), were you running any other containers in the pod (apart from the pause container)? Or was the stuck container the only one? I am trying to understand if the other FDs to files like /usr/local/lib/libpython3.8.so.1.0 are coming from another container, or are they from the rootfs of the impacted container.

It was the only container in the pod.

from gvisor.

markusthoemmes avatar markusthoemmes commented on August 28, 2024

For some reason, my reproducer isn't reproducing for me anymore either. 😢

from gvisor.

zpavlinovic avatar zpavlinovic commented on August 28, 2024

FWIW, we keep seeing this with our server on Google Cloud Run. Each request executes runsc and this seems to accumulate exe processes until the limit is reached. Each subsequent request then fails.

from gvisor.

zpavlinovic avatar zpavlinovic commented on August 28, 2024

I've tried -overlay2=none and it didn't help unfortunately.

from gvisor.

markusthoemmes avatar markusthoemmes commented on August 28, 2024

@zpavlinovic are you seeing the exact same issues with failures to unmount the respective container's rootfs?

from gvisor.

zpavlinovic avatar zpavlinovic commented on August 28, 2024

I am seeing the same issues I've been seeing before. They manifest as extra processes that just keep piling up. I am not sure if the underlying issue is ...failures to unmount the respective container's rootfs.

from gvisor.

markusthoemmes avatar markusthoemmes commented on August 28, 2024

So sadly, we're still seeing containers getting stuck even with overlay2=none. The symptoms have changed though:

sandbox log

Feb 08 23:42:29 cool-machine containerd[600]: time="2024-02-08T23:42:29.831668612Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/a48cccb9b39cbb5ff4714c306220b9b1a1cae26b0614428e1586ec215e06c7ea pid=3497130
Feb 08 23:42:30 cool-machine containerd[600]: time="2024-02-08T23:42:30.164013258Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:<redacted>,Uid:d3b3f340-c9ea-4ae7-95f7-049a0e978a88,Namespace:<redacted>,Attempt:0,} returns sandbox id \"a48cccb9b39cbb5ff4714c306220b9b1a1cae26b0614428e1586ec215e06c7ea\""
Feb 08 23:42:30 cool-machine containerd[600]: time="2024-02-08T23:42:30.166843535Z" level=info msg="CreateContainer within sandbox \"a48cccb9b39cbb5ff4714c306220b9b1a1cae26b0614428e1586ec215e06c7ea\" for container &ContainerMetadata{Name:tcp-sack-disable,Attempt:0,}"
Feb 08 23:42:30 cool-machine containerd[600]: time="2024-02-08T23:42:30.196620372Z" level=info msg="CreateContainer within sandbox \"a48cccb9b39cbb5ff4714c306220b9b1a1cae26b0614428e1586ec215e06c7ea\" for &ContainerMetadata{Name:tcp-sack-disable,Attempt:0,} returns container id \"57f445643a570a67aafb0e7577e57cd00718f3d98519f38980f2b8953e80095d\""
Feb 08 23:42:36 cool-machine containerd[600]: time="2024-02-08T23:42:36.287882512Z" level=info msg="CreateContainer within sandbox \"a48cccb9b39cbb5ff4714c306220b9b1a1cae26b0614428e1586ec215e06c7ea\" for container &ContainerMetadata{Name:<redacted>,Attempt:0,}"
Feb 08 23:42:36 cool-machine containerd[600]: time="2024-02-08T23:42:36.340592681Z" level=info msg="CreateContainer within sandbox \"a48cccb9b39cbb5ff4714c306220b9b1a1cae26b0614428e1586ec215e06c7ea\" for &ContainerMetadata{Name:<redacted>,Attempt:0,} returns container id \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\""
Feb 09 07:38:08 cool-machine containerd[600]: time="2024-02-09T07:38:08.781992867Z" level=info msg="StopPodSandbox for \"a48cccb9b39cbb5ff4714c306220b9b1a1cae26b0614428e1586ec215e06c7ea\""
Feb 09 07:53:08 cool-machine containerd[600]: time="2024-02-09T07:53:08.781887264Z" level=error msg="StopPodSandbox for \"a48cccb9b39cbb5ff4714c306220b9b1a1cae26b0614428e1586ec215e06c7ea\" failed" error="rpc error: code = DeadlineExceeded desc = failed to stop container \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\": an error occurs during waiting for container \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\" to be killed: wait container \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\": context deadline exceeded"

container log

Feb 08 23:42:36 cool-machine containerd[600]: time="2024-02-08T23:42:36.340592681Z" level=info msg="CreateContainer within sandbox \"a48cccb9b39cbb5ff4714c306220b9b1a1cae26b0614428e1586ec215e06c7ea\" for &ContainerMetadata{Name:<redacted>,Attempt:0,} returns container id \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\""
Feb 08 23:42:36 cool-machine containerd[600]: time="2024-02-08T23:42:36.341050193Z" level=info msg="StartContainer for \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\""
Feb 08 23:42:36 cool-machine containerd[600]: time="2024-02-08T23:42:36.407918674Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a pid=3497394
Feb 08 23:42:36 cool-machine containerd[600]: time="2024-02-08T23:42:36.609419248Z" level=info msg="StartContainer for \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\" returns successfully"
Feb 09 07:22:38 cool-machine containerd[600]: time="2024-02-09T07:22:38.780844393Z" level=info msg="StopContainer for \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\" with timeout 30 (s)"
Feb 09 07:22:38 cool-machine containerd[600]: time="2024-02-09T07:22:38.847858965Z" level=info msg="Stop container \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\" with signal terminated"
Feb 09 07:23:09 cool-machine containerd[600]: time="2024-02-09T07:23:09.054139008Z" level=info msg="Kill container \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\""
Feb 09 07:38:08 cool-machine containerd[600]: time="2024-02-09T07:38:08.782507413Z" level=error msg="StopContainer for \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\" failed" error="rpc error: code = Canceled desc = an error occurs during waiting for container \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\" to be killed: wait container \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\": context canceled"
Feb 09 07:38:08 cool-machine containerd[600]: time="2024-02-09T07:38:08.955534683Z" level=info msg="Kill container \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\""
Feb 09 07:53:08 cool-machine containerd[600]: time="2024-02-09T07:53:08.781887264Z" level=error msg="StopPodSandbox for \"a48cccb9b39cbb5ff4714c306220b9b1a1cae26b0614428e1586ec215e06c7ea\" failed" error="rpc error: code = DeadlineExceeded desc = failed to stop container \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\": an error occurs during waiting for container \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\" to be killed: wait container \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\": context deadline exceeded"
Feb 09 07:53:09 cool-machine containerd[600]: time="2024-02-09T07:53:09.416769254Z" level=info msg="StopContainer for \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\" with timeout 30 (s)"
Feb 09 07:53:09 cool-machine containerd[600]: time="2024-02-09T07:53:09.544822952Z" level=info msg="Skipping the sending of signal terminated to container \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\" because a prior stop with timeout>0 request already sent the signal"
Feb 09 07:53:39 cool-machine containerd[600]: time="2024-02-09T07:53:39.545242581Z" level=info msg="Kill container \"5fd3a3f0085642be111e32e87fc231e41a75b257aa61714e16b543af1bc41a8a\""

It is worth noting that the metrics of those hanging containers spike up to, and stick at, 100% CPU. This happens before the container is even asked to shut down, suggesting that something goes wrong earlier and that trying to terminate the pod merely surfaces the symptom more clearly.

from gvisor.

markusthoemmes avatar markusthoemmes commented on August 28, 2024

We've had another instance of the process consistently reporting 100% CPU load after a certain point, and we know that the process was doing literally nothing (apart from listening on a socket, hosting an HTTP server... the usual stuff).

To me, this increasingly looks like something is getting tripped in gVisor at runtime, rather than a problem exclusively at deletion time, which makes it even more concerning and harder to detect reliably.

@ayushr2 does this ring any bells to you in terms of what the underlying issue could be?

from gvisor.

ayushr2 avatar ayushr2 commented on August 28, 2024

Nope, nothing stands out immediately. Hard to investigate without a reproducer. I have a few suggestions:

  • Could you try the flag --ref-leak-mode=panic? If this has something to do with leaking references, it will panic the sentry and the boot logs will show what leaked. Recently @avagin fixed a bunch of ref leak bugs and enabled it for tests: 5b33e4a. So you might need to use the latest build.
  • Could you try profiling? Use --profile. Once you see that the sandbox is reporting 100% CPU usage, you can grab a CPU profile and upload it here. So we get an idea about what the sentry is stuck on. See "Profiling" section on our debugging guide. For the --root flag, you might need to look at the log files to see what the RootDir is. It is printed somewhere on the top.
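As a sketch of the second suggestion, grabbing a CPU profile from a running sandbox might look like the following. This is an assumption based on the `runsc debug` flags described in the debugging guide; the `--root` path is illustrative, and the real RootDir must be taken from the log files as noted above:

```shell
# Sketch, not a verified invocation: grab a 30s CPU profile from a running
# sandbox. The --root path is an assumption for illustration; the actual
# RootDir is printed near the top of the runsc log files.
grab_cpu_profile() {
  cid="$1"   # sandbox/container ID
  out="$2"   # output file for the pprof profile
  runsc --root /run/containerd/runsc/k8s.io debug \
    --profile-cpu="$out" --duration=30s "$cid"
}
```

For example, `grab_cpu_profile <sandbox-id> /tmp/cpu.prof` while the sandbox is pegged at 100% CPU, then inspect the result with `go tool pprof`.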

from gvisor.

markusthoemmes avatar markusthoemmes commented on August 28, 2024

I fear I can't enable profiling as we only see this somewhat reliably at prod scale. I'll see what I can do on that front.

FWIW, it seems like the CPU usage comes from the exe processes under the sandbox. Does that mean the program itself is using that CPU, or is it the systrap code intercepting the calls? The respective processes I've seen using the CPU in htop sat at the 3rd level below the sandbox, if that tells you anything.

from gvisor.

markusthoemmes avatar markusthoemmes commented on August 28, 2024

FWIW, I don't know if overlay2=none had an impact here or not, but since doing that, all hanging containers have also shown the symptoms of #10000.

from gvisor.

ayushr2 avatar ayushr2 commented on August 28, 2024

Now that #10000 is resolved, could you try removing --overlay2=none (so it goes back to default) and let us know if you are still having this issue with hanging containers? Maybe it was the same underlying systrap bug, but with --overlay2=root:self it showed different top-level symptoms.

from gvisor.

markusthoemmes avatar markusthoemmes commented on August 28, 2024

@ayushr2 that's precisely what we were thinking as well and we're going to do that. I'll let you know and/or close this issue if we don't see this come back.

from gvisor.

markusthoemmes avatar markusthoemmes commented on August 28, 2024

@ayushr2 I finally got around to doing this. Sadly, we still see pods hanging on deletion. The error symptoms seem to have remained identical.

from gvisor.

ayushr2 avatar ayushr2 commented on August 28, 2024

Hi @markusthoemmes,

Sorry for the delay. I read through this issue again.

Sadly, we still see pods hanging on deletion. The error symptoms seem to have remained identical.

Is the error still failed rootfs umount: failed to unmount target /run/containerd/io.containerd.runtime.v2.task/k8s.io/<container-id>/rootfs: device or resource busy? You had mentioned that the symptoms have changed since rolling out --overlay2=none. But those symptoms were resolved in #10000. So are we back at the rootfs umount issue? And to clarify again, you are running with --directfs=false --overlay2=none right? And neither of them help?

If that's the case, it appears that the sandbox is not releasing the rootfs fsimpl/gofer mount inside the sentry when the container is supposedly killed.

We also know that Container.Destroy() was called, which is probably coming from a runsc delete invocation from here:

if err := c.Destroy(); err != nil {
        return fmt.Errorf("destroying container: %v", err)
}

Note that Container.Destroy() only cleans up the container state from outside sandbox (like cleaning up statefile, filestore files, etc). But the sentry-side cleanup of the container is not triggered from here. The sentry-side cleanup happens when all application tasks of the container exit and the namespaces are released.

Could you pass the --debug-log=/tmp/logs/ flag to runsc (any directory is fine, but the trailing slash is important)? You don't need to pass --debug or --strace unless you are doing this in a dev/test cluster, since those will degrade performance. That directory will be populated with a log file for each runsc command being run, so we will be able to see whether runsc delete is succeeding and whether a subsequent runsc wait is hanging, failing, or timing out, along with the relevant error messages. If you are able to get a repro and collect such logs, please upload them here.
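For clusters where runsc is wired up through the containerd shim, the flag can also be set persistently via the shim's runsc.toml. This is a sketch assuming the ConfigPath mechanism from gVisor's containerd configuration docs; the paths are illustrative:

```toml
# /etc/containerd/runsc.toml (referenced via ConfigPath in the containerd
# runtime options for runsc). Paths are illustrative.
log_path = "/var/log/runsc/%ID%/shim.log"

[runsc_config]
  # Equivalent to passing --debug-log to runsc; %ID% and %COMMAND% are
  # expanded per container and per runsc subcommand.
  debug-log = "/var/log/runsc/%ID%/gvisor.%COMMAND%.log"
```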

Also, since this (the stuck container) is the only container in the pod (apart from the pause container), when you hit this state, can you just delete the sandbox itself? Or is that hung too?

from gvisor.

markusthoemmes avatar markusthoemmes commented on August 28, 2024

Yes we're running with --directfs=false --overlay2=none currently, and that is stable from what we can tell.

Switching --overlay2 back to the default causes significant hangs again; they are indeed in the failed to unmount territory.

from gvisor.

frezbo avatar frezbo commented on August 28, 2024

I'm facing the same issue, attaching the pod status and the debug logs:

❯ kubectl describe pod nginx-gvisor
Name:                      nginx-gvisor
Namespace:                 default
Priority:                  0
Runtime Class Name:        gvisor
Service Account:           default
Node:                      talos-default-worker-1/10.5.0.3
Start Time:                Mon, 24 Jun 2024 20:35:01 +0530
Labels:                    <none>
Annotations:               <none>
Status:                    Terminating (lasts 114s)
Termination Grace Period:  30s
IP:                        10.244.1.4
IPs:
  IP:  10.244.1.4
Containers:
  nginx-gvisor:
    Container ID:   containerd://d540bfeb56dde5fcc725327c30cde596da988da6d485eaee4e25dbd857215def
    Image:          nginx
    Image ID:       docker.io/library/nginx@sha256:9c367186df9a6b18c6735357b8eb7f407347e84aea09beb184961cb83543d46e
    Port:           <none>
    Host Port:      <none>
    State:          Terminated
      Reason:       Completed
      Exit Code:    0
      Started:      Mon, 24 Jun 2024 20:35:25 +0530
      Finished:     Mon, 24 Jun 2024 20:35:27 +0530
    Ready:          False
    Restart Count:  0
    Environment:    <none>
    Mounts:
      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-6pshn (ro)
Conditions:
  Type                        Status
  PodReadyToStartContainers   True 
  Initialized                 True 
  Ready                       False 
  ContainersReady             False 
  PodScheduled                True 
Volumes:
  kube-api-access-6pshn:
    Type:                    Projected (a volume that contains injected data from multiple sources)
    TokenExpirationSeconds:  3607
    ConfigMapName:           kube-root-ca.crt
    ConfigMapOptional:       <nil>
    DownwardAPI:             true
QoS Class:                   BestEffort
Node-Selectors:              <none>
Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
Events:
  Type     Reason         Age    From               Message
  ----     ------         ----   ----               -------
  Normal   Scheduled      2m48s  default-scheduler  Successfully assigned default/nginx-gvisor to talos-default-worker-1
  Normal   Pulling        2m47s  kubelet            Pulling image "nginx"
  Normal   Pulled         2m24s  kubelet            Successfully pulled image "nginx" in 23.165s (23.165s including waiting). Image size: 71010466 bytes.
  Normal   Created        2m24s  kubelet            Created container nginx-gvisor
  Normal   Started        2m24s  kubelet            Started container nginx-gvisor
  Normal   Killing        2m22s  kubelet            Stopping container nginx-gvisor
  Warning  FailedKillPod  21s    kubelet            error killing pod: failed to "KillPodSandbox" for "1a9c79cf-ef28-4737-86e2-ca0b19238cfb" with KillPodSandboxError: "rpc error: code = DeadlineExceeded desc = context deadline exceeded"

Debug logs:
nginx.tar.gz

from gvisor.

frezbo avatar frezbo commented on August 28, 2024

Not sure why there are two container IDs in the runsc folder; kubelet clearly shows the ID as d540bfeb56dde5fcc725327c30cde596da988da6d485eaee4e25dbd857215def, but the runsc logs folder has both d540bfeb56dde5fcc725327c30cde596da988da6d485eaee4e25dbd857215def and 931f18f6afa6c9987818e93fa7ffcef8e22217631b2580b6232c9d06c6f09b48 (this was a fresh cluster with only the nginx pod launched with gVisor).

from gvisor.

avagin avatar avagin commented on August 28, 2024

931f18f6afa6c9987818e93fa7ffcef8e22217631b2580b6232c9d06c6f09b48 is the pod root container.
d540bfeb56dde5fcc725327c30cde596da988da6d485eaee4e25dbd857215def is the nginx container inside the pod.

In the shim log, I see that the root container was killed, but I don't see this signal in the boot log:

time="2024-06-24T15:05:28.157070305Z" level=debug msg="Kill, id: 931f18f6afa6c9987818e93fa7ffcef8e22217631b2580b6232c9d06c6f09b48, execID: , signal: 9, all: false"
time="2024-06-24T15:05:28.157096244Z" level=debug msg="Kill succeeded"
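When collecting debug logs, this kind of mismatch can be checked quickly with a grep over the shim and boot logs for the same container ID. A sketch using hypothetical stand-in lines in place of the real files (the filename and log lines below are illustrative, not from the actual capture):

```shell
# Hypothetical stand-in for the real shim debug log under /tmp/logs/.
cat > shim.log <<'EOF'
time="2024-06-24T15:05:28Z" level=debug msg="Kill, id: 931f18f6..., execID: , signal: 9, all: false"
time="2024-06-24T15:05:28Z" level=debug msg="Kill succeeded"
EOF

# Count Kill-related entries; on a real node, run the same grep against
# the boot log to see whether the signal ever reached the sandbox.
grep -c 'msg="Kill' shim.log
```

If the shim log shows the Kill but the boot log never records it, the signal was acknowledged by the shim without being delivered to the sandbox, which matches what is described above.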

from gvisor.

frezbo avatar frezbo commented on August 28, 2024

Let me know if you need any more logs or anything else to test out. I'll try testing with a different image to see if it's an issue with the nginx image.

from gvisor.

frezbo avatar frezbo commented on August 28, 2024

I've tested with other images and the issue still persists. Updating gvisor has no effect; the only difference could be that the host is running containerd v2.

from gvisor.
