Is there a way to cleanly halt execution when a large job is running? I am using this in a Jupyter notebook on macOS 10.15.4, and when I interrupt the cell (or one of the processes exits with an error), the cell shows as finished, but my CPU and memory are still being used. In fact, there appears to be a memory leak: even after quitting Jupyter and the Python process entirely, the memory usage of "kernel_task" (the process that showed high CPU during execution) does not drop.