
segfaults/OOM when hitting a process with too many signals #356

nocode opened this Issue · 1 comment

2 participants


The following code behaves erratically when hitting a process
with a lot of signals. I've seen it eat several gigabytes of memory,
and I've also seen it segfault once or twice. Sometimes it runs fine;
I couldn't reproduce any behavior consistently, but runaway memory growth
seems to be the most common.

r, w = IO.pipe
pid = fork do
  sr, sw = IO.pipe
  trap(:TERM) { exit(0) }
  trap(:HUP)  { sw.write_nonblock('.') rescue nil }
  w.syswrite('.') # tell the parent the traps are installed
  loop { sr.readpartial(16384) }
end

p [ :child_ready, r.sysread(1) ]
100000000.times { Process.kill(:HUP, pid) }
Process.kill(:TERM, pid)
p Process.waitpid2(pid)
Rubinius member

Rework how signals are delivered. Closed by e4c92f8
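This sketch is not the Rubinius fix from that commit; it is a hypothetical illustration of the pattern the reproduction already hints at: a trap handler that only does a bounded, non-blocking write to a pipe, so signals coalesce in the fixed-size pipe buffer instead of growing memory per signal. The counting and shutdown marker are invented for the example.

```ruby
rd, wr = IO.pipe

Signal.trap(:HUP) do
  begin
    wr.write_nonblock('.') # at most one byte per delivered signal
  rescue IO::WaitWritable
    # pipe buffer full: a wakeup is already pending, so drop this one;
    # memory use stays bounded no matter how many signals arrive
  end
end

count = 0
drainer = Thread.new do
  loop do
    data = rd.readpartial(4096)  # drain coalesced notifications in bulk
    count += data.count('.')     # GVL makes this safe enough for a sketch
    break if data.include?('!')  # '!' marks shutdown
  end
end

100.times { Process.kill(:HUP, Process.pid) }
sleep 0.01 until count > 0 # wait for at least one wakeup to be drained
wr.write('!')              # tell the drainer to stop
drainer.join

puts count
```

Because the kernel and the pipe both coalesce, `count` can be anywhere from 1 to 100; the point is that the cost per signal is a single byte, not an allocation.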

This issue was closed.