segfaults/OOM when hitting a process with too many signals #356

Closed
nocode opened this Issue · 1 comment

2 participants

@nocode

The following code behaves erratically when hitting a process with
a lot of signals. I've seen it eat several gigabytes of memory,
and I've also seen it segfault once or twice. Sometimes it runs fine, too; I couldn't
get any behavior consistently, but runaway memory growth seems to be the most common.

# Handshake pipe shared with the parent, plus a pipe private to the child.
r, w = IO.pipe
pid = fork do
  sr, sw = IO.pipe
  trap(:TERM) { exit(0) }
  # Each HUP writes a byte into the child's own pipe; failures are ignored.
  trap(:HUP) { sw.write_nonblock('.') rescue nil }
  w.syswrite('.')                 # tell the parent we're ready
  loop { sr.readpartial(16384) }  # drain whatever the HUP handler wrote
end

p [ :child_ready, r.sysread(1) ]
# Flood the child with HUPs, then terminate it and reap it.
100000000.times { Process.kill(:HUP, pid) }
Process.kill(:TERM, pid)
p Process.waitpid2(pid)
@evanphx
Owner

Rework how signals are delivered. Closed by e4c92f8
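
For reference, the general shape of such a rework is to coalesce incoming signals before dispatching them, so a flood of identical signals cannot queue an unbounded amount of work. The sketch below only illustrates that idea under that assumption; CoalescingSignalDispatcher and every other name in it are hypothetical and not taken from e4c92f8 or Rubinius internals.

require 'thread'

# Hypothetical sketch: record at most one pending entry per signal and drain
# them from a dedicated dispatcher thread, so duplicate signals coalesce
# instead of growing a queue.
class CoalescingSignalDispatcher
  def initialize
    @handlers = {}
    @pending  = {}                 # signal name => true; one entry per signal at most
    @mutex    = Mutex.new
    @cond     = ConditionVariable.new
  end

  def handle(sig, &block)
    @handlers[sig] = block
  end

  # Called once per delivered signal; duplicates collapse into a single
  # pending entry, so a flood of HUPs uses bounded memory.
  def signal_received(sig)
    @mutex.synchronize do
      @pending[sig] = true
      @cond.signal
    end
  end

  # Dispatcher loop, intended for a dedicated thread: drain the pending set
  # and run each handler once per batch.
  def run
    loop do
      batch = @mutex.synchronize do
        @cond.wait(@mutex) while @pending.empty?
        @pending.keys.tap { @pending.clear }
      end
      batch.each { |sig| @handlers[sig].call if @handlers[sig] }
    end
  end
end

# Example: a million "received" HUPs dispatch as only a handful of handler runs.
dispatcher = CoalescingSignalDispatcher.new
dispatcher.handle(:HUP) { puts "HUP handled" }
Thread.new { dispatcher.run }
1_000_000.times { dispatcher.signal_received(:HUP) }
sleep 0.1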

This issue was closed.