This repository has been archived by the owner on Jul 12, 2022. It is now read-only.

Prevent external processes from hanging forever #1044

Merged
merged 1 commit into NuKeeperDotNet:master on Nov 21, 2020

Conversation

CrispyDrone
Contributor

✨ What kind of change does this PR introduce? (Bug fix, feature, docs update...)

Bug fix.

⤵️ What is the current behavior?

In certain cases, invoking an external process will result in NuKeeper hanging forever waiting for it to terminate. This can reliably be reproduced (in my case) when debugging NuKeeper. When invoking NuKeeper from the command line (without a debugger attached?), this does not happen.

💥 Does this PR introduce a breaking change?

No.

🐛 Recommendations for testing

Try debugging NuKeeper. For me this consistently hangs on the first restore-solution command. This also happens when running it from bash. It does not happen when I run it from the Windows cmd console (perhaps after invoking VsDevCmd.bat, since I've been doing that to prevent other problems).

📝 Links to relevant issues/docs

No relevant documentation, I think.

🤔 Checklist before submitting

  • All projects build
  • Relevant documentation was updated

It is important to read both stdout and stderr lest their buffers fill
up, and the process sits there waiting for eternity.

I don't know if this is specific to how a process interacts with its
console handles, or whether this is determined by the OS.
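
For illustration, a minimal sketch of the concurrent-read pattern described above. This is not the actual NuKeeper code; the "dotnet restore" command and the variable names are placeholders.

// Minimal sketch (assumed placeholders: "dotnet restore" stands in for whatever
// external command is actually invoked).
using System.Diagnostics;
using System.Threading.Tasks;

var startInfo = new ProcessStartInfo("dotnet", "restore")
{
    RedirectStandardOutput = true,
    RedirectStandardError = true,
    UseShellExecute = false,
    CreateNoWindow = true
};

using var process = Process.Start(startInfo);

// Start reading both streams before awaiting either one, so neither pipe
// can fill up while the parent is blocked on the other.
var stdOutTask = process.StandardOutput.ReadToEndAsync();
var stdErrTask = process.StandardError.ReadToEndAsync();

await Task.WhenAll(stdOutTask, stdErrTask);
process.WaitForExit();

var output = await stdOutTask;
var errors = await stdErrTask;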
@scp-mb

scp-mb commented Oct 8, 2020

Ran into this issue today when we had some unit test projects that failed to restore packages, so stdout was written to first, which would cause our NuKeeper CI pipeline to hang until the 1-hour timeout.

@skolima
Collaborator

skolima commented Nov 21, 2020

I've intermittently seen the error myself (though I never managed to reproduce it), but I'm confused how this change would fix it. Task.WhenAll awaits the tasks without regard to ordering, but it still requires both of them to finish. How does this resolve the hanging behaviour?

And could you please include a test for this change?

@Greybird
Contributor

Greybird commented Nov 21, 2020

I've intermittently seen the error myself (though I never managed to reproduce it), but I'm confused how this change would fix it. Task.WhenAll awaits the tasks without regard to ordering, but it still requires both of them to finish. How does this resolve the hanging behaviour?

I think the key thing might be that both tasks will be started before waiting for one to complete.

The previous code waited to read the output stream in full, then waited for the error stream to be read in full.
These streams have buffers of finite size, and the child process may have filled the error stream's buffer and be blocked waiting for room to write more.
Meanwhile, the parent process keeps waiting to finish reading the output stream (which never completes, because the blocked child could still write to it afterwards) before it starts reading the error stream.
The result is that both processes are deadlocked.

You can find some details about these caveats in the Microsoft docs for these streams, where in the last example the error stream is read using the asynchronous method (i.e. events) while the output is read synchronously (this synchronous/asynchronous distinction is not to be confused with async/await usage).
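
A minimal sketch of that sequential pattern, for contrast (illustrative only; the class and method names are made up, and this is not the previous NuKeeper code):

using System.Diagnostics;

internal static class SequentialReadExample
{
    // Deadlock-prone: reads stdout to the end before touching stderr. If the
    // child fills the stderr pipe it blocks on a write, while the parent is
    // still blocked reading stdout, so neither side makes progress.
    public static (string Output, string Errors) Run(ProcessStartInfo startInfo)
    {
        startInfo.RedirectStandardOutput = true;
        startInfo.RedirectStandardError = true;
        startInfo.UseShellExecute = false;

        using var process = Process.Start(startInfo);

        var output = process.StandardOutput.ReadToEnd(); // parent waits here for stdout EOF...
        var errors = process.StandardError.ReadToEnd();  // ...never reached once the stderr buffer fills
        process.WaitForExit();

        return (output, errors);
    }
}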

@skolima
Collaborator

skolima commented Nov 21, 2020

This (and the MSDN docs) makes sense now. Thanks.

skolima merged commit 59b39d5 into NuKeeperDotNet:master on Nov 21, 2020