"We should have some ways of connecting programs like garden hose--screw in another segment when it becomes necessary to massage data in another way. This is the way of IO also."
Streams come to us from the earliest days of unix and have proven themselves
over the decades as a dependable way to compose large systems out of small
components that do one thing well.
In unix, streams are implemented by the shell with `|` pipes.
In node, the built-in stream module
is used by the core libraries and can easily be used by user-land code as well.
Similar to unix, the node stream module's primary composition operator is called `.pipe()`.
Streams are useful because they restrict the implementation surface area to a single consistent interface. You can then easily plug the output of one stream into the input of another and use libraries that operate abstractly on streams to institute higher-level flow control.
Streams are an important component of small-program design and the unix philosophy, but there are many other important abstractions worth considering. Just remember that technical debt is the enemy, and seek the best abstractions for the problem at hand.
pause / resume / drain
These streams are built into node itself.
process.stdin is a readable stream containing the standard system input
stream for your program.
It is paused by default, but the first time you refer to it, `.resume()` will be
called implicitly on the next tick.
If process.stdin is a tty (check with `tty.isatty()`),
then input events will be line-buffered. You can turn off line-buffering with
`process.stdin.setRawMode(true)`, BUT the default handlers for key
combinations such as
`^D` will be removed.
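A minimal sketch of the implicit resume: this script never calls `.resume()`, but merely referring to process.stdin in the `.pipe()` call is enough to start the flow of data.

```javascript
// copy standard input to standard output; referring to process.stdin
// here is what causes .resume() to be called implicitly
process.stdin.pipe(process.stdout);
```

Assuming the script is saved as `passthrough.js` (a hypothetical name), you could run:

```
$ echo 'beep boop' | node passthrough.js
beep boop
```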
net.connect() returns a duplex stream that connects over tcp to a remote host.
You can start writing to the stream right away and the writes will be buffered
until the 'connect' event fires.
Use the JSONStream module to parse and stringify json data from streams.
If you need to pass a large json collection through a slow connection, or you have a json object that will populate slowly, this module lets you parse data incrementally as it arrives.