pipefy

Simple and dependency-free node/io.js module to transform a function into a pipeable stream

pipefy returns a single buffer as the result of concatenating every chunk emitted by the readable input stream. This is usually enough for most cases, but for large amounts of data, handling huge buffers in memory may have negative performance side effects.

Installation

npm install pipefy --save

Example

var fs = require('fs')
var pipefy = require('pipefy')

Instead of doing this (note that I've used the sync API for simplicity):

function process(buf, path) {
  // mad science here...
  fs.writeFileSync(path, buf)
}

var data = fs.readFileSync('image.jpg')
process(data, 'new.jpg')

With pipefy you can do the same in a more idiomatic and efficient way:

function process(buf, path) {
  // mad science here...
  fs.writeFileSync(path, buf)
}

fs.createReadStream('image.jpg')
  .pipe(pipefy(process, 'new.jpg'))

API

pipefy(fn, [ args... ])

Returns a WritableStream

You can subscribe to stream events, such as error or finish, to handle the stream status.
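For instance, since pipe() returns the destination stream, you can attach handlers for the standard writable stream events right after piping. A minimal sketch based on the example above:

var fs = require('fs')
var pipefy = require('pipefy')

function process(buf, path) {
  fs.writeFileSync(path, buf)
}

fs.createReadStream('image.jpg')
  .pipe(pipefy(process, 'new.jpg'))
  .on('error', function (err) {
    // handle a write error
    console.error(err)
  })
  .on('finish', function () {
    // the input stream has been fully consumed
    console.log('done')
  })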

pipefy.Stream()

Writable stream implementation used internally by pipefy

See the implementation for hacking purposes
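Conceptually, that internal stream collects every chunk in memory and hands the concatenated buffer to your function once the input ends. The following is a simplified, hypothetical sketch of that idea, not the actual implementation:

var stream = require('stream')

function collect(fn) {
  var chunks = []
  var writable = new stream.Writable({
    write: function (chunk, encoding, callback) {
      // buffer each incoming chunk in memory
      chunks.push(chunk)
      callback()
    }
  })
  writable.on('finish', function () {
    // once the input ends, pass the whole concatenated buffer to the function
    fn(Buffer.concat(chunks))
  })
  return writable
}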

License

MIT - Tomas Aparicio
