
initial. first implementation of a map function (takes an async callback function and turns it into a readable & writable stream)
commit 970993ccc594c606d439a9fa22715d311e1b9fa2 0 parents
Dominic Tarr authored
Showing with 103 additions and 0 deletions.
  1. +3 −0  .gitignore
  2. +68 −0 index.js
  3. +11 −0 package.json
  4. +21 −0 readme.markdown
3  .gitignore
@@ -0,0 +1,3 @@
+node_modules
+node_modules/*
+npm-debug.log
68 index.js
@@ -0,0 +1,68 @@
+
+//create an event stream and apply function to each .write
+//emitting each response as data
+//unless it's an empty callback
+//
+
+var Stream = require('stream').Stream
+
+exports.map = function (mapper) {
+ var stream = new Stream()
+ , inputs = 0
+ , outputs = 0
+ , ended = false
+
+ stream.writable = true
+ stream.readable = true
+
+ stream.write = function () {
+ inputs ++
+ var args = [].slice.call(arguments)
+ , r
+ , inNext = false
+ function next (err) {
+ inNext = true
+ outputs ++
+ var args = [].slice.call(arguments)
+ if(err)
+ return inNext = false, stream.emit('error', err)
+
+ args.shift() //drop err
+
+ if (args.length){
+ args.unshift('data')
+ r = stream.emit.apply(stream, args)
+ }
+ if(inputs == outputs) {
+ stream.emit('drain') //written all the incoming events
+ if(ended)
+ stream.end()
+ }
+ inNext = false
+ }
+ args.push(next)
+
+ try {
+ //catch sync errors and handle them like async errors
+ return mapper.apply(null,args)
+ } catch (err) {
+ //if the callback has been called synchronously, and the error
+ //occurred in a listener, throw it again.
+ if(inNext)
+ throw err
+ next(err)
+ }
+ }
+
+ stream.end = function () {
+ var args = [].slice.call(arguments)
+ //if end was called with args, write it,
+ ended = true //write will emit 'end' if ended is true
+ if(args.length)
+ return stream.write.apply(stream, args)
+ else
+ stream.emit('end')
+ }
+
+ return stream
+}
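The heart of `index.js` is the callback contract between `stream.write` and the mapper: `write` appends a trailing `next(err, value)` to whatever arguments you wrote, and what `next` receives decides what the stream emits. A minimal, self-contained sketch of that contract (the `mapper` and `fakeNext` names are illustrative stand-ins, not part of the module):

```javascript
// Sketch of the async-callback contract a mapper must follow.
//   next(err)          -> the stream emits 'error'
//   next()             -> empty callback: nothing is emitted
//   next(null, value)  -> the stream emits 'data' with value
function mapper (data, next) {
  if (data === 0) return next()   // empty callback: drop zeros
  next(null, data * 10)
}

// Stand-in for what stream.write's internal `next` does with the result.
var emitted = []
function fakeNext (err) {
  if (err) return emitted.push(['error', err])
  var args = [].slice.call(arguments, 1) // drop err, as index.js does
  if (args.length) emitted.push(['data'].concat(args))
}

;[1, 0, 2].forEach(function (d) { mapper(d, fakeNext) })
// emitted → [['data', 10], ['data', 20]]
```

Note that the `0` input produces no entry at all, matching the "unless it's an empty callback" comment at the top of the file.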
11 package.json
@@ -0,0 +1,11 @@
+{ "name": "event-stream"
+, "version": "0.0.0"
+, "description": "node streams applied to arbitrary events"
+, "homepage": "http://github.com/dominictarr/event-stream"
+, "repository":
+ { "type": "git"
+ , "url": "https://github.com/dominictarr/event-stream.git" }
+, "dependencies": {}
+, "devDependencies": {}
+, "author": "Dominic Tarr <dominic.tarr@gmail.com> (http://bit.ly/dominictarr)"
+, "scripts": { "test": "meta-test test/*.js" } }
21 readme.markdown
@@ -0,0 +1,21 @@
+# EventStreams
+
+EventEmitters in node are a brilliant idea that is underutilized in the node community.
+
+They are at the heart of all the IO in node core: reading from a file, from TCP, or from the body
+of an http request or response is handled as a stream of events.
+
+A stream of events is a bit like an array, but an array laid out in time rather than in memory.
+
+You can apply a map function to an array and create a new array; you could apply a similar
+map function to a stream of events to create a new stream. Not just `map`, but also `filter`, `reduce`,
+and other functional programming idioms!
+
+Event streams are great because they have a naturally scalable API.
+If the events in a stream can be stringified and parsed, then it is relatively simple to split heavy
+parts of a stream into separate processes, and to incorporate middleware into the stream that might
+buffer, rate-limit, or parallelize critical aspects of your event stream.
+
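The stringify-and-parse requirement can be illustrated in a few lines, assuming events are plain JSON-serializable values (the `event` shape here is illustrative):

```javascript
// If an event survives a JSON round-trip, it can cross a pipe or
// socket to another process unchanged.
var event = { name: 'data', payload: [1, 2, 3] }
var wire = JSON.stringify(event)   // what one process writes
var restored = JSON.parse(wire)    // what the other process reads
// restored.payload → [1, 2, 3]
```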
+Supporting this sort of programming is the purpose of this library.
+
+An EventStream must: