improved doc tool

commit 509757bc4a083e4494cb6969c1644595be830cbe 1 parent e307835
bjouhier authored
431 API.md
@@ -1,415 +1,20 @@
-
-# streamline/lib/callbacks/transform
-
-Streamline's transformation engine
-
-* `transformed = transform.transform(source, options)`
- Transforms streamline source.
- The following `options` may be specified:
- * `tryCatch` controls exception handling
- * `lines` controls line mapping
- * `callback` alternative identifier if `_` is already used.
- * `noHelpers` disables generation of helper functions (`__cb`, etc.)
-
-# streamline built-ins
-
-## Asychronous versions of ES5 Array functions.
-
-Common Rules:
-
-These variants are postfixed by an underscore.
-They take the `_` callback as first parameter.
-They pass the `_` callback as first arguement to their `fn` callback.
-They have an optional `options` second parameter which controls the level of
-parallelism. This `options` parameter may be specified as `{ parallel: par }`
-where par is an integer, or directly as a `par` integer value.
-The `par` values are interpreted as follows:
-
-* If absent or equal to 1, execution is sequential.
-* If > 1, at most `par` operations are parallelized.
-* if 0, a default number of operations are parallelized.
- This default can be read and set with funnel.defaultSize (4 by default)
-* If < 0 or Infinity, operations are fully parallelized (no limit).
-
-API:
-
-* `array.forEach_(_[, options], fn[, thisObj])`
- `fn` is called as `fn(_, elt, i)`.
-* `result = array.map_(_[, options], fn[, thisObj])`
- `fn` is called as `fn(_, elt, i)`.
-* `result = array.filter_(_[, options], fn[, thisObj])`
- `fn` is called as `fn(_, elt)`.
-* `bool = array.every_(_[, options], fn[, thisObj])`
- `fn` is called as `fn(_, elt)`.
-* `bool = array.some_(_[, options], fn[, thisObj])`
- `fn` is called as `fn(_, elt)`.
-* `result = array.reduce_(_, array, fn, val)`
- `fn` is called as `val = fn(_, val, elt, i, array)`.
-* `result = flows.reduceRight(_, array, fn, val, [thisObj])`
- reduces from end to start by applying `fn` to each element.
- `fn` is called as `val = fn(_, val, elt, i, array)`.
-* `array = flows.sort(_, array, compare, [beg], [end])`
- `compare` is called as `cmp = compare(_, elt1, elt2)`
- Note: this function _changes_ the original array (and returns it)
-* `result = fn.apply_(_, thisObj, args, [index])`
- Helper to apply `Function.apply` to streamline functions.
- Equivalent to `result = fn.apply(thisObj, argsWith_)` where `argsWith_` is
- a modified argument list in which the callback has been inserted at `index`
- (at the end of the argument list if `index` is not specified).
-
-# streamline/lib/compiler/compile
-
-Streamline compiler and file loader
-
-* `script = compile.loadFile(_, path, options)`
- Loads Javascript file and transforms it if necessary.
- Returns the transformed source.
- If `path` is `foo_.js`, the source is transformed and the result
- is *not* saved to disk.
- If `path` is `foo.js` and if a `foo_.js` file exists,
- `foo_.js` is transformed if necessary and saved as `foo.js`.
- If `path` is `foo.js` and `foo_.js` does not exist, the contents
- of `foo.js` is returned.
- `options` is a set of options passed to the transformation engine.
- If `options.force` is set, `foo_.js` is transformed even if
- `foo.js` is more recent.
-* `script = compile.transformModule(path, options)`
- Synchronous version of `compile.loadFile`.
- Used by `require` logic.
-* `compile.compile(_, paths, options)`
- Compiles streamline source files in `paths`.
- Generates a `foo.js` file for each `foo._js` file found in `paths`.
- `paths` may be a list of files or a list of directories which
- will be traversed recursively.
- `options` is a set of options for the `transform` operation.
-
-# streamline/lib/globals
-
-The `streamline.lib.globals` is a container for the global `context` object which is maintained across
-asynchronous calls.
-
-This context is very handy to store information that all calls should be able to access
-but that you don't want to pass explicitly via function parameters. The most obvious example is
-the `locale` that each request may set differently and that your low level libraries should
-be able to retrieve to format messages.
-
-* `globals.context = ctx`
-* `ctx = globals.context`
- sets and gets the context
-
-
# streamline
-
-Streamline main API
-
-* `command.run()`
- runs `node-streamline` command line analyzer / dispatcher
-* `register.register(options)`
- Registers `require` handlers for streamline.
- `options` is a set of default options passed to the `transform` function.
-
-# streamline/lib/streams/server/streams
-
-Server Streams module
-
-The `streams` module contains _pull mode_ wrappers around node streams.
-
-These wrappers implement a _pull style_ API.
-Instead of having the stream _push_ the data to its consumer by emitting `data` and `end` events,
-these wrappers let the consumer _pull_ the data from the stream by calling asynchronous `read` methods.
-
-For a bit more background on this design,
-you can read [this blog post](http://bjouhier.wordpress.com/2011/04/25/asynchronous-episode-3-adventures-in-event-land/)
-
-For a simple example of this API in action,
-see the [google client example](./examples/googleClient_.js)
-
-## Emitter
-
-Base wrapper for all objects that emit an `end` or `close` event.
-All stream wrappers derive from this wrapper.
-
-* `wrapper = new streams.Emitter(stream)`
- creates a wrapper.
-* `emitter = wrapper.emitter`
- returns the underlying emitter. The emitter stream can be used to attach additional observers.
-* `closed = wrapper.closed`
- returns true if the `close` event has been received.
-* `emitter = wrapper.unwrap()`
- unwraps and returns the underlying emitter.
- The wrapper should not be used after this call.
-
-## ReadableStream
-
-All readable stream wrappers derive from this wrapper.
-
-* `stream = new streams.ReadableStream(stream, [options])`
- creates a readable stream wrapper.
-* `stream.setEncoding(enc)`
- sets the encoding.
- returns `this` for chaining.
-* `data = stream.read(_, [len])`
- reads asynchronously from the stream and returns a `string` or a `Buffer` depending on the encoding.
- If a `len` argument is passed, the `read` call returns when `len` characters or bytes
- (depending on encoding) have been read, or when the underlying stream has emitted its `end` event.
- Without `len`, the read calls returns the data chunks as they have been emitted by the underlying stream.
- Once the end of stream has been reached, the `read` call returns `null`.
-* `data = stream.readAll(_)`
- reads till the end of stream.
- Equivalent to `stream.read(_, -1)`.
-* `stream.unread(chunk)`
- pushes the chunk back to the stream.
- returns `this` for chaining.
-
-## WritableStream
-
-All writable stream wrappers derive from this wrapper.
-
-* `stream = new streams.WritableStream(stream, [options])`
- creates a writable stream wrapper.
-* `stream.write(_, data, [enc])`
- Writes the data.
- This operation is asynchronous because it _drains_ the stream if necessary.
- If you have a lot of small write operations to perform and you don't want the overhead of draining at every step,
- you can write to the underlying stream with `stream.emitter.write(data)` most of the time
- and call `stream.write(_, data)` once in a while to drain.
- Returns `this` for chaining.
-* `stream.end()`
- signals the end of the send operation.
- Returns `this` for chaining.
-
-## HttpServerRequest
-
-This is a wrapper around node's `http.ServerRequest`:
-This stream is readable (see Readable Stream above).
-
-* `request = new streams.HttpServerRequest(req, [options])`
- returns a wrapper around `req`, an `http.ServerRequest` object.
- The `options` parameter can be used to pass `lowMark` and `highMark` values.
-* `method = request.method`
-* `url = request.url`
-* `headers = request.headers`
-* `trailers = request.trailers`
-* `httpVersion = request.httpVersion`
-* `connection = request.connection`
-* `socket = request.socket`
- (same as `http.ServerRequest`)
-
-## HttpServerResponse
-
-This is a wrapper around node's `http.ServerResponse`.
-This stream is writable (see Writable Stream above).
-
-* `response = new streams.HttpServerResponse(resp, [options])`
- returns a wrapper around `resp`, an `http.ServerResponse` object.
-* `response.writeContinue()`
-* `response.writeHead(head)`
-* `response.setHeader(name, value)`
-* `value = response.getHeader(head)`
-* `response.removeHeader(name)`
-* `response.addTrailers(trailers)`
-* `response.statusCode = value`
- (same as `http.ServerResponse`)
-
-## HttpServer
-
-This is a wrapper around node's `http.Server` object:
-
-* `server = streams.createHttpServer(requestListener, [options])`
- creates the wrapper.
- `requestListener` is called as `requestListener(request, response, _)`
- where `request` and `response` are wrappers around `http.ServerRequest` and `http.ServerResponse`.
-* `server.listen(_, port, [host])`
-* `server.listen(_, path)`
- (same as `http.Server`)
-
-## HttpClientResponse
-
-This is a wrapper around node's `http.ClientResponse`
-
-This stream is readable (see Readable Stream above).
-
-* `response = request.response(_)`
- returns the response stream.
-* `status = response.statusCode`
- returns the HTTP status code.
-* `version = response.httpVersion`
- returns the HTTP version.
-* `headers = response.headers`
- returns the HTTP response headers.
-* `trailers = response.trailers`
- returns the HTTP response trailers.
-* `response.checkStatus(statuses)`
- throws an error if the status is not in the `statuses` array.
- If only one status is expected, it may be passed directly as an integer rather than as an array.
- Returns `this` for chaining.
-
-## HttpClientRequest
-
-This is a wrapper around node's `http.ClientRequest`.
-
-This stream is writable (see Writable Stream above).
-
-* `request = streams.httpRequest(options)`
- creates the wrapper.
- The options are the following:
- * `method`: the HTTP method, `'GET'` by default.
- * `headers`: the HTTP headers.
- * `url`: the requested URL (with query string if necessary).
- * `proxy.url`: the proxy URL.
- * `lowMark` and `highMark`: low and high water mark values for buffering (in bytes or characters depending
- on encoding).
- Note that these values are only hints as the data is received in chunks.
-* `response = request.response(_)`
- returns the response.
-* `request.abort()`
- aborts the request.
-
-## NetStream
-
-This is a wrapper around streams returned by TCP and socket clients:
-
-These streams are both readable and writable (see Readable Stream and Writable Stream above).
-
-* `stream = new streams.NetStream(stream, [options])`
- creates a network stream wrapper.
-
-## TCP and Socket clients
-
-These are wrappers around node's `net.createConnection`:
-
-* `client = streams.tcpClient(port, host, [options])`
- returns a TCP connection client.
-* `client = streams.socketClient(path, [options])`
- returns a socket client.
- The `options` parameter of the constructor provide options for the stream (`lowMark` and `highMark`).
- If you want different options for `read` and `write` operations, you can specify them by creating `options.read` and `options.write` sub-objects inside `options`.
-* `stream = client.connect(_)`
- connects the client and returns a network stream.
-
-## NetServer
-
-This is a wrapper around node's `net.Server` object:
-
-* `server = streams.createNetServer([serverOptions,] connectionListener [, streamOptions])`
- creates the wrapper.
- `connectionListener` is called as `connectionListener(stream, _)`
- where `stream` is a `NetStream` wrapper around the native connection.
-* `server.listen(_, port, [host])`
-* `server.listen(_, path)`
- (same as `net.Server`)
-
-## try/finally wrappers and pump
-
-* `result = streams.using(_, constructor, stream, [options,] fn)`
- wraps `stream` with an instance of `constructor`;
- passes the wrapper to `fn(_, wrapped)` and closes the stream after `fn` returns.
- `fn` is called inside a `try/finally` block to guarantee that the stream is closed in all cases.
- Returns the value returned by `fn`.
-* `result = streams.usingReadable(_, stream, [options,] fn)`
- shortcut for `streams.using(_, streams.ReadableStream, stream, options, fn)`
-* `result = streams.usingWritable(_, stream, [options,] fn)`
- shortcut for `streams.using(_, streams.WritableStream, stream, options, fn)`
-* `streams.pump(_, inStream, outStream)`
- Pumps from `inStream` to `outStream`.
- Does not close the streams at the end.
-
-# streamline/lib/tools/docTool
-
-Documentation tool
-
-Usage:
-
- node streamline/lib/tools/docTool [path]
-
-Extracts documentation comments from `.js` files and generates `API.md` file
-under package root.
-
-Top of source file must contain `/// !doc` marker to enable doc extraction.
-Documentation comments must start with `/// ` (with 1 trailing space).
-Extraction can be turned off with `/// !nodoc` and turned back on with `/// !doc`.
-
-The tool can also be invoked programatically with:
-
-* `doc = docTool.generate(_, path)`
- extracts documentation comments from file `path`
-
-# streamline/lib/util/flows
-
-Flows Module
-
-The `streamline/lib/util/flows` module contains some handy utilities for streamline code
-
-## Array utilities (obsolete)
-
-This API is obsolete. Use `array.forEach_`, `array.map_`, ... instead.
-
-* `flows.each(_, array, fn, [thisObj])`
- applies `fn` sequentially to the elements of `array`.
- `fn` is called as `fn(_, elt, i)`.
-* `result = flows.map(_, array, fn, [thisObj])`
- transforms `array` by applying `fn` to each element in turn.
- `fn` is called as `fn(_, elt, i)`.
-* `result = flows.filter(_, array, fn, [thisObj])`
- generates a new array that only contains the elements that satisfy the `fn` predicate.
- `fn` is called as `fn(_, elt)`.
-* `bool = flows.every(_, array, fn, [thisObj])`
- returns true if `fn` is true on every element (if `array` is empty too).
- `fn` is called as `fn(_, elt)`.
-* `bool = flows.some(_, array, fn, [thisObj])`
- returns true if `fn` is true for at least one element.
- `fn` is called as `fn(_, elt)`.
-* `result = flows.reduce(_, array, fn, val, [thisObj])`
- reduces by applying `fn` to each element.
- `fn` is called as `val = fn(_, val, elt, i, array)`.
-* `result = flows.reduceRight(_, array, fn, val, [thisObj])`
- reduces from end to start by applying `fn` to each element.
- `fn` is called as `val = fn(_, val, elt, i, array)`.
-* `array = flows.sort(_, array, compare, [beg], [end])`
- sorts the array.
- `compare` is called as `cmp = compare(_, elt1, elt2)`
-
- Note: this function _changes_ the original array (and returns it)
-
-## Object utility (obsolete)
-
-This API is obsolete. Use `Object.keys(obj).forEach_` instead.
-
-* `flows.eachKey(_, obj, fn)`
- calls `fn(_, key, obj[key])` for every `key` in `obj`.
-
-## Workflow Utilities
-
-* `fun = flows.funnel(max)`
- limits the number of concurrent executions of a given code block.
-
-The `funnel` function is typically used with the following pattern:
-
-``` javascript
-// somewhere
-var myFunnel = flows.funnel(10); // create a funnel that only allows 10 concurrent executions.
-
-// elsewhere
-myFunnel(_, function(_) { /* code with at most 10 concurrent executions */ });
-```
-
-The `diskUsage2.js` example demonstrates how these calls can be combined to control concurrent execution.
-
-The `funnel` function can also be used to implement critical sections. Just set funnel's `max` parameter to 1.
-
-The funnel can be closed with `fun.close()`.
-When a funnel is closed, the operations that are still in the funnel will continue but their callbacks
-won't be called, and no other operation will enter the funnel.
-
-* `results = flows.collect(_, futures)`
- collects the results of an array of futures
-
-## Miscellaneous
-
-Utility functions:
-* `flows.nextTick(_)`
- `nextTick` function for both browser and server.
- Aliased to `process.nextTick` on the server side.
-* `result = flows.apply(_, fn, thisObj, args, [index])`
- Obsolete. Use `fn.apply_` instead.
+Asynchronous Javascript for dummies
+
+* [streamline/index](index.md)
+ Main API
+* [streamline/lib/callbacks/transform](lib/callbacks/transform.md)
+ Transformation engine (callback mode)
+* [streamline/lib/compiler/builtins](lib/compiler/builtins.md)
+ Streamline built-ins
+* [streamline/lib/compiler/compile](lib/compiler/compile.md)
+ Compiler and file loader
+* [streamline/lib/globals](lib/globals.md)
+ Container for global context
+* [streamline/lib/streams/server/streams](lib/streams/server/streams.md)
+ Wrappers for node.js streams
+* [streamline/lib/tools/docTool](lib/tools/docTool.md)
+ Documentation tool
+* [streamline/lib/util/flows](lib/util/flows.md)
+ Flow control utilities
13 index.js
@@ -0,0 +1,13 @@
+/// !doc
+///
+/// # Main API
+///
+/// `var streamline = require('streamline');`
+///
+/// * `streamline.run()`
+/// runs `_node` command line analyzer / dispatcher
+exports.run = require("./lib/compiler/command").run;
+/// * `streamline.register(options)`
+/// Registers `require` handlers for streamline.
+/// `options` is a set of default options passed to the transformation.
+exports.register = require("./lib/compiler/register").register;
10 index.md
@@ -0,0 +1,10 @@
+
+# Main API
+
+`var streamline = require('streamline');`
+
+* `streamline.run()`
+ runs `_node` command line analyzer / dispatcher
+* `streamline.register(options)`
+ Registers `require` handlers for streamline.
+ `options` is a set of default options passed to the transformation.
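As a quick illustration of the main API documented above, here is a minimal sketch of a plain-JavaScript loader that registers the `require` handlers before pulling in streamline source; the `loader.js` and `./server` names are illustrative, not part of this commit.

``` javascript
// loader.js -- plain node script, no transformation needed here.
var streamline = require('streamline');

// Register the require() hooks; the options object is forwarded as
// default options to the transformation engine (left empty here).
streamline.register({});

// From this point on, streamline source modules can be required directly.
require('./server');
```
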
2  lib/callbacks/builtins.js
@@ -279,6 +279,8 @@
+
+
Function.prototype.apply_ = function(callback, thisObj, args, index) {
Array.prototype.splice.call(args, ((index != null) ? index : args.length), 0, callback);
return this.apply(thisObj, args); };
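To make the `apply_` helper concrete, here is a minimal sketch of how the callback gets spliced into the argument list, shown with a plain node-style function for simplicity (it assumes the callbacks-mode builtins above have been loaded, e.g. via `require('streamline/lib/callbacks/builtins')`).

``` javascript
// Node-style async function: callback last.
function add(a, b, cb) { cb(null, a + b); }

// add.apply_(cb, null, [1, 2]) splices cb at the end of the args
// (index defaults to args.length), so it is equivalent to
// add.apply(null, [1, 2, cb]).
add.apply_(function(err, sum) {
  if (err) throw err;
  console.log(sum); // 3
}, null, [1, 2]);

// With an explicit index the callback lands at that position instead:
// fn.apply_(cb, thisObj, [a, b], 0) is equivalent to fn.apply(thisObj, [cb, a, b]).
```
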
4 lib/callbacks/transform.js
@@ -24,9 +24,7 @@
*/
/// !doc
///
-/// # streamline/lib/callbacks/transform
-///
-/// Streamline's transformation engine
+/// # Transformation engine (callback mode)
///
if (typeof exports !== 'undefined') {
var Narcissus = require('../../deps/narcissus');
10 lib/callbacks/transform.md
@@ -0,0 +1,10 @@
+
+# Transformation engine (callback mode)
+
+* `transformed = transform.transform(source, options)`
+ Transforms streamline source.
+ The following `options` may be specified:
+ * `tryCatch` controls exception handling
+ * `lines` controls line mapping
+ * `callback` alternative identifier if `_` is already used.
+ * `noHelpers` disables generation of helper functions (`__cb`, etc.)
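A minimal sketch of driving the transformation engine directly; the `source` string is illustrative and the options object is left empty (the recognized options are the ones listed above).

``` javascript
var transform = require('streamline/lib/callbacks/transform');

// A tiny piece of streamline source: `_` marks the asynchronous callback slot.
var source =
  "function readText(path, _) {" +
  "  return require('fs').readFile(path, 'utf8', _);" +
  "}";

// Returns plain callback-style JavaScript (including the __cb helpers,
// unless noHelpers is set).
var transformed = transform.transform(source, {});
console.log(transformed);
```
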
8 lib/compiler/builtins._js
@@ -4,7 +4,7 @@
*/
/// !doc
///
-/// # streamline built-ins
+/// # Streamline built-ins
///
(function(exports) {
"use strict";
@@ -82,7 +82,7 @@
/// These variants are postfixed by an underscore.
/// They take the `_` callback as first parameter.
/// They pass the `_` callback as first arguement to their `fn` callback.
- /// They have an optional `options` second parameter which controls the level of
+ /// Most of them have an optional `options` second parameter which controls the level of
/// parallelism. This `options` parameter may be specified as `{ parallel: par }`
/// where par is an integer, or directly as a `par` integer value.
/// The `par` values are interpreted as follows:
@@ -274,8 +274,10 @@
return array;
}
+ /// ## Asynchronous versions of ES5 Function functions.
+ ///
/// * `result = fn.apply_(_, thisObj, args, [index])`
- /// Helper to apply `Function.apply` to streamline functions.
+ /// Helper to use `Function.prototype.apply` with streamline functions.
/// Equivalent to `result = fn.apply(thisObj, argsWith_)` where `argsWith_` is
/// a modified argument list in which the callback has been inserted at `index`
/// (at the end of the argument list if `index` is not specified).
48 lib/compiler/builtins.md
@@ -0,0 +1,48 @@
+
+# Streamline built-ins
+
+## Asynchronous versions of ES5 Array functions.
+
+Common Rules:
+
+These variants are postfixed by an underscore.
+They take the `_` callback as first parameter.
+They pass the `_` callback as first argument to their `fn` callback.
+Most of them have an optional `options` second parameter which controls the level of
+parallelism. This `options` parameter may be specified as `{ parallel: par }`
+where par is an integer, or directly as a `par` integer value.
+The `par` values are interpreted as follows:
+
+* If absent or equal to 1, execution is sequential.
+* If > 1, at most `par` operations are parallelized.
+* if 0, a default number of operations are parallelized.
+ This default can be read and set with funnel.defaultSize (4 by default)
+* If < 0 or Infinity, operations are fully parallelized (no limit).
+
+API:
+
+* `array.forEach_(_[, options], fn[, thisObj])`
+ `fn` is called as `fn(_, elt, i)`.
+* `result = array.map_(_[, options], fn[, thisObj])`
+ `fn` is called as `fn(_, elt, i)`.
+* `result = array.filter_(_[, options], fn[, thisObj])`
+ `fn` is called as `fn(_, elt)`.
+* `bool = array.every_(_[, options], fn[, thisObj])`
+ `fn` is called as `fn(_, elt)`.
+* `bool = array.some_(_[, options], fn[, thisObj])`
+ `fn` is called as `fn(_, elt)`.
+* `result = array.reduce_(_, array, fn, val)`
+ `fn` is called as `val = fn(_, val, elt, i, array)`.
+* `result = flows.reduceRight(_, array, fn, val, [thisObj])`
+ reduces from end to start by applying `fn` to each element.
+ `fn` is called as `val = fn(_, val, elt, i, array)`.
+* `array = flows.sort(_, array, compare, [beg], [end])`
+ `compare` is called as `cmp = compare(_, elt1, elt2)`
+ Note: this function _changes_ the original array (and returns it)
+## Asynchronous versions of ES5 Function functions.
+
+* `result = fn.apply_(_, thisObj, args, [index])`
+ Helper to use `Function.prototype.apply` with streamline functions.
+ Equivalent to `result = fn.apply(thisObj, argsWith_)` where `argsWith_` is
+ a modified argument list in which the callback has been inserted at `index`
+ (at the end of the argument list if `index` is not specified).
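For readers new to these built-ins, a minimal sketch in streamline syntax (a `_js` module); the `fetchUrl` helper and the parallelism value are illustrative, not part of this commit.

``` javascript
// Download a list of URLs, at most 4 at a time, and keep only the
// non-empty bodies.
function fetchAll(_, urls) {
  var bodies = urls.map_(_, { parallel: 4 }, function(_, url) {
    return fetchUrl(_, url); // any streamline-style async call
  });
  return bodies.filter_(_, function(_, body) {
    return body && body.length > 0;
  });
}
```
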
12 lib/compiler/compile._js
@@ -2,9 +2,9 @@
/// !doc
///
-/// # streamline/lib/compiler/compile
+/// # Compiler and file loader
///
-/// Streamline compiler and file loader
+/// `var compiler = require('streamline/lib/compiler/compile')`
///
var fs = require("fs");
var fspath = require("path");
@@ -112,7 +112,7 @@ exports.compileFile = function(_, js_, options) {
}
var streamlineRE = /require\s*\(\s*['"]streamline\/module['"]\s*\)\s*\(\s*module\s*,?\s*([^)]*)?\s*\)/;
-/// * `script = compile.loadFile(_, path, options)`
+/// * `script = compiler.loadFile(_, path, options)`
/// Loads Javascript file and transforms it if necessary.
/// Returns the transformed source.
/// If `path` is `foo_.js`, the source is transformed and the result
@@ -205,8 +205,8 @@ function mtimeSync(fname) {
}
}
-/// * `script = compile.transformModule(path, options)`
-/// Synchronous version of `compile.loadFile`.
+/// * `script = compiler.transformModule(path, options)`
+/// Synchronous version of `compiler.loadFile`.
/// Used by `require` logic.
exports.transformModule = function(content, path, options) {
options = _extend({}, options || {});
@@ -351,7 +351,7 @@ exports.cachedTransformSync = function(content, path, transform, options) {
var banner = _banner(transform.version);
return cachedTransformSync(content, path, { transform: transform }, banner, options);
}
-/// * `compile.compile(_, paths, options)`
+/// * `compiler.compile(_, paths, options)`
/// Compiles streamline source files in `paths`.
/// Generates a `foo.js` file for each `foo._js` file found in `paths`.
/// `paths` may be a list of files or a list of directories which
26 lib/compiler/compile.md
@@ -0,0 +1,26 @@
+
+# Compiler and file loader
+
+`var compiler = require('streamline/lib/compiler/compile')`
+
+* `script = compiler.loadFile(_, path, options)`
+  Loads a JavaScript file and transforms it if necessary.
+ Returns the transformed source.
+ If `path` is `foo_.js`, the source is transformed and the result
+ is *not* saved to disk.
+ If `path` is `foo.js` and if a `foo_.js` file exists,
+ `foo_.js` is transformed if necessary and saved as `foo.js`.
+ If `path` is `foo.js` and `foo_.js` does not exist, the contents
+ of `foo.js` is returned.
+ `options` is a set of options passed to the transformation engine.
+ If `options.force` is set, `foo_.js` is transformed even if
+ `foo.js` is more recent.
+* `script = compiler.transformModule(path, options)`
+ Synchronous version of `compiler.loadFile`.
+ Used by `require` logic.
+* `compiler.compile(_, paths, options)`
+ Compiles streamline source files in `paths`.
+ Generates a `foo.js` file for each `foo._js` file found in `paths`.
+ `paths` may be a list of files or a list of directories which
+ will be traversed recursively.
+ `options` is a set of options for the `transform` operation.
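A minimal sketch of a build script, written in streamline syntax, that drives the compiler documented above; the `build._js` name and the `lib` path are illustrative.

``` javascript
// build._js -- compile every foo._js found under ./lib into a plain foo.js.
var compiler = require('streamline/lib/compiler/compile');

function build(_) {
  // `paths` may mix files and directories; directories are walked recursively.
  compiler.compile(_, ['lib'], {});
}

module.exports = build;
```
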
8 lib/fibers/builtins.js
@@ -4,7 +4,7 @@
*/
///
-/// # streamline built-ins
+/// # Streamline built-ins
///
(function(exports) {
"use strict";
@@ -82,7 +82,7 @@
/// These variants are postfixed by an underscore.
/// They take the `_` callback as first parameter.
/// They pass the `_` callback as first arguement to their `fn` callback.
- /// They have an optional `options` second parameter which controls the level of
+ /// Most of them have an optional `options` second parameter which controls the level of
/// parallelism. This `options` parameter may be specified as `{ parallel: par }`
/// where par is an integer, or directly as a `par` integer value.
/// The `par` values are interpreted as follows:
@@ -274,8 +274,10 @@
return array;
}, 0)
+ /// ## Asynchronous versions of ES5 Function functions.
+ ///
/// * `result = fn.apply_(_, thisObj, args, [index])`
- /// Helper to apply `Function.apply` to streamline functions.
+ /// Helper to use `Function.prototype.apply` with streamline functions.
/// Equivalent to `result = fn.apply(thisObj, argsWith_)` where `argsWith_` is
/// a modified argument list in which the callback has been inserted at `index`
/// (at the end of the argument list if `index` is not specified).
2  lib/globals.js
@@ -1,6 +1,6 @@
/// !doc
///
-/// # streamline/lib/globals
+/// # Container for global context
///
/// The `streamline.lib.globals` is a container for the global `context` object which is maintained across
/// asynchronous calls.
15 lib/globals.md
@@ -0,0 +1,15 @@
+
+# Container for global context
+
+The `streamline.lib.globals` module is a container for the global `context` object which is maintained across
+asynchronous calls.
+
+This context is very handy to store information that all calls should be able to access
+but that you don't want to pass explicitly via function parameters. The most obvious example is
+the `locale` that each request may set differently and that your low level libraries should
+be able to retrieve to format messages.
+
+* `globals.context = ctx`
+* `ctx = globals.context`
+ sets and gets the context
+
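A minimal sketch of the locale use case described above, in streamline syntax; the request object, the `loadMessage` helper and the `messages` directory layout are hypothetical.

``` javascript
var globals = require('streamline/lib/globals');

// Near the top of the request pipeline: stash the locale for this chain of calls.
function handleRequest(req, _) {
  globals.context = { locale: req.headers['accept-language'] || 'en' };
  return loadMessage(_, 'greeting'); // async call; the context survives it
}

// Deep inside a library, with no locale parameter threaded through:
function loadMessage(_, key) {
  var locale = (globals.context || {}).locale || 'en';
  return require('fs').readFile('messages/' + locale + '/' + key + '.txt', 'utf8', _);
}
```
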
14 lib/index.js
@@ -1,13 +1 @@
-/// !doc
-///
-/// # streamline
-///
-/// Streamline main API
-///
-/// * `command.run()`
-/// runs `node-streamline` command line analyzer / dispatcher
-exports.run = require("./compiler/command").run;
-/// * `register.register(options)`
-/// Registers `require` handlers for streamline.
-/// `options` is a set of default options passed to the `transform` function.
-exports.register = require("./compiler/register").register;
+module.exports = require('..');
5 lib/streams/client/streams._js
@@ -5,9 +5,8 @@
"use strict";
/// !nodoc
///
-/// # streamline/lib/streams/client/streams
-///
-/// Client Streams module
+/// # Client Streams module
+///
/// The `streams` module contains _pull mode_ wrappers around AJAX streams.
///
// TODO: Client streams only deal with strings for now
4 lib/streams/server/streams._js
@@ -6,9 +6,7 @@
/// !doc
///
-/// # streamline/lib/streams/server/streams
-///
-/// Server Streams module
+/// # Wrappers for node.js streams
///
/// The `streams` module contains _pull mode_ wrappers around node streams.
///
204 lib/streams/server/streams.md
@@ -0,0 +1,204 @@
+
+# Wrappers for node.js streams
+
+The `streams` module contains _pull mode_ wrappers around node streams.
+
+These wrappers implement a _pull style_ API.
+Instead of having the stream _push_ the data to its consumer by emitting `data` and `end` events,
+these wrappers let the consumer _pull_ the data from the stream by calling asynchronous `read` methods.
+
+For a bit more background on this design,
+you can read [this blog post](http://bjouhier.wordpress.com/2011/04/25/asynchronous-episode-3-adventures-in-event-land/)
+
+For a simple example of this API in action,
+see the [google client example](./examples/googleClient_.js)
+
+## Emitter
+
+Base wrapper for all objects that emit an `end` or `close` event.
+All stream wrappers derive from this wrapper.
+
+* `wrapper = new streams.Emitter(stream)`
+ creates a wrapper.
+* `emitter = wrapper.emitter`
+ returns the underlying emitter. The emitter stream can be used to attach additional observers.
+* `closed = wrapper.closed`
+ returns true if the `close` event has been received.
+* `emitter = wrapper.unwrap()`
+ unwraps and returns the underlying emitter.
+ The wrapper should not be used after this call.
+
+## ReadableStream
+
+All readable stream wrappers derive from this wrapper.
+
+* `stream = new streams.ReadableStream(stream, [options])`
+ creates a readable stream wrapper.
+* `stream.setEncoding(enc)`
+ sets the encoding.
+ returns `this` for chaining.
+* `data = stream.read(_, [len])`
+ reads asynchronously from the stream and returns a `string` or a `Buffer` depending on the encoding.
+ If a `len` argument is passed, the `read` call returns when `len` characters or bytes
+ (depending on encoding) have been read, or when the underlying stream has emitted its `end` event.
+ Without `len`, the read call returns the data chunks as they have been emitted by the underlying stream.
+ Once the end of stream has been reached, the `read` call returns `null`.
+* `data = stream.readAll(_)`
+ reads till the end of stream.
+ Equivalent to `stream.read(_, -1)`.
+* `stream.unread(chunk)`
+ pushes the chunk back to the stream.
+ returns `this` for chaining.
+
+## WritableStream
+
+All writable stream wrappers derive from this wrapper.
+
+* `stream = new streams.WritableStream(stream, [options])`
+ creates a writable stream wrapper.
+* `stream.write(_, data, [enc])`
+ Writes the data.
+ This operation is asynchronous because it _drains_ the stream if necessary.
+ If you have a lot of small write operations to perform and you don't want the overhead of draining at every step,
+ you can write to the underlying stream with `stream.emitter.write(data)` most of the time
+ and call `stream.write(_, data)` once in a while to drain.
+ Returns `this` for chaining.
+* `stream.end()`
+ signals the end of the send operation.
+ Returns `this` for chaining.
+
+## HttpServerRequest
+
+This is a wrapper around node's `http.ServerRequest`.
+This stream is readable (see Readable Stream above).
+
+* `request = new streams.HttpServerRequest(req, [options])`
+ returns a wrapper around `req`, an `http.ServerRequest` object.
+ The `options` parameter can be used to pass `lowMark` and `highMark` values.
+* `method = request.method`
+* `url = request.url`
+* `headers = request.headers`
+* `trailers = request.trailers`
+* `httpVersion = request.httpVersion`
+* `connection = request.connection`
+* `socket = request.socket`
+ (same as `http.ServerRequest`)
+
+## HttpServerResponse
+
+This is a wrapper around node's `http.ServerResponse`.
+This stream is writable (see Writable Stream above).
+
+* `response = new streams.HttpServerResponse(resp, [options])`
+ returns a wrapper around `resp`, an `http.ServerResponse` object.
+* `response.writeContinue()`
+* `response.writeHead(head)`
+* `response.setHeader(name, value)`
+* `value = response.getHeader(head)`
+* `response.removeHeader(name)`
+* `response.addTrailers(trailers)`
+* `response.statusCode = value`
+ (same as `http.ServerResponse`)
+
+## HttpServer
+
+This is a wrapper around node's `http.Server` object:
+
+* `server = streams.createHttpServer(requestListener, [options])`
+ creates the wrapper.
+ `requestListener` is called as `requestListener(request, response, _)`
+ where `request` and `response` are wrappers around `http.ServerRequest` and `http.ServerResponse`.
+* `server.listen(_, port, [host])`
+* `server.listen(_, path)`
+ (same as `http.Server`)
+
+## HttpClientResponse
+
+This is a wrapper around node's `http.ClientResponse`
+
+This stream is readable (see Readable Stream above).
+
+* `response = request.response(_)`
+ returns the response stream.
+* `status = response.statusCode`
+ returns the HTTP status code.
+* `version = response.httpVersion`
+ returns the HTTP version.
+* `headers = response.headers`
+ returns the HTTP response headers.
+* `trailers = response.trailers`
+ returns the HTTP response trailers.
+* `response.checkStatus(statuses)`
+ throws an error if the status is not in the `statuses` array.
+ If only one status is expected, it may be passed directly as an integer rather than as an array.
+ Returns `this` for chaining.
+
+## HttpClientRequest
+
+This is a wrapper around node's `http.ClientRequest`.
+
+This stream is writable (see Writable Stream above).
+
+* `request = streams.httpRequest(options)`
+ creates the wrapper.
+ The options are the following:
+ * `method`: the HTTP method, `'GET'` by default.
+ * `headers`: the HTTP headers.
+ * `url`: the requested URL (with query string if necessary).
+ * `proxy.url`: the proxy URL.
+ * `lowMark` and `highMark`: low and high water mark values for buffering (in bytes or characters depending
+ on encoding).
+ Note that these values are only hints as the data is received in chunks.
+* `response = request.response(_)`
+ returns the response.
+* `request.abort()`
+ aborts the request.
+
+## NetStream
+
+This is a wrapper around streams returned by TCP and socket clients:
+
+These streams are both readable and writable (see Readable Stream and Writable Stream above).
+
+* `stream = new streams.NetStream(stream, [options])`
+ creates a network stream wrapper.
+
+## TCP and Socket clients
+
+These are wrappers around node's `net.createConnection`:
+
+* `client = streams.tcpClient(port, host, [options])`
+ returns a TCP connection client.
+* `client = streams.socketClient(path, [options])`
+ returns a socket client.
+ The `options` parameter of the constructor provides options for the stream (`lowMark` and `highMark`).
+ If you want different options for `read` and `write` operations, you can specify them by creating `options.read` and `options.write` sub-objects inside `options`.
+* `stream = client.connect(_)`
+ connects the client and returns a network stream.
+
+## NetServer
+
+This is a wrapper around node's `net.Server` object:
+
+* `server = streams.createNetServer([serverOptions,] connectionListener [, streamOptions])`
+ creates the wrapper.
+ `connectionListener` is called as `connectionListener(stream, _)`
+ where `stream` is a `NetStream` wrapper around the native connection.
+* `server.listen(_, port, [host])`
+* `server.listen(_, path)`
+ (same as `net.Server`)
+
+## try/finally wrappers and pump
+
+* `result = streams.using(_, constructor, stream, [options,] fn)`
+ wraps `stream` with an instance of `constructor`;
+ passes the wrapper to `fn(_, wrapped)` and closes the stream after `fn` returns.
+ `fn` is called inside a `try/finally` block to guarantee that the stream is closed in all cases.
+ Returns the value returned by `fn`.
+* `result = streams.usingReadable(_, stream, [options,] fn)`
+ shortcut for `streams.using(_, streams.ReadableStream, stream, options, fn)`
+* `result = streams.usingWritable(_, stream, [options,] fn)`
+ shortcut for `streams.using(_, streams.WritableStream, stream, options, fn)`
+* `streams.pump(_, inStream, outStream)`
+ Pumps from `inStream` to `outStream`.
+ Does not close the streams at the end.
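To show the pull-style API in action, a minimal sketch in streamline syntax of an HTTP server that reads the whole request body before replying; the port number is illustrative.

``` javascript
var streams = require('streamline/lib/streams/server/streams');

function start(_) {
  var server = streams.createHttpServer(function(request, response, _) {
    request.setEncoding('utf8');
    var body = request.readAll(_); // pull the body instead of handling 'data' events
    response.writeHead(200, { 'Content-Type': 'text/plain' });
    response.write(_, 'received ' + (body ? body.length : 0) + ' chars\n');
    response.end();
  });
  server.listen(_, 1337);
}
```
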
73 lib/tools/docTool._js
@@ -6,9 +6,7 @@
"use strict";
/// !doc
///
-/// # streamline/lib/tools/docTool
-///
-/// Documentation tool
+/// # Documentation tool
///
/// Usage:
///
@@ -23,9 +21,16 @@
///
/// The tool can also be invoked programatically with:
///
+/// `var docTool = require('streamline/lib/tools/docTool')`
+///
var fs = require('fs');
var fsp = require('path');
+function exists(cb, p) {
+ fsp.exists(p, function(result) {
+ cb(null, result)
+ });
+}
/// * `doc = docTool.generate(_, path)`
/// extracts documentation comments from file `path`
exports.generate = function(_, path, options) {
@@ -36,8 +41,12 @@ exports.generate = function(_, path, options) {
// lstat not available on Windows
var stat = (isWin32 ? fs.stat : fs.lstat)(path, _);
if (stat.isFile()) {
- if (path.match(/\._?(js|coffee)$/) && path.indexOf('--fibers.js') < 0) {
- var inside, save, example, inSource;
+ var match;
+ if ((match = /^(.*)\._?(js|coffee)$/.exec(path)) && path.indexOf('--fibers.js') < 0) {
+ var inside, save, example, inSource, tocEntry = {
+ path: match[1],
+ description: ''
+ };
var doc = fs.readFile(path, "utf8", _).split('\n').map(function(line) {
var i = line.indexOf('//' + '/ ');
if (i >= 0) {
@@ -50,6 +59,7 @@ exports.generate = function(_, path, options) {
} else if (line === "!example") {
inside = true;
example = true;
+ tocEntry.example = true;
save = true;
}
return null;
@@ -59,6 +69,11 @@ exports.generate = function(_, path, options) {
line = "```\n\n" + line;
inSource = false;
}
+ if (!tocEntry.done) {
+ if (!tocEntry.title && line[0] === '#') tocEntry.title = line;
+ else if (tocEntry.title && line.length > 0) tocEntry.description += line + '\n';
+ else if (tocEntry.description && line.length === 0) tocEntry.done = true;
+ }
return line + "\n";
}
return null;
@@ -75,34 +90,50 @@ exports.generate = function(_, path, options) {
}).filter(function(line) {
return line != null;
}).join("");
- if (inside && inSource)
- doc += "```\n\n";
- if (save) {
- fs.writeFile(path.substring(0, path.length - 3) + ".md", doc, "utf8", _);
- return "";
+ if (inside && inSource) doc += "```\n\n";
+ if (doc) {
+ if (!tocEntry.title) throw new Error(path + ": doc error: title missing");
+ var p = path.substring(0, path.lastIndexOf('.')) + ".md";
+ fs.writeFile(p, doc, "utf8", _);
+ if (options.verbose) console.log("generated " + p);
+ return [tocEntry];
}
- return doc || "";
}
- return "";
+ return null;
} else if (stat.isDirectory() && (isWin32 || !stat.isSymbolicLink())) {
var split = path.split("/");
var isPackage = split[split.length - 2] == 'node_modules';
- var doc = "";
+ var toc = [];
var files = fs.readdir(path, _);
for (var i = 0; i < files.length; i++) {
- doc += _generate(_, path + "/" + files[i], isPackage || dontSave);
+ var entries = _generate(_, path + "/" + files[i], isPackage || dontSave);
+ if (entries) toc = toc.concat(entries);
}
- if (isPackage && !dontSave && doc) {
- fs.writeFile(path + "/API.md", doc, "utf8", _);
+ if (isPackage && !dontSave && toc.length) {
+ var text;
+ if (exists(_, path + '/package.json')) {
+ var pkg = JSON.parse(fs.readFile(path + '/package.json', 'utf8', _));
+ text = '# ' + pkg.name + '\n\n' + pkg.description + '\n\n';
+ } else {
+ text = '# ' + path.substring(path.lastIndexOf('node_modules') + 13) + '\n\n';
+ }
+ text += toc.filter(function(entry) {
+ return !entry.example;
+ }).map(function(entry) {
+ var p = entry.path.substring(entry.path.lastIndexOf('node_modules') + 13);
+ var href = p.substring(p.indexOf('/') + 1) + '.md';
+ return '* [' + p + '](' + href + ') \n ' + entry.title.substring(2) + '\n';
+ }).join('');
+ fs.writeFile(path + "/API.md", text, "utf8", _);
if (options.verbose) console.log("generated " + path + "/API.md");
- doc = "";
+ return null;
}
- return doc;
- } else return "";
+ return toc;
+ } else return null;
}
+ //options.verbose = true;
_generate(_, path);
-
}
if (process.argv[1] && process.argv[1].indexOf("/docTool") >= 0) exports.generate(_, fsp.join(process.cwd(), process.argv[2] || '.'), {
verbose: true
-});
+});
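A note on the new `exists` helper above: `path.exists` does not follow node's errback convention (its callback receives a bare boolean, no error argument), so it cannot be called with `_` directly; the helper re-emits the result as `(error, result)`. The same adaptation, sketched as a generic recipe (the names are illustrative):

``` javascript
// Wrap any single-argument API whose callback receives a bare result,
// so that streamline code can await it with `_`.
function toErrback(fn) {
  return function(cb, arg) {
    fn(arg, function(result) { cb(null, result); }); // re-emit as (err, result)
  };
}

// var exists = toErrback(require('path').exists);
// ... and inside a streamline function:  if (exists(_, jsonPath)) { ... }
```
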
20 lib/tools/docTool.md
@@ -0,0 +1,20 @@
+
+# Documentation tool
+
+Usage:
+
+ node streamline/lib/tools/docTool [path]
+
+Extracts documentation comments from `.js` files and generates an `API.md` file
+under the package root.
+
+Top of source file must contain `/// !doc` marker to enable doc extraction.
+Documentation comments must start with `/// ` (with 1 trailing space).
+Extraction can be turned off with `/// !nodoc` and turned back on with `/// !doc`.
+
+The tool can also be invoked programmatically with:
+
+`var docTool = require('streamline/lib/tools/docTool')`
+
+* `doc = docTool.generate(_, path)`
+ extracts documentation comments from file `path`
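A minimal sketch, in streamline syntax, of invoking the documentation tool programmatically; the target path is illustrative, and the `verbose` flag is the one used by the implementation above.

``` javascript
var docTool = require('streamline/lib/tools/docTool');

function regenerateDocs(_) {
  // Writes one .md file per documented source file, plus an API.md
  // table of contents at each package root it encounters.
  docTool.generate(_, process.cwd(), { verbose: true });
}
```
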
21 lib/util/flows.js
@@ -25,16 +25,16 @@
*/
/// !doc
///
-/// # streamline/lib/util/flows
+/// # Flow control utilities
+///
+/// `var flows = require('streamline/lib/util/flows');`
///
-/// Flows Module
-///
-/// The `streamline/lib/util/flows` module contains some handy utilities for streamline code
///
(function(exports) {
"use strict";
var builtins = require((Array.prototype.forEach_.fstreamlineFunction ? 'streamline/lib/fibers' : 'streamline/lib/callbacks') + '/builtins');
- /// ## Array utilities (obsolete)
+ /// !nodoc
+ /// Obsolete API
///
/// This API is obsolete. Use `array.forEach_`, `array.map_`, ... instead.
///
@@ -101,9 +101,6 @@
fn.call(thisObj, cb, elt, obj[elt]);
});
}
- ///
- /// ## Workflow Utilities
- ///
// deprecated -- don't document
exports.spray = function(fns, max) {
return new
@@ -144,6 +141,7 @@
}
+ /// !doc
/// * `fun = flows.funnel(max)`
/// limits the number of concurrent executions of a given code block.
///
@@ -187,10 +185,6 @@
}
///
- /// ## Miscellaneous
- ///
- /// Utility functions:
- ///
/// * `flows.nextTick(_)`
/// `nextTick` function for both browser and server.
/// Aliased to `process.nextTick` on the server side.
@@ -199,8 +193,7 @@
setTimeout(function() { callback(); }, 0);
};
- /// * `result = flows.apply(_, fn, thisObj, args, [index])`
- /// Obsolete. Use `fn.apply_` instead.
+ // Obsolete. Use `fn.apply_` instead.
exports.apply = function apply(cb, fn, thisObj, args, index) {
return fn.apply_(cb, thisObj, args);
}
33 lib/util/flows.md
@@ -0,0 +1,33 @@
+
+# Flow control utilities
+
+`var flows = require('streamline/lib/util/flows');`
+
+
+* `fun = flows.funnel(max)`
+ limits the number of concurrent executions of a given code block.
+
+The `funnel` function is typically used with the following pattern:
+
+``` javascript
+// somewhere
+var myFunnel = flows.funnel(10); // create a funnel that only allows 10 concurrent executions.
+
+// elsewhere
+myFunnel(_, function(_) { /* code with at most 10 concurrent executions */ });
+```
+
+The `diskUsage2.js` example demonstrates how these calls can be combined to control concurrent execution.
+
+The `funnel` function can also be used to implement critical sections. Just set funnel's `max` parameter to 1.
+
+The funnel can be closed with `fun.close()`.
+When a funnel is closed, the operations that are still in the funnel will continue but their callbacks
+won't be called, and no other operation will enter the funnel.
+
+* `results = flows.collect(_, futures)`
+ collects the results of an array of futures
+
+* `flows.nextTick(_)`
+ `nextTick` function for both browser and server.
+ Aliased to `process.nextTick` on the server side.
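To complement the funnel example above, a minimal sketch in streamline syntax of the critical-section case (`max` set to 1); the counter file is illustrative.

``` javascript
var flows = require('streamline/lib/util/flows');
var fs = require('fs');

// funnel(1) gives mutual exclusion: only one bump() body runs at a time,
// so the read-modify-write below cannot interleave with itself.
var oneAtATime = flows.funnel(1);

function bump(_) {
  return oneAtATime(_, function(_) {
    var n = 0;
    try { n = parseInt(fs.readFile('counter.txt', 'utf8', _), 10) || 0; } catch (ex) {}
    fs.writeFile('counter.txt', String(n + 1), 'utf8', _);
    return n + 1;
  });
}
```
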
2  package.json
@@ -6,5 +6,5 @@
"dependencies": {},
"author": "Bruno Jouhier",
"directories": {"lib": "./lib", "bin": "./bin" },
- "main": "lib/index.js"
+ "main": "index.js"
}