


Generally, the callback hell problem arises when there is a bunch of interdependent async computations. Rather than trying to emulate sync control flow with various funky utils, we can be declarative and specify not HOW to compute, but WHAT to compute. Let's consider the following example:

def('a', function() {
  return 'a'
})

def('b', function(a) {
  return 'b'
})

def('c', function(a, b) {
  return a + b
})

def('d', function(a, c) {
  return a + c
})
Here we defined all our computations in the simplest possible form. For example, we just said that d is a + c. We didn't say that to compute d you should call a(), then c(), and concatenate their results. Such simplicity gives us many benefits:

  1. Any function can become async without breaking things
  2. We can automatically cache results and not execute a computation multiple times
  3. We can easily switch between sequential and parallel execution, depending on what is more appropriate for the task at hand.

This project implements the control flow shown in the above example.


var flow = require('make-flow')
var fn = flow()

fn.def('a', function() {
  return 'a'
})

fn.def('b', function() {
  return 'b'
})

fn.def('c', function(a, b) {
  return a + b
})

fn.eval('c', function(err, c) {
  // c == 'ab'
})

The .def method defines what is called a task. Once the task and all its dependencies are defined, we can evaluate it with .eval().
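To make the mechanics concrete, here is a toy, fully synchronous re-implementation of the def/eval idea in plain JavaScript. This is an illustration only, not make-flow's actual code: dependencies are inferred from parameter names via Function.prototype.toString(), and results are cached so each task runs at most once.

```javascript
// Toy dependency resolver (illustration only, not make-flow's source).
function makeFlow() {
  var tasks = {}
  var cache = {}

  // Read a function's parameter names out of its source text.
  function paramNames(fn) {
    var src = fn.toString()
    var args = src.slice(src.indexOf('(') + 1, src.indexOf(')'))
    return args.split(',').map(function (s) { return s.trim() }).filter(Boolean)
  }

  return {
    def: function (name, fn) {
      tasks[name] = fn
      return this // allow chaining
    },
    eval: function (name) {
      if (name in cache) return cache[name] // each task runs at most once
      var deps = paramNames(tasks[name]).map(this.eval, this) // recurse
      return cache[name] = tasks[name].apply(null, deps)
    }
  }
}

var fn = makeFlow()
fn.def('a', function () { return 'a' })
  .def('b', function () { return 'b' })
  .def('c', function (a, b) { return a + b })
  .def('d', function (a, c) { return a + c })

console.log(fn.eval('d')) // → 'aab'
```

Note that the caller never spells out an execution order: evaluating d pulls in c, which pulls in a and b.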

Task may be async:

fn.def('config', function(done) {
  fs.readFile('config.json', done)
})

So done is a special-cased name meaning a node-style callback.
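One plausible way such a special-cased parameter can be detected (a sketch, not make-flow's actual source) is to read the parameter names from the function's source text and check whether the last one is done:

```javascript
// Illustration: infer asyncness from a trailing 'done' parameter.
function paramNames(fn) {
  var src = fn.toString()
  var args = src.slice(src.indexOf('(') + 1, src.indexOf(')'))
  return args.split(',').map(function (s) { return s.trim() }).filter(Boolean)
}

function isAsync(fn) {
  var params = paramNames(fn)
  return params[params.length - 1] === 'done'
}

// Hypothetical tasks for demonstration:
var readConfig = function (done) { done(null, 'parsed config') }
var add = function (a, b) { return a + b }

console.log(isAsync(readConfig)) // → true
console.log(isAsync(add))        // → false
```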

You can also define dependencies explicitly:

fn.def('c', ['a', 'b'], function(a, b) {
  return a + b
})

All computation results are stored as properties of the flow object, so the following is true:

fn.def('foo', function() {
  return 'foo'
})

fn.eval('foo', function (err, foo) {
  fn.foo == foo // true
})

As you can see, .eval() clobbers the object on which it is called, and subsequent evals do not trigger recomputation.

But we can reuse our definitions!

var json = flow()
.def('json', function(filename, done) {
  fs.readFile(filename, done)
})
.def('object', function(json) {
  return JSON.parse(json)
})

function readJson(name, cb) {
  json.run() // just Object.create(this) actually
    .set('filename', name) // the same as this.filename = name
    .eval('object', cb)
}
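The Object.create(this) trick behind .run() can be demonstrated on its own, with no make-flow involved: each call gets a fresh object whose prototype is the shared instance, so per-call values shadow, but never clobber, the shared ones.

```javascript
// Standalone demonstration of the prototype trick behind .run().
var shared = { config: 'app-wide config' } // stands in for the flow object

var call1 = Object.create(shared)
call1.filename = 'a.json' // stored on call1 only

var call2 = Object.create(shared)
call2.filename = 'b.json' // independent of call1

console.log(call1.config)         // → 'app-wide config' (inherited)
console.log(call1.filename)       // → 'a.json'
console.log(call2.filename)       // → 'b.json'
console.log('filename' in shared) // → false: calls never touch the shared object
```

This is why readJson can be called concurrently: each invocation evaluates against its own throwaway object.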

Creating such functions is what make-flow was designed for. There is a .fn() method which facilitates their creation a bit:

var readJson = flow()
.def('json', function(filename, done) {
  fs.readFile(filename, done)
})
.def('object', function(json) {
  return JSON.parse(json)
})
.fn(function (name, cb) {
  this.filename = name
  this.eval('object', cb)
})


We can also link computations from various runtime layers:

var app = flow()
app.layer('app') // mark the current instance to be app level

app.def('app', 'config', function (done) { // 'app' attaches the task to the app level
  readJson('config.json', done)
})

app.def('app', 'db', function (config) {
  return require('monk')(config.connection_string)
})

app.def('session', function (db, req, done) {
  db.loadSession(req.cookie.session, done)
})

app.def('user', function (db, session, done) {
  db.loadUser(session.username, done)
})

// ...

http.createServer(function (req, res) {
  app.run()
    .set('req', req)
    .set('res', res)
    .eval('some task')
})

In the above example config and db will be evaluated only once, not for each incoming request.
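A toy model of that level-based caching (illustrative only, not make-flow's code): results written to the shared app object are inherited by every per-request instance created with Object.create, so the computation runs once no matter how many requests arrive.

```javascript
// Toy model of app-level caching across per-request instances.
var evaluations = 0
var app = {}

// Hypothetical helper: evaluate 'config' with its cache pinned to app level.
function config(instance) {
  if (!('_config' in app)) { // cache on the shared app object, not per request
    evaluations++
    app._config = { connection_string: 'mongodb://localhost/test' } // dummy value
  }
  return instance._config // found via the prototype chain
}

var req1 = Object.create(app) // like .run() for the first request
var req2 = Object.create(app) // ... and the second

console.log(config(req1) === config(req2)) // → true: the same cached object
console.log(evaluations)                   // → 1: computed once, not per request
```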

In general, a task is attached to a certain level by passing the level name as the first argument:

app.def('level', 'name', fn)

Error handling

All error objects returned from .eval have ._task and ._stack properties:

flow().def('foo', function() {
  throw new Error('ups')
}).eval('foo', function(err) {
  // err._task == 'foo'
})

flow().def('bar', function(done) {
  flow().def('baz', function() { throw new Error('ups') })
    .eval('baz', done)
}).eval('bar', function(err) {
  // err._task == 'baz'
  // err._stack describes the path of tasks that led to the error
})
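A toy version of such error tagging (illustrative only; evalTask is a hypothetical helper, not make-flow's API) might look like this: the innermost failing task claims _task, while _stack accumulates the path of tasks on the way out.

```javascript
// Illustration: tag errors with the failing task (_task) and the task path (_stack).
function evalTask(name, fn, cb) {
  try {
    cb(null, fn())
  } catch (err) {
    err._task = err._task || name // the innermost failing task wins
    err._stack = err._stack ? name + '.' + err._stack : name // prepend this task
    cb(err)
  }
}

var tagged
evalTask('bar', function () {
  var inner
  // A nested evaluation, mirroring the bar -> baz example above.
  evalTask('baz', function () { throw new Error('ups') }, function (err) {
    inner = err
  })
  if (inner) throw inner // propagate the nested failure
}, function (err) {
  tagged = err
})

console.log(tagged._task)  // → 'baz'
console.log(tagged._stack) // → 'bar.baz'
```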

Control flow

Everything is executed sequentially.


Via npm

npm install make-flow

As a component

component install eldargab/make-flow


easy-app is a simple and powerful container with the same core ideas.