Timing-dependent behavior when inspecting chain of promises #251

bugnotme opened this Issue Apr 3, 2013 · 2 comments



Below is a test case that exhibits timing-dependent behavior. An exception is raised when the code executes without any delay, but inserting a delay at the indicated location eliminates the problem. Why is this occurring?

// requires q.js and lodash.js

// Takes an array of promises and returns an array of promises that
// resolve in array order
function serialize_promises(xs) {
    var out = [];
    if (xs.length === 0) {
        return out;
    }
    var r = xs[0];
    out.push(r);
    xs.slice(1).forEach(function(p, i) {
        r = r.then(function(data) { console.log("foo " + data); return p; });
        out.push(r);
    });
    return out;
}

var n = [Q.defer(), Q.defer()];
out = serialize_promises(_.map(n, function(x) {return x.promise;}));

// executing the following line immediately after the preceding ones 
// causes "Uncaught TypeError: Cannot call method 'apply' of undefined" in
// q.js lines 399 and 428, chromium-browser 25.0. But waiting a few seconds causes no problem
m = _.map(out, function(x) {return x.isFulfilled();});
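For reference, the serialization pattern itself behaves as expected with native ES6 promises. The following is a minimal sketch of the same chaining idea without Q or lodash (an illustration only: native promises have no `isFulfilled()`, so it demonstrates the ordering, not the synchronous inspection that triggers the error):

```javascript
// Same shape as serialize_promises above, but with native promises.
function serializePromises(xs) {
  var out = [];
  if (xs.length === 0) return out;
  var r = xs[0];
  out.push(r);
  xs.slice(1).forEach(function (p) {
    r = r.then(function () { return p; });
    out.push(r);
  });
  return out;
}

// Each promise in `out` settles only after all the earlier ones have.
var order = [];
var chained = serializePromises([
  Promise.resolve(1),
  Promise.resolve(2),
  Promise.resolve(3)
]);
chained.forEach(function (p) {
  p.then(function (v) { order.push(v); });
});
Promise.all(chained).then(function () {
  console.log(order.join(",")); // logs "1,2,3": the chain settles in array order
});
```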

Sounds like a bad bug. Will investigate and fix ASAP. Thanks for reporting it!


I am unable to reproduce this, although I did replace the `_.map` calls with normal ES5 maps (e.g. `out.map(...)` instead of `_.map(out, ...)`).
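For array inputs, lodash's `_.map(arr, fn)` and native `arr.map(fn)` are equivalent, so that substitution should not change behavior. A sketch of the two forms side by side (using plain objects as hypothetical stand-ins for Q deferreds):

```javascript
// Hypothetical stand-ins for Q deferreds, just to show the mapping shapes.
var deferreds = [
  { promise: Promise.resolve(1) },
  { promise: Promise.resolve(2) }
];

// lodash style:   _.map(deferreds, function (x) { return x.promise; })
// native ES5 style:
var promises = deferreds.map(function (x) { return x.promise; });

console.log(promises.length); // 2
```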

Can you produce a test case that fails in a JSFiddle, preferably one that does not rely on any third party libraries which might themselves have bugs? We'll reopen if so.

@domenic closed this Apr 21, 2013