
question: how to async with co #157

Closed · yoshuawuyts opened this issue Nov 1, 2014 · 17 comments

Comments
@yoshuawuyts

Sometimes generators confuse the soul out of my body, and I definitely don't want to be above asking for help, so here goes:

How do you yield in co from within a callback, without the call stack unwinding? Also: how do you yield from inside a callback at all? Passing in a generatorFunction directly breaks things.

This is the function I'm trying to make work:

function *xhrmw(next) {
  if (!this.url) return yield next;

  var opts = {
    url: this.url + qs.stringify(this.query),
    body: this.body,
    method: this.method,
    headers: this.headers
  };

  xhr(opts, function(err, res, body) {
    if (err) this.throw(err);
    this.response = res;
    this.body = body;
    // <needs a yield, cannot yield, because not a generator function>
  }.bind(this));
}
yoshuawuyts changed the title from "async with co" to "question: how to async with co" on Nov 1, 2014
@tj
Owner

tj commented Nov 1, 2014

You can't yield from within that callback; you'd have to change xhr to return something yield-able instead of taking a callback. Generators only let you suspend within the generator function itself.

@yoshuawuyts
Author

That was the one answer I was hoping I wouldn't get, haha. Thanks for the reply though, I'll probably have to write a custom xhr module / talk it over with @Raynos (:

@rickharrison

@yoshuawuyts You can always just wrap your xhr module in a thunk or a promise. Then you can yield that.

@yoshuawuyts
Author

@rickharrison what would that look like? Tried fumbling around with thunkify but couldn't get it quite right.

@rickharrison

function makeRequest() {
  // do work

  return function (done) {
    done(null, returnValue);
  }
}
yield makeRequest()

Or a promise example, which might be easier to understand:

var promise = new Promise(function (resolve, reject) {
  xhr(opts, function(err, res) {
    if (err) return reject(err);

    resolve(res);
  });
});

yield promise;

@yoshuawuyts
Author

@rickharrison Thanks! I was wondering though: where does returnValue come from in the first example?

edit: to clarify, we're doing an xhr request here so doing something synchronous won't cut it :(

@Raynos

Raynos commented Nov 1, 2014

@yoshuawuyts

function *xhrmw(next) {
  if (!this.url) return yield next;

  var opts = {
    url: this.url + qs.stringify(this.query),
    body: this.body,
    method: this.method,
    headers: this.headers
  };

  var thunk = xhr.bind(null, opts);
  var response = yield thunk;
  this.response = response;
  this.body = response.body;
}

@Raynos

Raynos commented Nov 1, 2014

@yoshuawuyts

Converting from a callback to a thunk is trivial: just var thunk = doAsyncThing.bind(null, arg). The thunk is now a function of one argument, a callback, which is a thunk by definition.

This means you can now yield it. I do this everywhere when I use generators. It allows a single async function definition that works with node, npm & plain callbacks, but is also usable from within generators.
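In other words (a minimal, self-contained sketch with hypothetical names: readConfig stands in for any node-style async function, and run() is a toy stand-in for co's driver loop, not co's actual implementation):

```javascript
// Hypothetical node-style async function (stands in for xhr, fs.readFile, etc.)
function readConfig(path, cb) {
  setTimeout(function () { cb(null, 'data for ' + path); }, 0);
}

// Toy generator driver: assumes every yielded value is a thunk,
// i.e. a function that takes a single node-style callback.
function run(genFn) {
  var it = genFn();
  function step(err, val) {
    var next = err ? it.throw(err) : it.next(val);
    if (next.done) return;
    next.value(step); // call the thunk, resume the generator when it calls back
  }
  step();
}

run(function* () {
  // .bind(null, arg) presets everything except the callback,
  // turning the node-style function into a thunk.
  var thunk = readConfig.bind(null, '/etc/app.conf');
  var data = yield thunk; // suspends here until the callback fires
  console.log(data); // → data for /etc/app.conf
});
```

The yield suspends inside the generator itself, which is exactly why this works where yielding inside the xhr callback could not.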

@yoshuawuyts
Author

@Raynos This is just wonderful, thank you so much!

@Fishrock123

@Raynos does that have any benefits / drawbacks compared to this kind of thunk? https://github.com/repo-utils/org-labels/blob/master/bin/_org-labels#L93-L105

@navaru
Contributor

navaru commented Nov 2, 2014

@Fishrock123 one drawback: .bind will be slower than the closure form, if you care about micro-optimizations.

http://stackoverflow.com/questions/8656106/why-is-function-prototype-bind-slow
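For comparison, here are the two forms side by side (doWork is a hypothetical stand-in for xhr):

```javascript
// Hypothetical node-style async function standing in for xhr.
function doWork(arg, cb) { cb(null, arg * 2); }

// bind form: a one-line change at the call site
var boundThunk = doWork.bind(null, 21);

// closure form: slightly more typing, but avoids Function.prototype.bind,
// which some engines optimize less aggressively
var closureThunk = function (cb) { doWork(21, cb); };

boundThunk(function (err, val) { console.log('bind:', val); });      // bind: 42
closureThunk(function (err, val) { console.log('closure:', val); }); // closure: 42
```

Both produce a function of a single callback argument, so both are equally yield-able.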

@Raynos

Raynos commented Nov 2, 2014

@Fishrock123 I use .bind() because it's the most convenient and minimal change at the call site (i.e. for the user of the function) to support generators.

This puts zero burden on the author of the module and minimal burden on the user of the module in generators.

@yanickrochon

As an alternative, I have written a module to handle this sort of pattern.

function *xhrmw(next) {
  if (!this.url) return yield next;

  var marker = suspend();
  var opts = {
    url: this.url + qs.stringify(this.query),
    body: this.body,
    method: this.method,
    headers: this.headers
  };

  xhr(opts, function(err, res, body) {
    if (err) this.throw(err);
    this.response = res;
    marker.resume(err, body);
  }.bind(this));

  this.body = yield marker.wait();
}

Shameless plug :)

@yoshuawuyts
Author

Something that may not be apparent: using the .bind() method, the yielded result is an array of the callback's responses. This, to me, is pretty crazy (and useful).

[screenshot: yielded result logged as an array of responses]

edit: I might be totally wrong about this one though; this behavior might be xhr specific. Generators are still weird to me, haha.

@Raynos

Raynos commented Nov 2, 2014

@yoshuawuyts an array of arguments is just co being "special" ( https://github.com/tj/co/blob/master/index.js#L59-L60 ).
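A sketch of that rule (run() below is a toy runner mimicking the linked co behavior, not co itself, and fetchPage is a hypothetical xhr-style function):

```javascript
// Hypothetical xhr-style function: callback signature is (err, res, body).
function fetchPage(url, cb) {
  cb(null, { statusCode: 200 }, 'hello body');
}

// Toy generator driver implementing co's rule for multi-argument callbacks:
// one extra argument is passed through as-is, several are wrapped in an array.
function run(genFn) {
  var it = genFn();
  function step(err) {
    if (err) return it.throw(err);
    var args = Array.prototype.slice.call(arguments, 1);
    var val = args.length > 1 ? args : args[0];
    var next = it.next(val);
    if (!next.done) next.value(step);
  }
  step();
}

run(function* () {
  var result = yield fetchPage.bind(null, 'http://example.com');
  // the (err, res, body) callback arguments arrive as [res, body]
  console.log(Array.isArray(result), result[0].statusCode, result[1]);
});
```

So the array isn't xhr-specific at all; it appears whenever the yielded thunk's callback receives more than one value after err.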

@yoshuawuyts
Author

Woah, this somehow feels a bit like golang with its multiple return values. It's a shame that the err, res, body pattern from xhr isn't preserved, though.

@arxpoetica

@Raynos I've looked for documentation about this thunk-like behavior of .bind() and can't find it. You're definitely right, it works that way, but I'd like to understand the internals a little better. Can you (or anyone) point me in the right direction?
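There is nothing thunk-specific in .bind() itself; it only does partial application. A rough sketch of the relevant part of its behavior (bindLike and add are illustrative names, not real APIs):

```javascript
// Simplified model of Function.prototype.bind: preset some leading
// arguments, return a function that appends the rest.
function bindLike(fn, thisArg) {
  var preset = Array.prototype.slice.call(arguments, 2);
  return function () {
    var rest = Array.prototype.slice.call(arguments);
    return fn.apply(thisArg, preset.concat(rest));
  };
}

// Hypothetical node-style async function.
function add(a, b, cb) { cb(null, a + b); }

// Presetting every argument except the callback leaves a function of
// exactly one argument (the callback), which is the definition of a thunk.
var thunk = bindLike(add, null, 2, 3);
thunk(function (err, sum) { console.log(sum); }); // 5
```

So .bind(null, arg) produces a thunk only because you happen to preset everything except the final callback parameter; bind itself knows nothing about callbacks or generators.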
