
Support context passing in low-level async APIs #704

Closed
petkaantonov opened this issue Feb 3, 2015 · 14 comments

Comments

@petkaantonov
Contributor

The current low-level async APIs have a performance-hostile design, as each level that passes a callback requires the allocation of a closure (which in this case means allocating 2 objects: a JSFunction object and a Context object).

For example:

fs.doSomething = function(arg, callback) {
  var req = new FSReqWrap();
  req.oncomplete = callback;
  binding.doSomething(arg, req);
};

Virtually 100% of the time the callback has some context it needs when it's called, so it can never be a static function that is allocated only once per program. Also, any higher-level fs abstraction is going to allocate a closure for its own context, and so on. These closures are relatively expensive objects, more expensive than e.g. a Promise object.

With context passing:

fs.doSomething = function(arg, callback) {
  var req = new FSReqWrap();
  req.oncomplete = callback;
  // When the request completes, it would call req.oncomplete.call(req.context, err, result);
  req.context = ... 
  binding.doSomething(arg, req);
};
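To make the allocation difference concrete, here is a small runnable sketch (the helper names withClosure, withContext, and staticCallback are illustrative, not part of any real API): the closure version allocates a fresh JSFunction plus Context on every call, while the context-passing version reuses one static callback and allocates only a plain state object per call.

```javascript
'use strict';

// Closure version: a new function object (capturing `arg`) is allocated
// on every call.
function withClosure(arg, done) {
  process.nextTick(function () { done(arg * 2); });
}

// Static callback: allocated once per program; reads all of its state
// from the context object it is handed.
function staticCallback(ctx) {
  ctx.done(ctx.arg * 2);
}

// Context version: per call, only one plain object is allocated.
function withContext(arg, done) {
  process.nextTick(staticCallback, { arg: arg, done: done });
}
```

process.nextTick forwards any extra arguments to the callback, which is what lets staticCallback stay closure-free here.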

This would enable extremely performant promise wrappers (given user-land promises):

fs.doSomethingPromise = function(arg) {
  var promise = new Promise(INTERNAL);
  var req = new FSReqWrap();
  req.oncomplete = promise._resolveFromNodeBack;
  req.context = promise;
  binding.doSomething(arg, req);
  return promise;
};

The above code literally allocates fewer objects than a call to fs.doSomething(1, function closure(){}).
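As a hedged illustration of what such a user-land promise's _resolveFromNodeBack might look like, here is a runnable toy (MiniPromise and fakeBindingDoSomething are made up for this sketch; this is not Bluebird's actual implementation). Because the binding would invoke req.oncomplete.call(req.context, err, result), the promise itself arrives as `this` and no closure is needed:

```javascript
'use strict';

// A deliberately tiny promise-like type, just enough to show the pattern.
function MiniPromise() {
  this._state = 'pending';
  this._value = undefined;
  this._callbacks = [];
}

// Static method: invoked by the binding with the promise as `this`.
MiniPromise.prototype._resolveFromNodeBack = function (err, result) {
  if (this._state !== 'pending') return;
  this._state = err ? 'rejected' : 'fulfilled';
  this._value = err || result;
  for (var i = 0; i < this._callbacks.length; i++) {
    this._callbacks[i](this._state, this._value);
  }
  this._callbacks.length = 0;
};

MiniPromise.prototype.then = function (onFulfilled, onRejected) {
  var handle = function (state, value) {
    if (state === 'fulfilled') { if (onFulfilled) onFulfilled(value); }
    else { if (onRejected) onRejected(value); }
  };
  if (this._state === 'pending') this._callbacks.push(handle);
  else handle(this._state, this._value);
};

// Stand-in for the C++ binding: calls oncomplete with req.context as `this`.
function fakeBindingDoSomething(arg, req) {
  setImmediate(function () {
    req.oncomplete.call(req.context, null, arg * 2);
  });
}

function doSomethingPromise(arg) {
  var promise = new MiniPromise();
  var req = { oncomplete: promise._resolveFromNodeBack, context: promise };
  fakeBindingDoSomething(arg, req);
  return promise;
}
```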

But it would also enable high-level fs functions that are built on top of the low-level fs functions to allocate fewer closures.

Other examples of context passing include most of the ES5 array methods (via their thisArg parameter).
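For instance, the thisArg parameter of the ES5 iteration methods plays exactly this role, letting the callback be a shared function instead of a per-call closure:

```javascript
'use strict';
const state = { factor: 3 };

// Closure version: the callback captures `state` from the enclosing scope,
// so a fresh function object is allocated at this call site.
const a = [1, 2, 3].map(function (x) { return x * state.factor; });

// thisArg version: `scale` could be allocated once and reused everywhere;
// the per-call state arrives as `this`.
function scale(x) { return x * this.factor; }
const b = [1, 2, 3].map(scale, state);
```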

@benjamingr
Member

The points raised here are good and interesting. I'm wondering if you can make a benchmark to back them up?

Also, to be clear: the suggestion here is to allow low-level methods to take a this parameter, right?

@petkaantonov
Contributor Author

What exactly do you want me to benchmark? That allocating a closure is slower than not allocating it?

@benjamingr
Member

@petkaantonov: code using a Node API (for example the filesystem API) performing better in a real use case with context passing than without it.

@vkurchatkin
Contributor

As far as I understand you are proposing that instead of

var whatever  = ....

fs.someFunction(function(err, res) {
   console.log(whatever);
});

people should do

var whatever  = ....

fs.someFunction(function(err, res, context) {
   var whatever = context;
   console.log(whatever);
}, whatever);

@petkaantonov
Contributor Author

Not in the public-facing API. Let's take a look at readFile, for instance: it allocates 7 closures when it could simply allocate 1 FileReadingContext and pass that around. You really misunderstood me; this is an internal optimization at the lowest level, not an API change.

@vkurchatkin
Contributor

Well, adding context support to FSReqWrap is simple. But all the functions used inside readFile are public, so we would basically expose this API to users as well.

@petkaantonov
Contributor Author

Well, you don't need to rewrite anything; supporting it in FSReqWrap (and the equivalent in other modules) is already enough for me. I'm just saying that io.js itself could also get a perf win from this for many methods if it wants. You don't need to touch readFile at all. And if you were to refactor readFile, you would obviously call internal methods that accept a context if you didn't want to expose context passing to users.

Here's an example of how readFile could be implemented if this were supported by FSReqWrap: https://gist.github.com/petkaantonov/0b23d1c18404baf65dec . It reduces object allocations in the successful case from 9 (6 JSFunctions + 3 Contexts) to 1 (a JSObject).

@mscdex
Contributor

mscdex commented Feb 3, 2015

👍

@vkurchatkin
Contributor

Ok, it seems like we already have everything we need to do this:

var binding = process.binding('fs');
var open = binding.open;
var FSReqWrap = binding.FSReqWrap;
var constants = process.binding('constants');

var req = new FSReqWrap();
req.oncomplete = oncomplete;
req.context = 'whatever';

open('test.js', constants.O_RDONLY, 0o666, req);

function oncomplete(err, fd) {
  if (err)
    throw err;

  console.log(fd);
  console.log(this.context);
}

@petkaantonov
Contributor Author

Yeah, that works for me, thanks.

@vkurchatkin
Contributor

I tried this approach out and can't see any significant changes in the benchmarks.

before

fs/readfile.js dur=5 len=1024 concurrent=1: 22961.59914
fs/readfile.js dur=5 len=1024 concurrent=10: 28115.44754
fs/readfile.js dur=5 len=16777216 concurrent=1: 160.76778
fs/readfile.js dur=5 len=16777216 concurrent=10: 325.55294

after

fs/readfile.js dur=5 len=1024 concurrent=1: 23862.05061
fs/readfile.js dur=5 len=1024 concurrent=10: 27368.07561
fs/readfile.js dur=5 len=16777216 concurrent=1: 160.06920
fs/readfile.js dur=5 len=16777216 concurrent=10: 326.91924

@petkaantonov
Contributor Author

Those benchmarks measure file-reading throughput, which is not affected by this change. To measure the call overhead you probably want to read a ton of unique short files, e.g. 16 bytes long or so: create 1000 different files of 16 bytes each and then call readFile on them in parallel...

@vkurchatkin
Contributor

Nope, same thing.

@petkaantonov
Contributor Author

@vkurchatkin I'll construct a relevant benchmark on the weekend when I have more time for this
