3x faster setImmediate #6436

Closed

Conversation

@andrasq
Contributor

andrasq commented Apr 28, 2016

Checklist
  • tests and code linting passes
  • a test and/or benchmark is included
  • the commit message follows commit guidelines
Affected core subsystem(s)
Description of change

Sped up setImmediate processing by 60% to over 400% through faster linked-list creation,
faster immediate object creation, not wrapping closures around the callbacks, and invoking
the callbacks from an optimizable function without using .call.

Sped up clearImmediate by delaying the linked list update until processImmediate pulls the
immediate off the queue, where it's done more efficiently. This speeds up clearImmediate 3x,
with the benefit stacking on top of the setImmediate speedup.

Added benchmarks for setImmediate and clearImmediate.
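A minimal sketch of the approach, for orientation only (illustrative names, not the PR's actual timers.js diff): instead of wrapping each callback and its arguments in a fresh closure, the arguments are saved on the immediate object and the callback is invoked later by a small shared dispatcher.

// before: one new closure per setImmediate() call
function makeClosureImmediate(callback, arg1, arg2) {
  return { _onImmediate: function() { callback.call(null, arg1, arg2); } };
}

// after: no per-call closure; just record the callback and its arguments
function makeArgvImmediate(callback, arg1, arg2) {
  return { _callback: callback, _argv: [arg1, arg2] };
}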

andrasq added some commits Apr 26, 2016

timers: linkedlist optimizations
use L.create() factory to create access-optimized linkedlist objects
timers: 3x faster setImmediate
Save the setImmediate callback arguments into an array instead of a
closure, and invoke the callback on the arguments from an optimizable
function.

  60% faster setImmediate with 0 args (15% if self-recursive)
  4x faster setImmediate with 1-3 args, 2x with > 3
  seems to be faster with less memory pressure when memory is tight

Changes:
- use L.create() to build faster lists
- use runCallback() from within tryOnImmediate
- create immediate timers with a function instead of new
- save just the arguments rather than building closures for the callbacks
timers: 3x faster clearImmediate
Instead of unlinking from the immediate queue immediately on clear,
put off the unlink until processImmediate where it's more efficient.

  3x faster clearImmediate processing

  The benefits stack with the setImmediate speedups, i.e. a total gain
  of 3.5x with 0 arguments and 4-5x with 1-3 args.

Changed the code to defer unlinking from the immediate queue until
processImmediate consumes the queue anyway.
timers: setImmediate benchmarks
Timings for sequential and concurrent setImmediate with and without
arguments, and set + clearImmediate.
lib/internal/linkedlist.js
@@ -6,6 +6,13 @@ function init(list) {
}
exports.init = init;
+// create a new linked list
+function create() {
+ var list = { _idleNext: null, _idlePrev: null };


@mscdex

mscdex Apr 28, 2016

Contributor

This should be const.


@mscdex

mscdex Apr 28, 2016

Contributor

I also wonder if there would be any performance benefit to instead creating a new instance of an object that doesn't inherit from Object.prototype? For example:

function LinkedListNode() {
  this._idleNext = null;
  this._idlePrev = null;
}
LinkedListNode.prototype = Object.create(null);

function create() {
  const list = new LinkedListNode();
  init(list);
  return list;
}

@andrasq

andrasq Apr 28, 2016

Contributor

changed to const.

I tried the Object.create version above, and it's inconclusive. Some test runs (10-20 sec runs)
show a 2% advantage one way, then changing the loop/repeat count (where loops*repeats
is the number of objects created) flips the advantage the other way.
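For reference, a plausible completion of the create() hunk shown above, assuming that (after the const change) it simply initializes the node as an empty circular list the same way init() does:

function create() {
  const list = { _idleNext: null, _idlePrev: null };  // stable object shape up front
  init(list);  // both links point back at the list itself, i.e. an empty list
  return list;
}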


lib/timers.js
+function createImmediate(callback) {
+ return {


@mscdex

mscdex Apr 28, 2016

Contributor

Similarly here RE: creating a new instance with no prototype inheritance


@andrasq

andrasq Apr 28, 2016

Contributor

no difference; see reply above


lib/timers.js
- var immediate = new Immediate();
-
- L.init(immediate);
+ var immediate = createImmediate(callback);


@mscdex

mscdex Apr 28, 2016

Contributor

This should be const too.


@andrasq

andrasq Apr 28, 2016

Contributor

fixed


test/parallel/test-timers-linked-list.js
@@ -103,3 +103,8 @@ assert.equal(C, L.shift(list));
// list
assert.ok(L.isEmpty(list));
+var list2 = L.create();
+var list3 = L.create();


@mscdex

mscdex Apr 28, 2016

Contributor

These variables should be const as well.


@andrasq

andrasq Apr 28, 2016

Contributor

fixed


@mscdex

Contributor

mscdex commented Apr 28, 2016

Perhaps the benchmarks could more closely resemble the existing setImmediate() benchmarks in benchmark/misc (which probably need to be moved to benchmark/timers now, but that is a separate PR)? Specifically when I landed setImmediate() performance improvements awhile back, @bnoordhuis had commented that the callbacks should not reference variables in parent scopes if at all possible.

-function Immediate() { }


@Fishrock123

Fishrock123 Apr 28, 2016

Member

this is a breaking change


@Fishrock123

Fishrock123 Apr 28, 2016

Member

Ok, nvm, #6206 has not landed yet


@bengl

bengl May 5, 2016

Member

I don't think this would be breaking if #6206 had landed, as it doesn't get rid of the Immediate class, and instead just moves the property assignments into the constructor.


@benjamingr

benjamingr May 5, 2016

Member

That code changed since the comment was left here.


@@ -42,10 +49,17 @@ exports.remove = remove;
// remove a item from its list and place at the end.
function append(list, item) {
- remove(item);
+ if (item._idleNext || item._idlePrev) {


@Fishrock123

Fishrock123 Apr 28, 2016

Member

Would the common case of no trailing list element make it better to check these in the opposite order?


@andrasq

andrasq Apr 28, 2016

Contributor

linkedlist.js implements a circular list, the expected case is that both will be set or not set together.
This is just a fast-path optimization for newly created list nodes that have never been linked.
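A self-contained illustration of that fast path (a sketch of the idea, not the actual lib/internal/linkedlist.js source): in a circular list a node is linked on both sides or on neither, so a freshly created node can skip the unlink step entirely.

function init(list) {
  list._idleNext = list;
  list._idlePrev = list;
}

function remove(item) {
  if (item._idleNext) item._idleNext._idlePrev = item._idlePrev;
  if (item._idlePrev) item._idlePrev._idleNext = item._idleNext;
  item._idleNext = null;
  item._idlePrev = null;
}

function append(list, item) {
  if (item._idleNext || item._idlePrev)
    remove(item);                        // only unlink items already on a list
  // splice item in just before the head, i.e. at the tail of the circle
  item._idleNext = list;
  item._idlePrev = list._idlePrev;
  list._idlePrev._idleNext = item;
  list._idlePrev = item;
}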


@Fishrock123

Fishrock123 Apr 28, 2016

Member

Ah, right. Forgot about that.


- if (L.isEmpty(immediateQueue)) {
- process._needImmediateCallback = false;
- }
+ // leave queued, much faster overall to skip it later in processImmediate


@Fishrock123

Fishrock123 Apr 28, 2016

Member

much faster

I don't think this is true, L.remove() is a constant-time operation. The performance gained will be minimal at best, insignificant either way.


@andrasq

andrasq Apr 28, 2016

Contributor

the result is very reproducible, it's 3x faster to set/clear without the remove.
It could be caching effects, i.e. during processing the next item has already been prefetched
by the time the previous one is removed for processing.
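One way to picture the deferral (an assumption about the mechanism, not necessarily this PR's exact code): clearImmediate only marks the entry and leaves it linked, and the drain loop skips marked entries while it is walking the queue anyway.

function clearImmediateDeferred(immediate) {
  if (immediate) immediate._onImmediate = undefined;  // mark cleared, stay queued
}

function drainQueue(queue) {             // queue shown as a plain array for illustration
  for (var i = 0; i < queue.length; i++) {
    var immediate = queue[i];
    if (immediate._onImmediate !== undefined)
      immediate._onImmediate();          // cleared entries are simply skipped
  }
  queue.length = 0;
}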


@Fishrock123

Fishrock123 Apr 28, 2016

Member

I don't think it's ever possible to even have this show 1% in a v8 profile of an actual application, could you compile some benchmark results for us on these?


@andrasq

andrasq Apr 28, 2016

Contributor

I would expect actual applications make very few calls to clearImmediate
so you're most likely right. I was focusing on the effect of deferring the remove a bit.

The output of the test from benchmark/timers/immediate.js
With the remove as it existed:
timers/immediate.js thousands=2000 type=clear: 1388.82567
Modified to not remove (ie, the git diff shown above for lines 629-633 vs 639):
timers/immediate.js thousands=2000 type=clear: 3536.98189
The test:

function clear(N) {
  bench.start();
  function cb(a1) {
    if (a1 === 2)
      bench.end(N / 1e3);
  }
  for (var i = 0; i < N; i++) {
    clearImmediate(setImmediate(cb, 1));
  }
  setImmediate(cb, 2);
}

This test creates 2m cleared immediates and shows 254% speedup.
My other test that loops 10k gets 302%.


lib/timers.js
+ _idleNext: null,
+ _idlePrev: null,
+ _callback: callback,
+ _argv: null,


@Fishrock123

Fishrock123 Apr 28, 2016

Member

Maybe even better: createImmediate(callback, argv)


@andrasq

andrasq Apr 28, 2016

Contributor

I tried passing two params to the create function, it was slower (but I was
timing passing in the domain). I'll double-check.


@andrasq

andrasq Apr 28, 2016

Contributor

Hmm. So passing two args to the create function runs at the same speed, but moving the
create to below the args gathering and leaving the var as const drops the speed.
Creating the immediate as var immediate = createImmediate(callback, args) is 50% faster
than the almost identical const immediate = ... . In that location -- above the switch --
const and var run the same. This I don't understand (maybe a v8 scoping optimization?)

Stylistically, I like not having to poke attributes into the object, but creating it up top
and populating it below may be a clearer flow; it's how setTimeout and setInterval work.


@mscdex

mscdex Apr 28, 2016

Contributor

@andrasq If you add --trace-opt --trace-deopt to the command line, do you see any aborted optimizations or deopts related to the containing function? I have seen specific uses of const that have caused aborted optimizations which are very noticeable when benchmarking.


@andrasq

andrasq Apr 28, 2016

Contributor

@mscdex good tip! It's [aborted optimizing 0x21a4311d <JS Function exports.setImmediate (SharedFunctionInfo 0x21a41ce5)> because: Unsupported phi use of const variable]


@mscdex

mscdex Apr 28, 2016

Contributor

Yes, so it looks like it will need to be kept var then. My guess is that v8 still hasn't optimized all const use cases yet.


@mscdex

mscdex Apr 28, 2016

Contributor

Although var should be changed to const wherever v8 doesn't abort optimizations.


@Fishrock123

Fishrock123 Apr 28, 2016

Member

I tried passing two params to the create function, it was slower (but I was
timing passing in the domain). I'll double-check.

Could you post some numbers on this, what are we talking about? Microseconds, nanoseconds?


@andrasq

andrasq Apr 28, 2016

Contributor

it was small, nanoseconds, but I didn't record it and can't reproduce it now.
I re-timed passing two parameters to the create function, compiled into node, and
the two-argument form is distinctly faster, so I'll change this. (My test rig loads
the file under test with require, so it can run the same timers.js source under
various engines; I'd been comparing v0.10.42 vs v4.4.0 vs v5.10.1 vs v6.0.0.)


@andrasq

andrasq Apr 28, 2016

Contributor

pushed the edits (passing both callback and arguments to createImmediate())


@andrasq

Contributor

andrasq commented Apr 28, 2016

@mscdex re the benchmarks, oops, I didn't realize misc already had immediate tests.
I patterned the ones I wrote on the timers/timeout.js benchmarks ("patterned" as in cut-and-paste
reused them, pretty much).

Each test function is called once, the closure should be built once and reused across all callbacks.
The tests are structured as, e.g.,

function test() {
  function cb() { if ... bench.end(K) }
  for (...) { setImmediate(cb) }
}

Changing it to

  for (...) { setImmediate(function cb() { if ... bench.end(K) }) }

halves the throughput, suggesting that the closure is reused in the above case and
not reused below. (The tests also measure the cost of passing arguments to the callback,
and the 0-argument depth test needs a closure to know when to stop.)

A different question is whether my benchmarks add enough value to keep, or if I should just ditch them
in favor of the ones that already exist.
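A runnable illustration of the two benchmark shapes being compared (a standalone sketch, not the benchmark/timers code itself):

function sharedCallback(n, done) {
  var remaining = n;
  function cb() { if (--remaining === 0) done(); }      // one closure, reused for every call
  for (var i = 0; i < n; i++) setImmediate(cb);
}

function perCallClosure(n, done) {
  var remaining = n;
  for (var i = 0; i < n; i++)
    setImmediate(function() { if (--remaining === 0) done(); });  // new closure per call
}

function time(label, fn, n) {
  var t0 = Date.now();
  fn(n, function() { console.log(label, Date.now() - t0, 'ms'); });
}

time('shared callback', sharedCallback, 1e6);
// run the per-call variant in a separate process to avoid interleaving:
// time('per-call closure', perCallClosure, 1e6);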

@@ -502,24 +502,26 @@ Timeout.prototype.close = function() {
};
-var immediateQueue = {};
-L.init(immediateQueue);
+var immediateQueue = L.create();


@benjamingr

benjamingr Apr 29, 2016

Member

I don't really understand why this change is needed or is helpful but I guess it does clean things up a little.


@Fishrock123

Fishrock123 Apr 29, 2016

Member

prevents a hidden class change


@andrasq

andrasq Apr 29, 2016

Contributor

at top-level it was changed to match processImmediate (and it cleans things up a bit).
In processImmediate it's a speedup, L.create() returns an access-optimized object.


@benjamingr

benjamingr Apr 29, 2016

Member

I bet the object ends up being access-optimized after enough runs anyway. Not to mention we can make objects access-optimized explicitly.

That said - this change improves style anyway and is better coding.


@andrasq

andrasq Apr 30, 2016

Contributor

probably, though depends on how many immediates are queued in an event loop cycle.
The immediateQueue is re-created every time, so the optimization would not persist
(and there would be a run-time cost for the conversion).

Out of curiosity, what are the ways of creating access-optimized objects? I know about
objects created with { ... }, the prototype properties of new objects, (the this. properties
as assigned in the constructor?), and assigning an object as the prototype of a function
forces optimization. Any others?


@benjamingr

benjamingr Apr 30, 2016

Member

@andrasq conveniently, here is a StackOverflow answer I wrote about a technique petkaantonov used in bluebird (with making an object as a prototype of a function).

This also works with Object.create so I guess that's another one.

this. properties assigned in the constructor is the "standard" way though, it's even "easier" than object literals and it makes the fact it's static obvious to v8 (object literals work fine too usually).
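Pulling together the patterns mentioned in this thread (illustrative only; each gives v8 a stable, "access-optimized" object from the start):

// 1. object literal with all properties declared up front
const a = { _idleNext: null, _idlePrev: null };

// 2. properties assigned on `this` in a constructor
function Node() { this._idleNext = null; this._idlePrev = null; }
const b = new Node();

// 3. assigning an object as a function's prototype (the bluebird trick referenced
//    above) forces it into fast-properties mode; the exact trigger is a v8 detail
function toFastProperties(obj) { function F() {} F.prototype = obj; return obj; }
const c = toFastProperties({ _idleNext: null, _idlePrev: null });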


@benjamingr

benjamingr Apr 30, 2016

Member

Just to be explicit - this specific change LGTM, even if it's not faster but it probably is given the old code.


@andrasq

andrasq Apr 30, 2016

Contributor

that's a great writeup, and a handy reference, thanks!


lib/timers.js
@@ -502,24 +502,26 @@ Timeout.prototype.close = function() {
};
-var immediateQueue = {};
-L.init(immediateQueue);
+var immediateQueue = L.create();
function processImmediate() {
var queue = immediateQueue;


@benjamingr

benjamingr Apr 29, 2016

Member

While we're touching this code, const might be nice.


@andrasq

andrasq Apr 29, 2016

Contributor

changed


@@ -554,67 +557,73 @@ function tryOnImmediate(immediate, queue) {
}
}
+function runCallback(timer) {


@benjamingr

benjamingr Apr 29, 2016

Member

Are we sure we don't do this optimization anywhere else in the code more generically? It sounds like something that would go in util. If we don't then LGTM.


@andrasq

andrasq Apr 29, 2016

Contributor

I use this construct in other projects, and found it hard to make generic;
the call semantics don't cover all use cases well. I have 3 variants,
this version I used here is a 4th.

That being said, I agree, an optimized call invoker is a nice utility to have.


lib/timers.js
@@ -554,67 +557,73 @@ function tryOnImmediate(immediate, queue) {
}
}
+function runCallback(timer) {
+ var argv = timer._argv;
+ var argc = argv ? argv.length : 0;


@benjamingr

benjamingr Apr 29, 2016

Member

const


@andrasq

andrasq Apr 29, 2016

Contributor

fixed
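For readers following along, a sketch of the dispatch shape the runCallback() hunk above is getting at (field names taken from the diff; not a verbatim copy of the PR): switch on the saved argument count so the common 0-3 argument cases avoid .apply.

function runCallback(timer) {
  const argv = timer._argv;
  const argc = argv ? argv.length : 0;
  switch (argc) {
    case 0: return timer._callback();
    case 1: return timer._callback(argv[0]);
    case 2: return timer._callback(argv[0], argv[1]);
    case 3: return timer._callback(argv[0], argv[1], argv[2]);
    default: return timer._callback.apply(timer, argv);   // slow path: 4+ arguments
  }
}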


lib/timers.js
+function createImmediate(callback, args) {


@benjamingr

benjamingr Apr 29, 2016

Member

I don't understand why we need the stylistic change of returning an Object rather than using a constructor. I think it would be more consistent to still have a function Immediate that sets all these properties on this. It would change less things and would be as fast.

That said, setting properties locally rather than on the prototype definitely helps with hidden classes and is preferable.


@Fishrock123

Fishrock123 Apr 29, 2016

Member

If that is possible then we should probably go that way imo.


@andrasq

andrasq Apr 29, 2016

Contributor

I tried various constructs, used the fastest, and used style as a tie-breaker.

I timed the new Immediate(callback, args) version setting properties on this with
and without a prototype, and the current static object is 50% faster on the breadth tests
in benchmarks/timers/immediate.js (10% for the slow 4+ args case)

my experimental Immediate was (a separate test predeclared all properties as null)

function Immediate(callback, args) {
  this._callback = callback;
  this._argv = args;
  this._onImmediate = callback;
  this.domain = process.domain;
}

@benjamingr

benjamingr Apr 29, 2016

Member

I don't buy it that new Immediate was slower than returning static object. It should be as fast while providing better debug traces (because it is named) - both get the same hidden class exactly.

Putting object properties on the prototype is a perf killer. Not suggesting we stick to that. Definitely going to see an improvement compared to prototype properties anyway.


@andrasq

andrasq Apr 30, 2016

Contributor

it's weird, I just ran a set of micro-benchmarks and they're exactly like you say,
setting this.x properties is very fast, and 50% faster than inheriting from prototype.
But inside timers.js a { ... } style struct is faster.

I ran the same benchmark side-by-side with --trace-gc, and the new Immediate version
prints 20% more gc scavenge lines (207 vs 248). But it also runs 20% longer, 5200
vs 6250 ms, so the gc may be normal (proportional to runtime), not sure. The only change
was how the immediate objects are allocated, createImmediate vs new Immediate
with the object

function Immediate(callback, args) {
    this._idleNext = null;
    this._idlePrev = null;
    this._callback = callback;
    this._argv = args;
    this._onImmediate = callback;
    this.domain = process.domain;
}

The timers/immediate.js breadth benchmark shows a larger 55% difference in favor of { ... }
(the 0 and 1 arg cases; the 4-args case is 11%)


@benjamingr

benjamingr Apr 30, 2016

Member

Care to share the actual code you benched in both cases so I could have a look?


@benjamingr

benjamingr Apr 30, 2016

Member

Ok, I double checked and it's identical in terms of performance. I think we should go with new Immediate and fix the benchmarks.


@andrasq

andrasq May 1, 2016

Contributor

How did you test? I'm still seeing a very consistent difference with the sources from
this pull request. The following test can be run standalone to show the difference:

var n = 0;
function counter() { n += 1; }

var count = 1000000;
var t1 = Date.now();
for (var i=0; i<count; i++) setImmediate(counter);
setImmediate(function() {
    var t2 = Date.now();
    console.log("ran %dk setImmediate in %d ms (n calls)", count/1000, (t2 - t1), n + 1, process.memoryUsage());
})

createImmediate version: (run with $ taskset 8 ./node time-setImmediate.js):
ran 1000k setImmediate in 301 ms (n calls) 1000001 { rss: 64249856, heapTotal: 51257344, heapUsed: 38976796 }
new Immediate version:
ran 1000k setImmediate in 5295 ms (n calls) 1000001 { rss: 97222656, heapTotal: 82714624, heapUsed: 66396932 }

This is much more extreme a difference than in the other tests, but it's consistent.
The rss and heap footprint is much larger, I'm seeing a lot of calls to futex() and 12k
context switches / second while it's running. I don't know what it's doing to run so slow.
On other tests I'm seeing a steady 15% difference.

The "other test" is the one-liner from a larger suite

case '4a1':
  runit(nloops, 200, 10000, "set and run 10k setImmediate, 1 arg", function(cb) { for (var i=0; i<10000; i++) setImmediate(f, 1); setImmediate(cb) });
  break;

I packaged up my timeit / runit utility as qtimeit, so the test can be run in isolation

timeit = require('qtimeit');
function f(){}
timeit.runit(10, 200, 10000, "set and run 10k setImmediate, 1 arg", function(cb) { for (var i=0; i<10000; i++) setImmediate(f, 1); setImmediate(cb) });

createImmediate: 200 loops in 0.5001 sec: 399.92 / sec,
new Immediate: 200 loops in 0.5859 sec: 341.35 / sec,


@andrasq

andrasq May 1, 2016

Contributor

Hmm. When I patch the original v6.0.0 branch with the setImmediate function and
create var immediate = createImmediate(null, null) and leave everything else alone,
there is a straight-forward 24% speedup from 524ms to 420ms for 1 million immediates
(ran the stand-alone test above). No context switching, no huge slowdown.
I'll try to isolate the cause.

Edit: clarification: the 524ms is the new setImmediate without a prototype, I didn't notice that
I had grabbed that too from the diffs. The original Immediate object (with a prototype)
runs in 1017ms.


@andrasq

andrasq May 1, 2016

Contributor

I found the reason, and it makes no sense, but the good news is that the workaround is
easy, and yes new Immediate is just as fast, and the weird super-slow performance is gone.
(and the weird context switches and rss/heap discrepancy also go away)

If new Immediate() stores a passed-in function onto this, the code runs slow.
If it's not passed in, or if the callback is not stored, the call runs fast. The args
parameter can be passed and assigned without penalty. Annotating the created object
is very fast and also incurs no penalty. (I have not been able to replicate this anomaly
standalone.)

Measuring small runs of tasks with a tool that subtracts the timing loop overhead,
the difference between createImmediate and new Immediate is 7% or so, close enough.
The microbenchmarks use a much much larger dataset (2m and 4m vs 10k) which seems
to favor createImmediate (by 70%), but the smaller dataset is more likely.

Either way, the immediates processing rate is over 250% faster than before this change
(standalone microbenchmark) or 350% (small runs, overhead subtracted).


lib/timers.js
break;
// slow case
default:
+ var len = arguments.length;


@benjamingr

benjamingr Apr 29, 2016

Member

Does the 1/2/3/more argument optimization really help with anything here?


@Fishrock123

Fishrock123 Apr 29, 2016

Member

it's a good deal faster than iterating over arguments iirc


@andrasq

andrasq Apr 29, 2016

Contributor

correct, x = [1, 2, 3] runs faster than x = new Array(3), and setting the values is extra.
Some 50+% faster to create pre-populated arrays of fixed length 3. (In fact, [1], [1,2,3] and
[1,2,3,4,5,6,7,8,9] all run equally fast, while explicitly setting values is linear in array length.)

Normally either would be blazing fast, 75 million vs 48 million / second, but when tuning
code that itself runs at 4 million operations / second, the difference amounts to 3% of the total.
(the baseline was 1 million / second, where the difference was under 1%)
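A quick way to see the difference being described (an illustrative micro-benchmark; absolute numbers will vary by V8 version):

function literal(a, b, c) { return [a, b, c]; }          // pre-populated literal
function preallocated(a, b, c) {
  var arr = new Array(3);                                 // exact-size allocation...
  arr[0] = a; arr[1] = b; arr[2] = c;                     // ...plus explicit stores
  return arr;
}
function bench(name, fn) {
  var t0 = Date.now();
  var last;
  for (var i = 0; i < 1e7; i++) last = fn(i, i + 1, i + 2);
  console.log(name, Date.now() - t0, 'ms', '(len ' + last.length + ')');
}
bench('[a, b, c]   ', literal);
bench('new Array(3)', preallocated);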


@benjamingr

benjamingr Apr 29, 2016

Member

Right, but we're not actually invoking anything here, is it still slow to pass arguments around to the same function where you can have this sort of dispatch anyway?


@andrasq

andrasq Apr 30, 2016

Contributor

so that's a great question, I checked, and the answer turns out to be weird -- the "/more" case
(where the arguments are copied out) runs much faster. Like 3x faster. But the special-cased
0/1/2/3 args run 25-30% slower. I guess passing arguments still disables optimization.


@andrasq

andrasq May 2, 2016

Contributor

@benjamingr I changed the three lines that copy the callback params out of arguments,
for an added 2x speedup of the "many" (more than 3) arguments handling. (And yes, it's
still faster to special-case the 1/2/3 handling, but 4/5 no longer, so 4 is the crossover.)

Turns out node v6 is much much faster to .apply args in an array that was dynamically
extended than if it was allocated to the exact size. Seems to have changed with v6;
v5 and v4 were faster to .apply arrays that were created the correct size.
The change is significant, now 4x slower than before.

From the commit message:

        // faster to copy args, but v6 very slow to .apply them:
        var argv = new Array(arguments.length - 1);   // exact size
        for (var i = 1; i < arguments.length; i++)
          argv[i - 1] = arguments[i];

        // slower to copy args, but v6 much faster to .apply them:
        var argv = new Array();                       // grow as needed
        for (var i = 1; i < arguments.length; i++)
          argv[i - 1] = arguments[i];
@andrasq

Contributor

andrasq commented Apr 29, 2016

pushed edits (some more consts)

@benjamingr

Member

benjamingr commented Apr 30, 2016

Minus new Immediate instead of createImmediate - looks good to me.

Thanks for this fix.

@andrasq

Contributor

andrasq commented May 1, 2016

pushed edits (new Immediate)

@benjamingr

Member

benjamingr commented May 1, 2016

LGTM although I meant for new Immediate to take the arguments in the constructor.

timers: speed up another 2x setImmediate() with > 3 args
  220% faster yet setImmediate with 4 args,
  240% with 5,
  260% with 6

Changed the code to copy arguments into a dynamically extended array
instead of preallocating the exact size.  It seems f.apply(obj, args)
runs much faster in node v6 if the args array was extended at run-time
as opposed to preallocated the correct size.

This behavior is new in node v6.0.0; v5.10.1 was 4x faster to apply an
args array created the exact size.  (Node v5 and v4 were the other way
around, faster to .apply static sized arrays, and slower if dynamically
grown.)

    // faster to copy args, but v6 slow to .apply them:
    var argv = new Array(arguments.length - 1);   // exact size
    for (var i = 1; i < arguments.length; i++)
      argv[i - 1] = arguments[i];

    // slower to copy args, but v6 much faster to .apply them:
    var argv = new Array();                       // grow as needed
    for (var i = 1; i < arguments.length; i++)
      argv[i - 1] = arguments[i];
@andrasq

Contributor

andrasq commented May 2, 2016

I was looking at setting the fields in the constructor, but that still triggers the pessimal
10x slower runtime in some cases. Don't know why, but --trace-opt --trace-deopt (a great
feature that I just learned about, thanks mscdex!) shows thrashing in optimizing/deoptimizing
the linkedlist shift and remove functions.

When setting the callback in the constructor, this test takes 9.5x as long to run (400 vs 3800 ms):

function callme() { }
for (var i=0; i<1000000; i++) setImmediate(callme);
setImmediate(function() {});

I have not yet tested on other platforms.

(more detail: removing the final setImmediate runs fast. Calling it before the loop runs fast.
Calling a final setImmediate(callme) instead also runs fast. The runtime also shows 140% cpu,
ie node is actually using 5.3 seconds of cpu, and would be 13x slower on a single-core system.)

break;
// slow case
default:
- args = new Array(len - 1);
- for (i = 1; i < len; i++)
+ args = [arg1, arg2, arg3];


@martinheidegger

martinheidegger May 2, 2016

Instead of [arg1, ...., argn] wouldn't Array.of be a better choice? Array.of(arg1) ?


@andrasq

andrasq May 2, 2016

Contributor

not for places where performance matters; the Array.of() function call is 40x slower.

Array.of is a native function, and native (C++) functions are often slower than v8 javascript.
In this case, even copying out arguments[i] in a loop into a new Array would be 20x faster
than Array.of, and a compile-time array [ ... ] is 2x faster still.

The speed limit to native functions is the js - C++ divide; crossing from js into C++ is slow,
about 4 million calls / second.
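The three ways of capturing the extra arguments that are being compared here, side by side (illustrative only; relative speeds depend on the V8 version):

function viaLiteral(a, b, c) { return [a, b, c]; }           // compile-time array
function viaCopyLoop() {                                      // copy arguments out by hand
  var argv = [];
  for (var i = 0; i < arguments.length; i++) argv[i] = arguments[i];
  return argv;
}
function viaArrayOf(a, b, c) { return Array.of(a, b, c); }    // native call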


@benjamingr

Member

benjamingr commented May 2, 2016

LGTM on the PR itself.

I'd love @mscdex or @Fishrock123 to LGTM too, especially given Fishrock cleaned timers quite a lot recently.

@bengl bengl referenced this pull request May 6, 2016

Closed

doc: add timer classes #6206

lib/timers.js
+function Immediate() {
+ // assigning callback here in the constructor results in pessimal performance,


@Fishrock123

Fishrock123 May 6, 2016

Member

poor performance?


@Fishrock123

Fishrock123 May 6, 2016

Member

what happens if you initially assign those function to noop (function(){}) in the constructor?


@andrasq

andrasq May 6, 2016

Contributor

worse than poor, it was a 10x slowdown due to optimization / deoptimization thrashing.

What were you thinking to suggest setting the fields to noop first?
That seems to allow the callback to be set without triggering the thrashing.
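A hypothetical sketch of that noop-prefill idea (not necessarily what the PR ends up doing; field names follow the experimental Immediate shown earlier in the thread): prefill the function-valued fields with a shared no-op so that assigning the real callback later does not change the field's representation.

function noop() {}

function Immediate() {
  // prefill with a shared noop; the real callback is assigned after construction
  this._callback = noop;
  this._onImmediate = noop;
  this._argv = null;
  this._idleNext = null;
  this._idlePrev = null;
  this.domain = process.domain;
}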


@andrasq

andrasq May 6, 2016

Contributor

reworded the comment to be more specific, mention thrashing instead


lib/timers.js
break;
}
+ var immediate = new Immediate();


@Fishrock123

Fishrock123 May 6, 2016

Member

const


@andrasq

andrasq May 6, 2016

Contributor

mscdex also suggested that, but making it const causes v8 to deoptimize the function.
Curiously, it's when the new Immediate is called below the switch that it deoptimizes,
allocating above the switch is ok.

timers/immediate.js thousands=2000 type=depth: ./node-var: 539.19 ./node-const: 490.39 ...... 9.95%
timers/immediate.js thousands=2000 type=depth1: ./node-var: 529.02 ./node-const: 483.8 ...... 9.35%
timers/immediate.js thousands=2000 type=breadth: ./node-var: 2378.1 ./node-const: 1568.7 ... 51.59%
timers/immediate.js thousands=2000 type=breadth1: ./node-var: 2120.1 ./node-const: 973.55 . 117.77%
timers/immediate.js thousands=2000 type=breadth4: ./node-var: 941.43 ./node-const: 668.64 .. 40.80%
timers/immediate.js thousands=2000 type=clear: ./node-var: 2247.3 ./node-const: 1004.1 .... 123.80%

@Fishrock123

Fishrock123 May 6, 2016

Member

@andrasq Could you add a comment about that then? Thanks!


@andrasq

andrasq May 7, 2016

Contributor

good idea; fixed


+const list3 = L.create();
+assert.ok(L.isEmpty(list2));
+assert.ok(L.isEmpty(list3));
+assert.ok(list2 != list3);


@Fishrock123

Fishrock123 May 6, 2016

Member

maybe also add:

L.init(list);
assert.deepEqual(list2, list);

(I think that should pass)


@Fishrock123

Fishrock123 May 6, 2016

Member

Ideally, we'd block-scope the tests and the variables. For a later PR I suppose.


@andrasq

andrasq May 6, 2016

Contributor

the assertion failed... which makes sense in hindsight, circular lists point to themselves,
ie will not be the same. The isEqual test above checks equivalence, ie that the circular
links are set up correctly.


@Fishrock123

This comment has been minimized.

Show comment
Hide comment
@Fishrock123

Fishrock123 May 6, 2016

Member

CI: https://ci.nodejs.org/job/node-test-pull-request/2527/


As a note, running this benchmark with thousands=10000 on node master causes the vm to run out of memory. Perhaps that is expected, I'm not 100% certain.

./node benchmark/timers/immediate.js thousands=10000
timers/immediate.js thousands=10000 type=depth: 468.94194
timers/immediate.js thousands=10000 type=depth1: 301.67210
timers/immediate.js thousands=10000 type=breadth: 2954.32823

<--- Last few GCs --->

   19800 ms: Mark-sweep 1380.9 (1434.9) -> 1380.9 (1434.9) MB, 1612.3 / 0 ms [allocation failure] [GC in old space requested].
   21523 ms: Mark-sweep 1380.9 (1434.9) -> 1380.8 (1434.9) MB, 1722.9 / 0 ms [allocation failure] [GC in old space requested].
   23219 ms: Mark-sweep 1380.8 (1434.9) -> 1380.8 (1434.9) MB, 1696.0 / 0 ms [last resort gc].
   24814 ms: Mark-sweep 1380.8 (1434.9) -> 1380.8 (1434.9) MB, 1595.7 / 0 ms [last resort gc].


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x9740fbc0dc1 <JS Object>
    1: /* anonymous */(aka /* anonymous */) [timers.js:~566] [pc=0x182a0d2651a3] (this=0x9740fb04189 <undefined>,callback=0xaf774732489 <JS Function cb (SharedFunctionInfo 0x1d4be9d94379)>,arg1=1,arg2=0x9740fb04189 <undefined>,arg3=0x9740fb04189 <undefined>)
    2: arguments adaptor frame: 2->4
    3: breadth1(aka breadth1) [/Users/Jeremiah/Documents/node/benchmark/timers/immediate.js:~76] [pc=0...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
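
For reference, the breadth-style cases queue every callback up front, so the out-of-memory result at large thousands values is plausible regardless of per-call speed. A minimal sketch of that pattern (assumed shape, not the exact benchmark/timers/immediate.js source):

// Schedule N immediates in one synchronous loop; none of them can run
// until the loop (and the current tick) finishes, so all N pending
// Immediate objects are alive at the same time.  With thousands=10000
// that is ten million pending immediates, which can exhaust the
// default heap.
function breadth(N, done) {
  var remaining = N;
  function cb() {
    if (--remaining === 0) done();
  }
  for (var i = 0; i < N; i++)
    setImmediate(cb);
}

breadth(1e6, function () { console.log('drained'); });
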
@Fishrock123


Fishrock123 May 6, 2016

Member

./node benchmark/compare -r -g ./node ./node-master -- timers immediate

timers/immediate.js thousands=2000 type=depth: ./node: 529.2 ./node-master: 511.12 ..... 3.54%
timers/immediate.js thousands=2000 type=depth1: ./node: 512.93 ./node-master: 331.46 .. 54.75%
timers/immediate.js thousands=2000 type=breadth: ./node: 2710.9 ./node-master: 2768 ... -2.06%
timers/immediate.js thousands=2000 type=breadth1: ./node: 2461.8 ./node-master: 1057 . 132.90%
timers/immediate.js thousands=2000 type=breadth4: ./node: 1218 ./node-master: 625.44 .. 94.74%
timers/immediate.js thousands=2000 type=clear: ./node: 2680.8 ./node-master: 1017.2 .. 163.54%

Fishrock123 added a commit that referenced this pull request Jun 29, 2016

benchmark: add `setImmediate()` benchmarks
Timings for sequential and concurrent setImmediate() with and without
arguments, and set + clearImmediate().

PR-URL: #6436
Reviewed-By: Benjamin Gruenbaum <benjamingr@gmail.com>
Reviewed-By: Jeremiah Senkpiel <fishrock123@rocketmail.com>

@Fishrock123 Fishrock123 referenced this pull request Jun 29, 2016

Merged

Upgrade cpplint #7462

@Fishrock123


Fishrock123 Jun 29, 2016

Member

Landed in 9beef23...fba271b ..... ish... see #7462 (comment)

... Strange, the actual commits are f8d3f6f...fba271b


@mjsalinger


mjsalinger Jun 30, 2016

@Fishrock123 Curious about this, as the commits merged into master don't appear to include the changes made based on code reviews.


Fishrock123 added a commit that referenced this pull request Jul 5, 2016

timers: optimize linkedlist
Now uses a new L.create() factory to create access-optimized linkedlist
objects.

PR-URL: #6436
Reviewed-By: Benjamin Gruenbaum <benjamingr@gmail.com>
Reviewed-By: Jeremiah Senkpiel <fishrock123@rocketmail.com>

Fishrock123 added a commit that referenced this pull request Jul 5, 2016

timers: optimize `setImmediate()`
Save the setImmediate() callback arguments into an array instead of a
closure, and invoke the callback on the arguments from an optimizable
function.

  60% faster setImmediate with 0 args (15% if self-recursive)
  4x faster setImmediate with 1-3 args, 2x with > 3
  seems to be faster with less memory pressure when memory is tight

Changes:
- use L.create() to build faster lists
- use runCallback() from within tryOnImmediate()
- save the arguments and do not build closures for the callbacks

PR-URL: #6436
Reviewed-By: Benjamin Gruenbaum <benjamingr@gmail.com>
Reviewed-By: Jeremiah Senkpiel <fishrock123@rocketmail.com>
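
As a rough illustration of the approach this commit message describes (simplified, hypothetical names; not the actual lib/timers.js implementation): the extra arguments are stored on the immediate instead of being captured in a closure, and a small helper dispatches on argument count when the queue is drained.

function makeImmediate(callback) {
  // Copy any extra arguments into a plain array rather than creating a
  // closure that wraps the callback.
  var args = null;
  if (arguments.length > 1) {
    args = new Array(arguments.length - 1);
    for (var i = 1; i < arguments.length; i++)
      args[i - 1] = arguments[i];
  }
  return { _callback: callback, _argv: args };
}

// Called later when the immediate queue is processed; the switch keeps
// the common 0-3 argument cases free of .call()/.apply().
function runCallback(timer) {
  var argv = timer._argv;
  switch (argv === null ? 0 : argv.length) {
    case 0: return timer._callback();
    case 1: return timer._callback(argv[0]);
    case 2: return timer._callback(argv[0], argv[1]);
    case 3: return timer._callback(argv[0], argv[1], argv[2]);
    default: return timer._callback.apply(null, argv);
  }
}

runCallback(makeImmediate(console.log, 'hello', 42));  // logs: hello 42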

Fishrock123 added a commit that referenced this pull request Jul 5, 2016

benchmark: add `setImmediate()` benchmarks
Timings for sequential and concurrent setImmediate() with and without
arguments, and set + clearImmediate().

PR-URL: #6436
Reviewed-By: Benjamin Gruenbaum <benjamingr@gmail.com>
Reviewed-By: Jeremiah Senkpiel <fishrock123@rocketmail.com>

@Fishrock123 Fishrock123 referenced this pull request Jul 5, 2016

Merged

Propose v6.3.0 (v2) #7550

Fishrock123 added a commit that referenced this pull request Jul 6, 2016

2016-07-06, Version 6.3.0 (Current)
Notable changes:

* buffer: Added `buffer.swap64()` to complement `swap16()` &
`swap32()`. (Zach Bjornson) #7157
* build: New `configure` options have been added for building Node.js
as a shared library. (Stefan Budeanu)
#6994
  - The options are: `--shared`, `--without-v8-platform` &
`--without-bundled-v8`.
* crypto: Root certificates have been updated. (Ben Noordhuis)
#7363
* debugger: The server address is now configurable via
`--debug=<address>:<port>`. (Ben Noordhuis)
#3316
* npm: Upgraded npm to v3.10.3 (Kat Marchán)
#7515 & (Rebecca Turner)
#7410
* readline: Added the `prompt` option to the readline constructor.
(Evan Lucas) #7125
* repl / vm: `sigint`/`ctrl+c` will now break out of infinite loops
without stopping the Node.js instance. (Anna Henningsen)
#6635
* src:
  - Added a `node::FreeEnvironment` public C++ API. (Cheng Zhao)
#3098
  - Refactored `require('constants')`, constants are now available
directly from their respective modules. (James M Snell)
#6534
* stream: Improved `readable.read()` performance by up to 70%. (Brian
White) #7077
* timers: `setImmediate()` is now up to 150% faster in some situations.
(Andras) #6436
* util: Added a `breakLength` option to `util.inspect()` to control how
objects are formatted across lines. (cjihrig)
#7499
* v8-inspector: Experimental support has been added for debugging
Node.js over the inspector protocol. (Ali Ijaz Sheikh)
#6792
  - *Note: This feature is experimental, and it could be altered or
removed.*
  - You can try this feature by running Node.js with the `--inspect`
flag.

Refs: #7441
PR-URL: #7550

@chrjean chrjean referenced this pull request in lukesampson/scoop Jul 7, 2016

Merged

Update nodejs to 6.3.0 #948

lukesampson added a commit to lukesampson/scoop that referenced this pull request Jul 7, 2016

Update nodejs to 6.3.0 (#948)
### Notable changes

* **buffer**: Added `buffer.swap64()` to complement `swap16()` & `swap32()`. (Zach Bjornson) [#7157](nodejs/node#7157)
* **build**: New `configure` options have been added for building Node.js as a shared library. (Stefan Budeanu) [#6994](nodejs/node#6994)
  - The options are: `--shared`, `--without-v8-platform` & `--without-bundled-v8`.
* **crypto**: Root certificates have been updated. (Ben Noordhuis) [#7363](nodejs/node#7363)
* **debugger**: The server address is now configurable via `--debug=<address>:<port>`. (Ben Noordhuis) [#3316](nodejs/node#3316)
* **npm**: Upgraded npm to v3.10.3 (Kat Marchán) [#7515](nodejs/node#7515) & (Rebecca Turner) [#7410](nodejs/node#7410)
* **readline**: Added the `prompt` option to the readline constructor. (Evan Lucas) [#7125](nodejs/node#7125)
* **repl / vm**: `sigint`/`ctrl+c` will now break out of infinite loops without stopping the Node.js instance. (Anna Henningsen) [#6635](nodejs/node#6635)
* **src**:
  - Added a `node::FreeEnvironment` public C++ API. (Cheng Zhao) [#3098](nodejs/node#3098)
  - Refactored `require('constants')`, constants are now available directly from their respective modules. (James M Snell) [#6534](nodejs/node#6534)
* **stream**: Improved `readable.read()` performance by up to 70%. (Brian White) [#7077](nodejs/node#7077)
* **timers**: `setImmediate()` is now up to 150% faster in some situations. (Andras) [#6436](nodejs/node#6436)
* **util**: Added a `breakLength` option to `util.inspect()` to control how objects are formatted across lines. (cjihrig) [#7499](nodejs/node#7499)
* **v8-inspector**: Experimental support has been added for debugging Node.js over the inspector protocol. (Ali Ijaz Sheikh) [#6792](nodejs/node#6792)
  - **Note: This feature is _experimental_, and it could be altered or removed.**
  - You can try this feature by running Node.js with the `--inspect` flag.
@andrasq


andrasq Jul 10, 2016

Contributor

@mjsalinger perhaps some of the commits got squashed; all the expected edits are there.
The 4 (less 1) original + 8 code review change sets got merged as 3 commits.


@MylesBorins


MylesBorins Jul 11, 2016

Member

I'm setting this as don't land. Please change to LTS watch if you think this should land in LTS.


@Fishrock123


Fishrock123 Jul 12, 2016

Member

@thealphanerd if it cleanly applies you should be able to land it


@MylesBorins


MylesBorins Aug 30, 2016

Member

@Fishrock123 it does not land cleanly without other timers changes


@Fishrock123


Fishrock123 Aug 30, 2016

Member

@thealphanerd ok, probably not important unless it is needed for other fixes. (Although it would be nice.)


@jedireza jedireza referenced this pull request Oct 31, 2016

Closed

lib: change == to === in linkedlist #9362

2 of 2 tasks complete

Fishrock123 added a commit to Fishrock123/node that referenced this pull request Dec 9, 2016

timers: cleanup extraneous property on Immediates
This was originally changed in 6f75b66
but it appears unnecessary and benchmark results show little difference
without the extra property.

Refs: nodejs#6436

@Fishrock123 Fishrock123 referenced this pull request Dec 9, 2016

Closed

timers: cleanup extraneous property on Immediates #10205

3 of 3 tasks complete
@zhoujinhai


zhoujinhai Aug 1, 2017

@Fishrock123 I have the same problem, did you solve it?

<--- Last few GCs --->

[9852:000001B18947FDB0]   770113 ms: Mark-sweep 1401.2 (1457.9) -> 1398.7 (1457.9) MB, 16.4 / 0.0 ms  (+ 0.0 ms in 0 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 18 ms) allocation failure GC in old space requested
[9852:000001B18947FDB0]   770136 ms: Mark-sweep 1398.7 (1457.9) -> 1398.6 (1426.4) MB, 21.0 / 0.0 ms  last resort
[9852:000001B18947FDB0]   770158 ms: Mark-sweep 1398.6 (1426.4) -> 1398.6 (1425.9) MB, 22.5 / 0.0 ms  last resort


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0000003FB9CA9891 <JS Object>
    2: new constructor(aka Request) [E:\nodecrawler\node_modules\request\request.js:128] [pc=0000004DC395835B](this=000000583D4E4AD1 <a Request with map 0000035C122A1B51>,options=000000583D4E4DC9 <an Object with map 0000025475FDD539>)
    4: request(aka request) [E:\nodecrawler\node_modules\request\index.js:54] [pc=0000004DC3957474](this=000000530CD82311 <undefined>,uri=000000583D4E8241 <an O...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
my code is here:
var crawler = require('crawler');
var config = require('./config');
var fs = require('fs');
// var videoList = config.videoList;
//var read = require('./read');
var debug = require('debug')('nightmare:crawler');

console.time("程序运行时间");//开始运行时间
//从videoList文件中获取视频列表
var read = fs.readFileSync(__dirname+'/videoList.json',function(err){
	if(err){
		console.log(err);
	}
	debug('reading videoList file');
	console.log('successed!');
});
var videoList = JSON.parse(read);

var informationList = [];
var c = new crawler({
	// maxConnection : 1000,
	//rateLimit : 1000,
	forceUTF8 : true,
	callback : function(error,res,done){
		if(error){
			console.log(error);
		}else{	
			information(res,done);
			done();
		}
		// console.log(informationList);
		// console.log(informationList.length);
	}
});

// // crawl the list of subscription accounts

for(i=0;i<videoList.length;i++){
	c.queue({
		uri: videoList[i].url,
		proxy: 'http://127.0.0.1:61481'
	});
}


targos added a commit to targos/node that referenced this pull request Oct 21, 2017

timers: cleanup extraneous property on Immediates
This was originally changed in 6f75b66
but it appears unnecessary and benchmark results show little difference
without the extra property.

Refs: nodejs#6436

targos added a commit that referenced this pull request Oct 23, 2017

timers: cleanup extraneous property on Immediates
This was originally changed in 6f75b66
but it appears unnecessary and benchmark results show little difference
without the extra property.

Refs: #6436
PR-URL: #16355
Reviewed-By: Luigi Pinca <luigipinca@gmail.com>
Reviewed-By: Jeremiah Senkpiel <fishrock123@rocketmail.com>
Reviewed-By: Anna Henningsen <anna@addaleax.net>
Reviewed-By: Anatoli Papirovski <apapirovski@mac.com>
Reviewed-By: James M Snell <jasnell@gmail.com>
Reviewed-By: Refael Ackermann <refack@gmail.com>

addaleax added a commit to ayojs/ayo that referenced this pull request Oct 26, 2017

timers: cleanup extraneous property on Immediates
This was originally changed in 6f75b66
but it appears unnecessary and benchmark results show little difference
without the extra property.

Refs: nodejs/node#6436
PR-URL: nodejs/node#16355
Reviewed-By: Luigi Pinca <luigipinca@gmail.com>
Reviewed-By: Jeremiah Senkpiel <fishrock123@rocketmail.com>
Reviewed-By: Anna Henningsen <anna@addaleax.net>
Reviewed-By: Anatoli Papirovski <apapirovski@mac.com>
Reviewed-By: James M Snell <jasnell@gmail.com>
Reviewed-By: Refael Ackermann <refack@gmail.com>

addaleax added a commit to ayojs/ayo that referenced this pull request Dec 7, 2017

timers: cleanup extraneous property on Immediates
This was originally changed in 6f75b66
but it appears unnecessary and benchmark results show little difference
without the extra property.

Refs: nodejs/node#6436
PR-URL: nodejs/node#16355
Reviewed-By: Luigi Pinca <luigipinca@gmail.com>
Reviewed-By: Jeremiah Senkpiel <fishrock123@rocketmail.com>
Reviewed-By: Anna Henningsen <anna@addaleax.net>
Reviewed-By: Anatoli Papirovski <apapirovski@mac.com>
Reviewed-By: James M Snell <jasnell@gmail.com>
Reviewed-By: Refael Ackermann <refack@gmail.com>