
Why not for of? #1

Open
qm3ster opened this issue May 23, 2020 · 4 comments
Comments


qm3ster commented May 23, 2020

Would defining the functions like this instead impact performance?

function* map(it, fn) {
  for (const x of it) yield fn(x);
}
function* take(it, n) {
  for (let i = 0; i < n; i++) {
    const { value, done } = it.next();
    if (done) return; // stop instead of yielding undefined past exhaustion
    yield value;
  }
}
function* skip(it, n) {
  for (let i = 0; i < n; i++) it.next();
  for (const x of it) yield x;
}
function first(it) {
  return it.next().value;
}
function* filter(it, p) {
  for (const x of it) if (p(x)) yield x;
}
function* from_range(s, e = Infinity) {
  for (let i = s; i < e; i++) yield i;
}
function* from_fn(fn, s) {
  let i = 0;
  while (true) yield (s = fn(i++, s));
}
function reduce(it, fn, acc) {
  for (const x of it) acc = fn(acc, x);
  return acc;
}
function* chain(...its) {
  for (const it of its) for (const x of it) yield x;
}
const pipe = (...fns) => (x) => {
  for (const fn of fns) x = fn(x);
  return x;
};
const c = (fn) => (...args) => (x) => fn(x, ...args);

const line = pipe(
  c(from_fn)(),
  c(map)((x) => (x * 100) | 0),
  c(take)(3),
  c(chain)(["arrays", "are", "iterators"]),
  c(map)(String),
  c(reduce)((a, b) => a + b, ""),
  c(map)((x) => x + x + x + x + "\n") // strings are iterators
);
console.log(...line(Math.random));
@WimJongeneel
Owner

I didn't run any tests, but my gut says:

  • on small collections this is faster due to having fewer iterators (which carry considerable overhead)
  • on big collections it is slower because the ... operator (shallowly) clones all the items in the collection, which at some point will cost more than the gains of having fewer iterators

But this is just speculation; the only way to know for sure is to test a variety of cases. It will probably also depend on the runtime you are using, as iterators are a recent addition and I expect more modern runtimes to have more efficient implementations of them.


qm3ster commented May 25, 2020

Not sure what you mean; I didn't really use ... much. Only in arguments, and I could just as well take arrays in those two cases.

Oh, if you mean c, the currying function, then sure, it's probably bad news, and I should just rewrite the functions as, e.g.

const filter = p => function* filter(it) {
  for (const x of it) if (p(x)) yield x;
}

But I meant the fact that you manually call .next() a lot.
I'd assume for of optimizes better, even over manual implementations of the iterator protocol?
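Spelling that idea out (a hypothetical sketch, not code from this thread): if every combinator takes its options first and returns a generator function, the c helper disappears and pipe composes them directly. The map, take, and pipe below are illustrative re-definitions in that style.

```javascript
// Each combinator takes its options first and returns a generator
// function over an iterable, so no generic currying helper is needed.
const map = (fn) =>
  function* (it) {
    for (const x of it) yield fn(x);
  };

const take = (n) =>
  function* (it) {
    let i = 0;
    for (const x of it) {
      if (i++ >= n) return; // stop early; no undefined past exhaustion
      yield x;
    }
  };

// pipe threads a value through the functions left to right.
const pipe = (...fns) => (x) => fns.reduce((acc, fn) => fn(acc), x);

const firstThreeDoubled = pipe(map((x) => x * 2), take(3));
console.log([...firstThreeDoubled([1, 2, 3, 4, 5])]); // logs [ 2, 4, 6 ]
```

Whether this composes faster than the .next()-based versions is exactly the open question of this thread.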


dvester commented Feb 20, 2021

So I tested the difference between using for of and interacting with the iterator via next(), and the difference was huge, in favor of the direct interaction. Using for of was significantly slower even for small arrays. Oddly, with for of the chain didn't seem to benefit from the deferred execution at all: as the original array size increased, so did the time it took to execute the method chain. The same was not true when interacting directly with the next() function.

@XantreDev

Let's make a benchmark.
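One way such a benchmark could look (a rough sketch; the helper names and the input size are made up for illustration, and absolute timings will vary widely by runtime and JIT warm-up):

```javascript
// Two equivalent map implementations: one driven by for...of,
// one driven by manual next() calls.
function* mapForOf(it, fn) {
  for (const x of it) yield fn(x);
}
function* mapNext(it, fn) {
  while (true) {
    const r = it.next();
    if (r.done) return;
    yield fn(r.value);
  }
}

// Consume an iterator fully, counting the items produced.
function drain(it) {
  let n = 0;
  for (const _ of it) n++;
  return n;
}

function* range(n) {
  for (let i = 0; i < n; i++) yield i;
}

const N = 1e6;
for (const [name, mapper] of [["for of", mapForOf], ["next()", mapNext]]) {
  const t0 = Date.now();
  drain(mapper(range(N), (x) => x * 2));
  console.log(name, Date.now() - t0, "ms");
}
```

A proper comparison would interleave many repetitions per variant and discard warm-up runs; this only shows the shape of the experiment.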
