
Fix "Maximum call stack size exceeded" warnings when processing lots of items #8

Open
wants to merge 1 commit into from

2 participants

mcarlson, Dominic Tarr

Dominic Tarr
Owner

If you are getting a stack overflow, it must mean that you are using a sync transform?
If you are not using an async map, you are probably better off just using through, it's much simpler.
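A minimal sketch of why a synchronous mapper blows the stack. This is not map-stream's actual code; `pump`, `mapper`, and `next` are hypothetical names. The point is that a mapper which calls its callback synchronously re-enters the drain loop without ever returning, so stack depth grows with the number of items:

```javascript
// Hypothetical pump: feeds items to a mapper, advancing via the callback.
function pump(items, mapper, done) {
  var i = 0;
  (function next() {
    if (i >= items.length) return done();
    mapper(items[i++], next); // a sync callback makes this direct recursion
  })();
}

var overflowed = false;
try {
  // Sync mapper: calls back immediately on the same stack, one frame per item.
  pump(new Array(100000).fill(0), function (item, cb) { cb(); }, function () {});
} catch (e) {
  overflowed = /call stack/i.test(e.message);
}
console.log('overflowed:', overflowed);
```

An async mapper (one that calls back from an I/O callback or a deferred tick) returns to the event loop between items, so the stack never grows.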

mcarlson

One of my transforms was indeed synchronous. I moved it out of the async map and it's more reliable - see https://github.com/mcarlson/punitive/blob/async/punitive.js#L81. I still get "RangeError: Maximum call stack size exceeded" for certain large datasets. In this case, 22844 items in the 'filtered' array. It's all 100% async now, so that's not the issue.

I haven't been able to find a workaround for this other than the pull request I sent. Any other ideas?

Still stuck here, even after wrapping the callback in process.nextTick() myself...

Dominic Tarr
Owner

fs.readFile isn't sync. Hmm, you shouldn't need process.nextTick if it's in the callback of fs.readFile. Are you sure this only works if it's there?

Hmm. It looks like you are making a key-value database, except with files?

Oh... I just looked at your code and saw you also had a leveldb branch.
Do you have the same problem with leveldb? I was about to recommend using leveldb for this.

mcarlson

I know, right? It should be working. Still, unless I patch map-stream directly, the issue occurs. I'm sure this will be an ongoing issue for other users... Check out this issue where the author of async throws up his hands and adds process.nextTick: caolan/async#75

I love event-stream, it's beautiful stuff. I'd love to keep using it and know that it will always reliably work. What are your reservations about adding process.nextTick()? Are you concerned about process.nextTick() degrading performance?

FWIW, LevelDB works but the performance isn't nearly as good. It's hard to beat the raw filesystem for speed. That said, leveldb does mostly work - but it's really not designed for appending to values as I'm doing with the lovely fs.appendFile(), so some values get dropped on the floor. I'd need to add key-specific locking or somesuch to levelup to make it work reliably.

Ahhh, life on the bleeding edge, huh?

Dominic Tarr
Owner

Okay so I don't want to merge a work around,
I want to get to the root of the problem.
Not just to fix map-stream, but also to understand node.

Can you give me a testcase that reproduces this?

Hmm, so you are using leveldb the best way.
You can do something just like appends with leveldb,
just make the key phon+'!'+word. And then pull the values out with a read stream.
Also, use level-writestream.
You should post an issue on leveldb though, and we'll discuss that there.

mcarlson

Sure, if you follow the instructions to run https://github.com/mcarlson/punitive/tree/async you'll see it blows up every time without my patch.

Showing with 4 additions and 1 deletion.
index.js: +4 −1
@@ -48,7 +48,10 @@ module.exports = function (mapper) {
     if (writeQueue.hasOwnProperty(nextToWrite)) {
       var dataToWrite = writeQueue[nextToWrite]
       delete writeQueue[nextToWrite]
-      return queueData(dataToWrite, nextToWrite)
+      process.nextTick(function() {
+        queueData(dataToWrite, nextToWrite);
+      })
+      return
     }
     outputs ++
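The patch defers the recursive queueData call to process.nextTick, so each queued item starts from an empty stack instead of nesting inside the previous call. The same trampoline idea in isolation (`drain` is a hypothetical name, not map-stream code):

```javascript
// Draining a queue by direct recursion costs one stack frame per item;
// scheduling each step on process.nextTick keeps the stack flat, at the
// cost of a deferred tick per item.
function drain(queue, write, done) {
  (function step() {
    if (queue.length === 0) return done();
    write(queue.shift());
    process.nextTick(step); // was: return step() -> O(n) stack depth
  })();
}

var seen = 0;
drain(new Array(50000).fill(0), function () { seen++; }, function () {
  console.log('drained', seen, 'items');
});
```

This is the performance trade-off mcarlson asks about: the extra ticks add scheduling overhead, but they bound stack depth regardless of queue size.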