
memory issue #21

Closed
funston opened this issue Apr 27, 2011 · 2 comments

funston commented Apr 27, 2011

I'm trying to debug this, so I'm not sure whether it's a bug or whether you have ideas for a fix. I have a basic app with a parser set up, loading a 300 MB XML file.

I'm getting the following error, I believe from somewhere inside the parser:

FATAL ERROR: JS Allocation failed - process out of memory

Just curious whether you've tested anything large, before I dig into my own code/parsing, although I'm not even hitting the first open tag before memory exhausts. I can post a working gist, but essentially this is what's going on:

var fs = require('fs');
var sax = require('sax');  // this require was missing from the original snippet
var parser = sax.parser();

var fileStream = fs.createReadStream(process.argv[2], { bufferSize: 4 * 1024 });
var xml = '';
fileStream.addListener('data', function (chunk) {
  xml += chunk;
});
fileStream.addListener('close', function () {
  console.log('file read into xml');
  parser.write(xml).close();
});

thanks


funston commented Apr 27, 2011

btw, I also tried just writing the chunks to the parser inside the file read stream, but I get this error:

/usr/local/lib/node/.npm/sax/0.1.2/package/lib/sax.js:257
while (parser.c = c = chunk.charAt(i++)) {

The docs say: "write - Write bytes onto the stream. You don't have to do this all at once. You can keep writing as much as you want."

Do you have a simple example of writing chunks to the parser rather than one entire buffer? Perhaps that would help with the memory issue.
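For what it's worth, a minimal sketch of the chunked-write pattern looks like this. The `parser` object below is a stub with the same `write()`/`close()` surface as `sax.parser()`, so the sketch runs without the sax module installed; with the real library you would call `sax.parser()` instead.

```javascript
// Stub standing in for sax.parser() -- same write()/close() surface.
var received = '';
var parser = {
  write: function (s) { received += s; return this; },  // accumulate for the demo
  close: function () { return this; }
};

var xml = '<root><item>hello</item></root>';

// Feed small slices one at a time. A streaming parser keeps its own state
// between write() calls, so nothing requires the whole document in memory.
for (var i = 0; i < xml.length; i += 8) {
  parser.write(xml.slice(i, i + 8));
}
parser.close();

console.log(received === xml);  // the parser saw the full document
```

The same loop shape applies when the chunks come from a read stream's `data` events instead of slicing a string.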


funston commented Apr 27, 2011

Okay, I plowed ahead and looked at the prettyprint example. I noticed I wasn't passing the utf8 encoding to the file stream, and with that the parser error above goes away (as does the memory issue, if I write in chunks rather than the entire buffer).

sorted.
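That fix makes sense given the stack trace above: without an encoding, the stream's `data` events deliver Buffer objects, and the `chunk.charAt(i++)` loop inside sax.js breaks because Buffer has no `charAt()`. A short demonstration (using the modern `Buffer.from`; 2011-era Node spelled this `new Buffer()`):

```javascript
// What you get from a read stream WITHOUT an encoding: a Buffer.
var buf = Buffer.from('<root/>', 'utf8');
console.log(typeof buf.charAt);   // 'undefined' -- Buffers have no charAt()

// What you get WITH { encoding: 'utf8' }: a string, which charAt() supports.
var str = buf.toString('utf8');
console.log(str.charAt(0));       // '<'
```

So passing the encoding (or calling `setEncoding('utf8')` on the stream) is what lets sax's character-by-character loop work on each chunk.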

@funston funston closed this as completed Apr 27, 2011