JS Allocation failed - process out of memory #3340

cliffano opened this Issue Apr 12, 2013 · 30 comments



I got a "JS Allocation failed - process out of memory" error when trying to npm publish a large module (80MB) with npm v1.2.18.

The problem did not occur with npm v1.2.11.


We are also seeing this problem come up sometimes with npm 1.2.15 and a 50MB tarball.


Happens a lot for me as well, quite often with the same module, when installing a large module (i.e. tzwhere, which includes a lot of information about timezones).

npm member

@sideshowcoder I can reproduce the problem with tzwhere, but I found that I can work around it by downloading the tarball and installing from that directly. That is:

$ curl -o tzwhere.tgz `npm info tzwhere dist.tarball`
$ npm i tzwhere.tgz

I believe the cause of the problem in this particular case is a 60MB JSON file (tz_world.json), which is probably being parsed when it doesn't need to be. That needs more digging than I have time for right now, though.
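Whether tz_world.json is actually being parsed is still the open question here, but as a scaled-down, self-contained sketch (sizes and shapes are illustrative, not taken from tzwhere) of why an unnecessary JSON.parse of a large file hurts: parsing materializes the full object graph, so the heap cost is a multiple of the file's byte size.

```javascript
// Scaled-down sketch: JSON.parse builds the whole object graph in memory,
// so heap cost is a multiple of the JSON text's size.
// (Entry count and shapes are illustrative, not taken from tz_world.json.)
const payload = JSON.stringify(
  Array.from({ length: 100000 }, (_, i) => ({ id: i, name: 'tz_' + i }))
);

const before = process.memoryUsage().heapUsed;
const parsed = JSON.parse(payload); // allocates ~100k objects plus strings
const after = process.memoryUsage().heapUsed;

console.log('JSON text bytes:', payload.length);
console.log('heap growth bytes:', after - before);
console.log('entries parsed:', parsed.length);
```

On a 60MB file the same multiplier applies, which is consistent with parsing alone pushing a 32-bit-era V8 heap toward its limit.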


Installing it by itself works as well, so there's no need to download it first. For me it only fails when other dependencies are also specified in the package.json. The JSON file surely should not be parsed; any hints on where to look if that is the case? I can probably check it out myself.


I just upgraded to node v0.10.12 and npm v1.2.32, and it seems to fix the problem.

@taromorimoto Do you still get the error on the 50MB tarball?

Edit: to clarify, it's now intermittent for me on v1.2.32; it consistently failed on v1.2.18 and consistently worked on v1.2.11.


I get the same issue with poker-evaluator


node v0.10.20 + npm v1.3.11 is giving me this problem as well, for a GitHub repository containing multiple zip files totaling ~16MB.


I have a similar issue, "FATAL ERROR: JS Allocation failed - process out of memory", when trying to npm install a 12MB .tgz package.

node --version
npm --version

on CentOS 6.5 64bit.

Strangely enough, installing the package works fine after manually running tar xfvz.


Was there any resolution to this issue? I'm running a private registry dealing with a package that is 203MB (81MB as a tarball), and npm is routinely (~40% of the time) running out of memory on both publish and install of the package.

The memory usage swells up to over 1GB, which seems strange since that's over 4 times the uncompressed package size.

Is there some way to force npm to call node with a --max-old-space-size override? I imagine this would avoid the problem on 64bit machines at least...
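One way to approach the question above: npm's CLI is itself a Node script, so it can be launched under node with explicit V8 flags. A minimal sketch, assuming a typical global-install path for npm-cli.js (on your machine, resolve the symlink at `which npm` to find the real location; the 4096 MB value is also just an example):

```javascript
// Sketch: build the command line that runs npm under node with a larger
// V8 old-space limit (value in MB). The npm-cli.js path is a typical
// global-install location and is an assumption; resolve the symlink at
// $(which npm) to find yours.
const npmCli = '/usr/local/lib/node_modules/npm/bin/npm-cli.js'; // assumed path
const args = ['--max-old-space-size=4096', npmCli, 'install'];

// Print the command; to actually run it, pass args to
// require('child_process').spawnSync('node', args, { stdio: 'inherit' }).
console.log(['node'].concat(args).join(' '));
```

This sidesteps whatever default heap cap the bundled node binary applies, at least on 64-bit machines with enough RAM.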

node --version
npm --version

on OSX 10.8.5 with 64bit mac

npm member

I get this pretty reproducibly running npm install within the npmjs.org repo, where nothing in that dependency tree should be using a ton of memory.

node --version
npm --version
npm member

@terinjokes Weird, can you run with -ddd and paste the output? (Presumably it's crashing before it can write an npm-debug.log file?)

npm member

Affirmative, and I can't get the kernel to write out the core dump.

npm member

Nevermind, I can reproduce. Super strange.

npm member

Seems like this happens the first time, but then the second time it works. I think it's that old bug in node-tar where it hangs onto big files in memory inappropriately, coupled with the fact that something is using jsontool@4.0.1 (latest is 7.x).

npm member

Ah. I'm running it as part of the docker stuff, so it failing the first time fails the entire build.

npm member

Even when I run interactively multiple times, I still fail extracting jsontool:

$ npm install
npm WARN package.json npmjs.org@2.0.3 No repository field.
npm http GET http://registry.npmjs.org/couchapp
npm http GET http://registry.npmjs.org/semver
npm http GET http://registry.npmjs.org/rimraf
npm http GET http://registry.npmjs.org/tap
npm http GET http://registry.npmjs.org/jsontool
npm http 304 http://registry.npmjs.org/semver
npm http 304 http://registry.npmjs.org/couchapp
npm http 304 http://registry.npmjs.org/tap
npm http 304 http://registry.npmjs.org/jsontool
npm http GET http://registry.npmjs.org/jsontool/-/jsontool-4.0.1.tgz
npm http 200 http://registry.npmjs.org/jsontool/-/jsontool-4.0.1.tgz
FATAL ERROR: JS Allocation failed - process out of memory
npm member

That something that is using the old jsontool is npmjs.org. Submitted a PR at npm/npm-registry-couchapp#153.

npm member

Oh, wait, srsly? Ha.

Well... it also shouldn't be crashing npm, so that's definitely a bug.


We have a similar issue on our Travis build: https://travis-ci.org/Ecodev/gims/jobs/20133256.

Hard to say what's wrong without more details, but could it be the extraction of the archive...?

redaxmedia referenced this issue in yaniswang/grunt-htmlhint on Mar 29, 2014

Timeout while fetching your dependencies #7


I have the problem when running an npm search readme query.

$ npm search readme --verbose
npm info it worked if it ends with ok
npm verb cli [ 'node',
npm verb cli   '/Users/thomaspa/.nvm/v0.10.26/bin/npm',
npm verb cli   'search',
npm verb cli   'readme',
npm verb cli   '--verbose' ]
npm info using npm@1.4.9
npm info using node@v0.10.26
npm verb url raw /-/all/since?stale=update_after&startkey=1399564359560
npm verb url resolving [ 'http://registry.nodejitsu.com/',
npm verb url resolving   './-/all/since?stale=update_after&startkey=1399564359560' ]
npm verb url resolved http://registry.nodejitsu.com/-/all/since?stale=update_after&startkey=1399564359560
npm info trying registry request attempt 1 at 16:54:39
npm http GET http://registry.nodejitsu.com/-/all/since?stale=update_after&startkey=1399564359560
npm http 200 http://registry.nodejitsu.com/-/all/since?stale=update_after&startkey=1399564359560
FATAL ERROR: JS Allocation failed - process out of memory
Abort trap: 6
othiym23 added the bug label on Nov 27, 2014

Similar issue with more recent versions of npm and node.

[ec2-user@ip-172-30-1-7 npm_testing]$ npm search anything --verbose
npm info it worked if it ends with ok
npm verb cli [ '/home/ec2-user/.nvm/v0.10.33/bin/node',
npm verb cli   '/home/ec2-user/.nvm/v0.10.33/bin/npm',
npm verb cli   'search',
npm verb cli   'anything',
npm verb cli   '--verbose' ]
npm info using npm@1.4.28
npm info using node@v0.10.33
npm info get /home/ec2-user/.npm/registry.npmjs.org/-/all/.cache.json
npm WARN Building the local index for the first time, please be patient
npm verb request where is /-/all
npm verb request registry https://registry.npmjs.org/
npm verb request id 407c916b6d4a37ed
npm verb url raw /-/all
npm verb url resolving [ 'https://registry.npmjs.org/', './-/all' ]
npm verb url resolved https://registry.npmjs.org/-/all
npm verb request where is https://registry.npmjs.org/-/all
npm info trying registry request attempt 1 at 23:59:54
npm http GET https://registry.npmjs.org/-/all
npm http 200 https://registry.npmjs.org/-/all
FATAL ERROR: JS Allocation failed - process out of memory

Seeing the same issue as @ronaldpetty


Consistent out-of-memory when installing a package with large JSON files

╭─{👘 } kumavis in ~/Development/Node/vapor on (master✱) 
╰─± npm i "git+https://github.com/ethereum/tests.git#develop"
npm WARN package.json vapor@0.0.0 No repository field.
FATAL ERROR: JS Allocation failed - process out of memory
[1]    40566 abort      npm i "git+https://github.com/ethereum/tests.git#develop"
╭─{👘 } kumavis in ~/Development/Node/vapor on (master✱) 
╰─± node -v
╭─{👘 } kumavis in ~/Development/Node/vapor on (master✱) 
╰─± npm -v

here is the more verbose log via -ddd


@kumavis npm@1.3.7 is a year and a half old at this point, so it probably needs an upgrade, but beyond that, if you're getting a V8 error like that on a plain install, something else is going wrong. What kind of system are you running on, and how much memory does it have?


Yeah I just realized that it was super old! Installs fine now:

╭─{👘 } kumavis in ~/Development/Node/vapor on (master✱) 
╰─± npm -v

OSX on 2012 MBP with 16 jigglebits of memeroo, so that shouldn't have been the problem


I faced the problem while packing a package with about 500MB of data (with about 4GB of free RAM on my system at the moment). In my case it wasn't intentional, but I guess the problem is still there for people genuinely packing a package this large (though that may be questionable for an npm package).

Either a solution such as dstovell's or setting a size cap would resolve the issue.

nicks referenced this issue in Medium/phantomjs on Feb 3, 2016

Core Dump when download already found #462


npm 1.x is deprecated, so I'm going to close this issue presuming that this problem hasn't arisen since. Thanks for your time everyone!


This is still a problem:

$ npm --version
$ npm search anything

<--- Last few GCs --->

   13159 ms: Scavenge 986.8 (1434.3) -> 986.8 (1434.3) MB, 6.7 / 0 ms [allocation failure].
   13165 ms: Scavenge 986.8 (1434.3) -> 986.8 (1434.3) MB, 6.4 / 0 ms [allocation failure].
   14052 ms: Mark-sweep 986.8 (1434.3) -> 986.2 (1434.3) MB, 886.9 / 0 ms (+ 597.2 ms in 3750 steps since start of marking, biggest step 5.6 ms) [last resort gc].
   14892 ms: Mark-sweep 986.2 (1434.3) -> 986.2 (1434.3) MB, 840.5 / 0 ms [last resort gc].

<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x1c990ac9e59 <JS Object>
    1: stringify [native json.js:164] [pc=0x71d0d872117] (this=0x1c990ac36a9 <a JSON with map 0x126556c095e9>,z=0x6784f962329 <an Object with map 0xaacfdfdb61>,A=0x1c990a04189 <undefined>,N=0x1c990a04189 <undefined>)
    2: arguments adaptor frame: 1->3
    3: /* anonymous */ [/home/dsenn/.nvm/versions/node/v6.2.1/lib/node_modules/npm/lib/cache/update-index.js:96] [pc=0x71d0d85e2e1] (this=0x1c99...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
zsh: abort (core dumped)  npm search anything

I can reproduce this on several machines...


@topaxi you should take a look at #12619, which is the tracking issue for the generic case of npm search blowing up because the search index is too large for the current search implementation to handle. There's a whole set of related issues around memory exhaustion, though, and Emma is right that we no longer support npm@1.
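The stack trace above (JSON.stringify inside lib/cache/update-index.js) matches that diagnosis: the old search path holds the downloaded index text, the parsed object graph, and a re-stringified cache copy in memory at once, so peak heap is several times the index's on-the-wire size. A scaled-down, self-contained illustration of that pattern (entry count and record shapes are illustrative):

```javascript
// Scaled-down sketch of the pattern in the trace above: three live copies
// of the search index coexist, so peak heap is a multiple of the wire size.
// (Entry count and record shapes are illustrative, not real registry data.)
const index = {};
for (let i = 0; i < 50000; i++) {
  index['pkg-' + i] = { name: 'pkg-' + i, description: 'x'.repeat(40) };
}

const wire = JSON.stringify(index);   // copy 1: the downloaded /-/all text
const parsed = JSON.parse(wire);      // copy 2: the parsed object graph
const cache = JSON.stringify(parsed); // copy 3: text destined for .cache.json

console.log('wire bytes:', wire.length, '| entries:', Object.keys(parsed).length);
```

With a full-registry index measured in hundreds of megabytes, that multiplier alone is enough to blow the default V8 heap, which is why the fix tracked in #12619 moves away from downloading the whole index.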


@othiym23 alright, thanks :)
