On my Ubuntu virtual machine with 256 MB RAM + 4 GB swap, Node.js 0.10.x:
npm search test
WARN Building the local index for the first time, please be patient
info trying registry request attempt 1 at 09:25:40
http GET http://registry.npmjs.org/-/all
http 200 http://registry.npmjs.org/-/all
When I bump the memory on the virtual machine to 512 MB or higher, it works perfectly. There are no permission issues, and this is still an issue with the latest version from git. I've decided to just give my VM 512 MB, but this is an issue that should at least be documented.
Does it work if you set npm config set jobs 1?
npm config set jobs 1
I should have added that npm config set jobs 1 does help (it reduces the memory requirement by almost half). Without setting that, it needed close to a gigabyte of RAM to avoid crashing. I forgot to mention it, as it was one of the many things I tried while getting it to work.
Does it still die after that?
Yes. That setting reduces the required memory to an acceptable level, but npm still dies if system memory is less than about 512 MB.
Trying to use node on a 128 MB RAM VPS and ran into the same problem. npm config set jobs 1 did not help.
As we start using more streams in npm memory requirement should be reduced, but I'm afraid there's no eta on that.
I'd like to bump this: on a fresh VM on DigitalOcean's cheapest plan (512 MB RAM, Ubuntu 12.04), npm gets killed even with jobs set to 1. I'd say this should be a priority, as there's no way to use npm.
For what it's worth, I seem to be able to use npm install without problems; it is only npm search that won't work. This doesn't seem critical to me, as I can always read package descriptions on the laptop I'm remoting in from.
I can only install local packages; for all the other ones, npm starts the first-time search index build.
I had some luck working around this by copying ~/.npm/-/all/.cache.json from another machine.
Hit this too on a 512 MB VM, npm search is what's affected. Setting jobs 1 finally let me get the initial index, but I had to kill basically everything else to let it do that.
Not a huge deal to grab a fresh copy from an identical system that has more ram, as @bhyde suggests.
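A minimal sketch of that cache-copy workaround, assuming a machine named bighost that has already built the index (the hostname is illustrative; the cache path is the one mentioned above):

```shell
# On the low-memory machine: copy the pre-built search index
# from a machine that has already completed `npm search` once.
mkdir -p ~/.npm/-/all
scp bighost:~/.npm/-/all/.cache.json ~/.npm/-/all/.cache.json
```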
Another (sort of dangerous) option is to pop another terminal, find the PID in /proc as root and disable the OOM killer for that process (will thrash like crazy, but should work), or background the initial npm search. E.g.:
npm search foopackage &
echo "-17" > /proc/$!/oom_adj   # $! is the PID of the backgrounded npm search
Needs to be done as root, but will prevent the kernel from touching the process when it goes on its serial killer rampage looking for victims as memory is eaten up.
At your own risk, of course :)
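Note that on newer kernels the oom_adj knob is deprecated in favor of oom_score_adj, which ranges from -1000 to 1000, with -1000 exempting the process from the OOM killer entirely. A sketch of the same trick using it (foopackage is a placeholder as above; run as root):

```shell
# Background the search, then exempt that process from the OOM killer.
npm search foopackage &
echo -1000 > /proc/$!/oom_score_adj
wait   # block until the search finishes
```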
I just installed the current version of nodejs, v0.10.24 which came with npm 1.3.21, and the issue is still occurring even though the issue was reported 5 months ago. Sad.
Loosely Related: #4429
I had the same issue, and using a swap file worked for me as a workaround, on a 512 MB VPS.
(node v0.11.13-pre, npm 1.4.7)
^ Thanks for the link. Worked for me too.
The link worked for me as well. Thank you.
Swap files are only a workaround; I think npm should be able to manage its resources correctly, using its own custom swap files for dumping.
This happened to me also... my system had 1 GB of RAM. I had to do npm config set jobs 1 and run it twice.
total used free shared buffers cached
Mem: 988 159 828 0 0 59
-/+ buffers/cache: 100 888
Swap: 0 0 0
We've been struggling with this issue for the past few days.
Me too, I had no problems with npm search until now.
Why does it have to build the index locally? Wouldn't it be much more user-friendly if npm simply downloaded a pre-built cache and used that? Just like apt.
Tried adding a 1 GB swap file on a DigitalOcean server with 512 MB RAM, and the npm process was still killed. Upped it to a 2 GB swap file, and it finally worked without dying...
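For anyone else trying this, a sketch of creating and enabling a swap file on Linux, sized at the 2 GB figure reported above (path and size are illustrative; run as root):

```shell
# Create a 2 GB swap file and enable it immediately.
fallocate -l 2G /swapfile   # or: dd if=/dev/zero of=/swapfile bs=1M count=2048
chmod 600 /swapfile         # swap files must not be readable by other users
mkswap /swapfile            # write swap-area metadata
swapon /swapfile            # activate it
swapon --show               # verify it is in use
```

Add an entry to /etc/fstab if you want the swap file to persist across reboots.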
This is definitely a real issue, and we're discussing how to fix it over on #6016. If we want to preserve offline search, we're going to need a better solution than reading a file containing all of the registry's metadata in a single object literal.
@othiym23 issue closed?
This is a duplicate of #6016, more or less, so I'm closing it in favor of that issue. npm is a victim of its own success.
A 2 GB swap file worked for me.
Just a note here to ping people with small-memory VMs - I have a candidate replacement for npm search in https://github.com/smikes/npm-kludge-search. It uses a streaming JSON parser so total startup memory is not so bad.
While building the index, total memory usage peaks at 750–1000 MB, but the medium-term goal would be to build the index file centrally and distribute that. When I run a search on a previously-built index, max RSS usage is 180 MB for a full table scan and ~50–80 MB for an indexed lookup.
Please give it a try!
npm install -g npm-kludge-search
I created a recipe for Chef, and it always ran out of memory for me.
GENIUS, Smikes!!! Fixed for me!!! Thanks!!!
I am still having the same issue. For me, it fails both with and without npm config set jobs 1.
node: v5.3.0, npm: 3.3.12
$ npm search <package-name>
npm WARN Building the local index for the first time, please be patient
node: v4.2.3, npm: 2.14.7
OpenVZ VPS config
$ free -m
total used free shared buffers cached
Mem: 4096 546 3549 10 0 313
-/+ buffers/cache: 232 3863
Swap: 4096 0 4096
Filesystem 1K-blocks Used Available Use% Mounted on
/dev/ploop38468p1 14989384 9256956 5102024 65% /
devtmpfs 2097152 0 2097152 0% /dev
tmpfs 2097152 0 2097152 0% /dev/shm
tmpfs 2097152 8320 2088832 1% /run
tmpfs 2097152 0 2097152 0% /sys/fs/cgroup
tmpfs 419432 0 419432 0% /run/user/0
$ cat /etc/redhat-release
CentOS Linux release 7.0.1406 (Core)
$ uname -a
Linux vm113 2.6.32-042stab108.5 #1 SMP Wed Jun 17 20:20:17 MSK 2015 x86_64 x86_64 x86_64 GNU/Linux