This repository has been archived by the owner. It is now read-only.

Speeding up `npm install` in docker #8836

Closed
olalonde opened this Issue Jul 6, 2015 · 42 comments

Comments


olalonde commented Jul 6, 2015

I'm using a Dockerfile.dev file for Node.js development and am using the following "trick" to only npm install when package.json is modified:

COPY package.json /src/
RUN npm install --unsafe-perm
COPY . /src

However, as I frequently install new modules and the npm install is quite slow, this is not good enough. I've tried mounting my local node_modules as a mounted volume, but it seems npm install installs everything from scratch regardless of whether there are already installed modules in node_modules/. Is this correct? If so, are there any flags or commands I can use to npm install only the modules that aren't already present in node_modules/? Would mounting ./docker_npm_cache:/root/.npm in the Docker container be another suitable option?

@iarna iarna added the support label Jul 6, 2015

vectart commented Jul 23, 2015

Got this problem, too

catherinetcai commented Aug 18, 2015

@olalonde, @vectart - npm install is very, very slow in Docker.

I found what helped was caching the npm install layer and explicitly pointing npm at a registry (I use a private registry, but I found that just setting the public npm registry sped things up too), so:

WORKDIR /tmp
ADD package.json /tmp/
RUN npm config set registry https://registry.npmjs.org/
RUN npm install
RUN cp -a /tmp/node_modules /app/

It's still rather slow, but instead of waiting 30+ minutes, it brings the time down to probably 10-15 minutes. If you're on OS X, I would download the Caffeine app, run it, and walk away.

nmccready commented Oct 22, 2015

Which version of npm are you using? Just double-checking 2.x vs 3.x, as 3.x is much, much slower.

brownoxford commented Oct 22, 2015

I'm seeing the same issue reported by @olalonde. I'm just using Docker as a container to run npm install as part of a build process for my app:

docker run --rm -v $(pwd):/app -w /app node:4.2.1 npm install

No matter what I do this keeps installing everything from scratch... This image provides npm v2.14.7 and node v4.2.1.

olalonde commented Oct 23, 2015

I've been using this Dockerfile as a workaround so I don't always have to build everything from scratch as long as my package.json hasn't changed. It's still slow, though, and I ended up giving up on Docker for local Node.js development. I now only use Docker to run backend services like postgres or rabbitmq, and run the Node.js code locally. I still provide the Dockerfile/docker-compose.yml in my repository to help people who don't have a local Node.js install, but I don't personally use it anymore.

FROM node:0.12

RUN mkdir -p /app
WORKDIR /app

ENV PATH=/app/node_modules/.bin:$PATH

# We add package.json first so that the docker image build
# can use the cache as long as the contents of package.json
# haven't changed.

COPY package.json /app/
RUN npm install --ignore-scripts --unsafe-perm

COPY . /app

CMD npm run postinstall && npm start
EXPOSE 5000

cancan101 commented Oct 29, 2015

I believe that I am seeing similar slowness (I am using docker-machine with Virtualbox). This might be related: http://qiita.com/neofreko/items/c36b3fd14dc77ab18a1a.

withinboredom commented Nov 9, 2015

It has to do with how docker uses shared volumes. The disk volume is extremely slow. I haven't tried copying the package.json to a directory elsewhere and doing the npm install there, then copying the node_modules back. I'll report here after I do.

NickClark commented Nov 20, 2015

I've reverted to just using a symbolic link to my dependencies instead of copying them.
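For anyone wanting to try the symlink route, here is a minimal hypothetical sketch (the /deps path and node:4.2.6 tag are assumptions, not from this thread): dependencies are installed once outside the source tree, and node_modules is just a link.

```dockerfile
# Hypothetical sketch of the symlink approach.
FROM node:4.2.6

# Install dependencies outside the (possibly bind-mounted) source tree.
COPY package.json /deps/
RUN cd /deps && npm install

WORKDIR /app
# Node resolves requires through the symlink, so the source tree
# never carries its own copy of node_modules.
RUN ln -s /deps/node_modules /app/node_modules
```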

max-mykhailenko commented Jan 19, 2016

Does nobody use Docker with npm? With npm install times of around 30+ minutes, all this Docker stuff should burn in hell :) How can I improve performance, and why is it so slow?

Hm, it's not only an npm problem. When I try to update an Ubuntu image in Docker, it's very slow too.

Andyccs commented Jan 19, 2016

Hi @max-mykhailenko, so far I've found that the solution by @blackjackcf above is the best workaround.

mariohmol commented Jan 21, 2016

I have the same problem! Using @blackjackcf's approach helped a little, but it's roughly the same. For builds in other environments (staging, production) it's OK to install everything again, but for local development it's crazy!

What I did was remove node_modules from .dockerignore and ADD node_modules directly to /opt/app/node_modules.

Does anyone have an idea for caching all modules in the container?

Misterhex commented Jan 23, 2016

+1. An out-of-the-box Yeoman angular-fullstack project takes over 30 minutes to install.

mariohmol commented Jan 23, 2016

What made a LOT of difference for us was changing the network configuration in the Docker VM from NAT to Bridge (this only applies on Mac). Now I can do all installs in 5 minutes.

vjpr commented Jan 28, 2016

My node_modules is 400MB. An npm install from scratch takes a lifetime to finish (30 min+).

If you really want to speed up the npm install during the Dockerfile build process, you can copy your existing node_modules from your host machine to your Docker image over ssh.

This allows you to build node_modules from scratch natively (or use your existing dev node_modules) on your host, which is faster.

So basically you never have to wait for an unnecessary npm install anymore. This is useful when you want to debug while preparing a production Dockerfile for your CI or gold-master build process.

If I used Docker for dev (I prefer to develop natively), I would use docker-machine-nfs and run npm install in a running container using an NFS volume.


Speed

NOTE: This is still quite slow for me on OSX.

Timing each command separately:

  • [osx] tar -zc compress = 25s for ~400MB
  • [osx] ssh copying = 3s for ~100MB
  • [docker container] tar -zx extract = 20s

Total time to copy 400MB node_modules = 2m 40s.

So it should be taking ~1 min; I suspect the overhead comes from the way I pipe over ssh instead of just copying a single file.

If everything is up to date, npm install (@2.x) takes 35s with my particular tree. On OSX it takes 28s.


What about just removing node_modules from the .dockerignore?

With my ssh approach you only have to copy the new node_modules when your package.json changes. Sending node_modules as part of the build context, by contrast, happens on every build (no caching), so it would have to be fast; for me this was a no-go.

And another benefit is you are in complete control of how your node_modules are copied - I'm sure there are some optimizations such as using rsync, maintaining a pre-zipped node_modules, etc.


Here is the code:

RUN mkdir -p /var/www/current
WORKDIR /var/www/current

COPY package.json .

ARG HOST_SSH="user@your-host-ip"
ARG HOST_ROOT="full-path-to-your-source-code-on-host"
ARG HOST_SSH_PASSWORD="ssh-password"

RUN apt-get update && apt-get install -y sshpass
RUN set -x && sshpass -p $HOST_SSH_PASSWORD ssh -o StrictHostKeyChecking=no $HOST_SSH "tar -zc -C $HOST_ROOT node_modules" | tar -zx --verbose -C /var/www/current/
# Remove `npm link`ed dependencies.
RUN find node_modules -maxdepth 1 -type l -exec rm -f {} \;

# Takes 1 minute.
RUN npm install

# Takes another minute.
RUN npm rebuild

# Copy rest of your source code.
COPY . .

And a script to run it. NOTE: You must run this script from your project root.

#!/bin/sh

set -x

BASEDIR=$(dirname $0)

docker build \
--file $BASEDIR/Dockerfile \
--tag foo \
--build-arg HOST_SSH=xxx \
--build-arg HOST_ROOT=xxx \
--build-arg HOST_SSH_PASSWORD=xxx \
.

Why SSH?

SSH is easy to set up on most machines. On OSX, go to Settings > Sharing > Remote Login.

We could also use HTTP, which would mean we could use ADD. E.g. ADD http://192.168.0.111/node_modules.tar.xz /node_modules. I wonder if it would be faster?

http://stackoverflow.com/questions/22907231/copying-files-from-host-to-docker-container


COMMIT

I just came across the docker commit command and it looks promising. Basically, run npm install outside your Dockerfile so you can use shared volumes: mount an NFS volume to your source code, start a container from your image, copy node_modules from the shared volume, run npm install, then commit the changes. Haven't tried it yet though.


ADD

If your node_modules is small enough, you could tar it before running the build, add it to your build context, then use ADD node_modules.tar.xz.

catherinetcai commented Jan 29, 2016

Another thing that might potentially alleviate the speed is this related issue: #11283 (comment). Apparently, the progress bar in npm 3 significantly slows down the npm install process. Turning it off might give it another bit of a speed boost.
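Turning the progress bar off is a one-line npm config change; a hedged Dockerfile sketch (the base image tag is an assumption):

```dockerfile
FROM node:4.2.6
# npm 3 renders a progress bar by default; disabling it was reported
# (see #11283) to speed up `npm install` significantly.
RUN npm config set progress false
COPY package.json /app/
WORKDIR /app
RUN npm install
```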

mariohmol commented Jan 29, 2016

My problem was a network problem: some connections were not closing, and npm opens thousands of simultaneous connections. You can see it if you check with nettop.

The problem with @vjpr's approach is that if you use OS X as your dev machine and Linux in the VM, you can't just copy; you have to compile the native modules again.

vjpr commented Jan 29, 2016

@mariohmol You can. You just have to run npm rebuild to rebuild the native deps, as the script does.

mariohmol commented Jan 29, 2016

True. In my case the problem was with downloading; this is a good solution!! =)

vjpr commented Jan 30, 2016

I have found a new approach...

1. Create a data volume container.

docker create -v /tmp/app --name foo-app--node-modules ubuntu

2. Mount your package.json file inside a temporary container, one directory below your data volume container's mount location, and run npm install.

docker run \
  --rm \
  --volumes-from=foo-app--node-modules \
  -v $PWD/package.json:/tmp/app/package.json:ro \
  node:4.2.6 \
  /bin/bash -c "cd /tmp/app/; npm install"

The file structure inside the temporary container will look like this:

- /tmp/app
  - package.json  <- host file from project context mount
  - /node_modules <- data volume container mount point (set during `docker create -v ...`)

3. Build your Node.js Docker image.

.dockerignore

node_modules
.git
etc.

Dockerfile

FROM node:4.2.6
WORKDIR /app
COPY . .
RUN ln -s /tmp/app/node_modules node_modules

Alternatively, just mount your project root to /app and add the symlink in a command, to avoid having to rebuild.

docker build -t foo-app .

4. Run with the data volume container.

docker run --volumes-from=foo-app--node-modules foo-app /bin/bash -c "node index.js"

You can also preload your node_modules from your local dev directory before running npm install for the first time, by mounting your project root in step 2.

docker run \
  --rm \
  --volumes-from=foo-app--node-modules \
  -v $PWD/package.json:/tmp/app/package.json:ro \
  -v $PWD/.:/tmp/app-host:ro \
  node:4.2.6 \
  /bin/bash -c "cp -R /tmp/app-host/node_modules /tmp/app; cd /tmp/app/; npm install"

bpanahij commented Jan 31, 2016

What I've done in the past is to have two Dockerfiles:

The first Dockerfile builds an image with just the package.json copied to /src/package.json, and an npm install run in /src. That creates a node_modules directory in /src.

Then I base a second Dockerfile on the first image and copy the source files to a subdirectory of /src, for instance /src/my-app/.

Dockerfile:
FROM first-image
...

I only rebuild the first image when I change dependencies.
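The two-Dockerfile setup above might be sketched like this (file names, image tags, and paths are illustrative assumptions, not from this thread):

```dockerfile
# Dockerfile.deps - rebuilt only when dependencies change.
# Build with e.g.: docker build -f Dockerfile.deps -t my-app-deps .
FROM node:4.2.6
COPY package.json /src/package.json
WORKDIR /src
RUN npm install

# Dockerfile - rebuilt on every source change; node_modules comes
# from the my-app-deps image, so npm install is skipped entirely.
FROM my-app-deps
COPY . /src/my-app/
WORKDIR /src/my-app
CMD ["npm", "start"]
```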

imdevin567 commented Feb 8, 2016

I was having this issue with the official node images from Docker, so I switched to an image based off of Alpine Linux. Amazingly, my npm installs are now at least 6x faster than what they used to be. The major difference between the two is logging: the Alpine-based image only logs warnings, whereas the official image has verbose logging. This may be related to what @blackjackcf mentioned above.

In any case, I would recommend turning off logging and seeing if that speeds up the installs.
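A sketch combining the two suggestions, an Alpine-based Node image plus warnings-only logging (the image tag is an assumption; any Alpine-based Node image of that era should behave similarly):

```dockerfile
FROM mhart/alpine-node:4
COPY package.json /app/
WORKDIR /app
# Only emit warnings and errors instead of npm's default log verbosity.
RUN npm install --loglevel=warn
```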

rauchg commented Feb 18, 2016

@imdevin567 I find the same thing. +1 on this tip.

mariuslundgard commented May 8, 2016

Was there really a reason to close this issue? If so, what was it?

Andyccs commented May 8, 2016

@mariuslundgard I am not in charge of this issue, but I have been following it for a while now. I have no idea why, but after switching to Node.js 5 or 6, I don't have this issue anymore.

hpurmann commented May 13, 2016

@mariuslundgard It's not closed. The above notice is just a reference to another issue which is closed.

Dev-Dipesh commented May 24, 2016

This is what I'm doing -

FROM node:4.4.4

# use changes to package.json to force Docker not to use the cache
# when we change our application's nodejs dependencies:
ADD package.json /tmp/package.json

RUN cd /tmp && npm i --production --ignore-scripts --unsafe-perm
RUN cd /tmp && npm i --only=dev --ignore-scripts --unsafe-perm
RUN npm i -g pm2 nodemon gulp gulp-cli

RUN mkdir -p /opt/app && cp -a /tmp/node_modules /opt/app/

# From here we load our application's code in, therefore the previous docker
# "layer" that's been cached will be used if possible
WORKDIR /opt/app

ADD . /opt/app

RUN chmod +x /opt/app/configure-node.sh

EXPOSE 3000

CMD ["/opt/app/configure-node.sh"]

Separating the installation of dev & prod dependencies helps me avoid out-of-space problems; otherwise I have to create swap and do a bunch of fixes.

--only=dev asks npm not to install all devDependencies recursively - #5554

Using /tmp helps in getting the benefit of Docker layers and caching - http://bitjudo.com/blog/2014/03/13/building-efficient-dockerfiles-node-dot-js/

NOTE: I haven't set an ENV variable in this file yet; you can set one and install only the corresponding dependencies accordingly.

arypurnomoz commented Jun 13, 2016

How about using the --cache-min option?
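For context, --cache-min tells npm (v2/v3) to treat packages already in its cache as fresh if they are younger than the given number of seconds, skipping the registry check; a hedged sketch (base image tag is an assumption):

```dockerfile
FROM node:4.2.6
COPY package.json /app/
WORKDIR /app
# Anything in npm's cache younger than ~115 days (9999999 seconds)
# is used as-is, avoiding a registry round-trip per package.
RUN npm install --cache-min=9999999
```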

chippawah commented Jun 17, 2016

Is there really not an official solution nearly a year after this issue was opened? It seems like there's about to be a mass flood to Docker, and a really poor dev experience if you want to use Node.

olalonde commented Jun 18, 2016

@chippawah as the guy who opened this issue, the problem is largely "solved" for me... I run my node code locally and only use docker for things like databases, etc. It would be cool if I could run my node code in docker as well but yeah, it still is a poor experience. I use docker + node in production but the build is handled by http://deis.io/

puppybits commented Jun 20, 2016

I run a pre-build script to keep docker's cache working as expected. This will skip npm install until the dependencies change. My build times for docker are also < 10s (unless I add/remove a dependency).

package.json

"build": "node -e \"require('fs').writeFileSync('.deps.json',JSON.stringify({dependencies:require('./package.json').dependencies}))\" && docker build -t my/image .",

Dockerfile

# .deps.json only has the dependencies from the package.json.
COPY .deps.json /app/package.json
# Docker will use the cache until the dependencies in the package.json have changed
RUN npm install

# this next line will bust the cache when anything changes but the copies are quick
COPY package.json /app/package.json
COPY src /app/src
CMD npm start

If you naively copy the entire package.json, then whenever you increment your project's version you will bust Docker's cache. Copying only the dependencies keeps Docker's cache intact until there is an actual dependency change that requires busting it.

@krukid

This comment has been minimized.

Show comment
Hide comment
@krukid

krukid Jul 21, 2016

[EDIT] This is essentially the same thing @puppybits wrote, I just got freaked out by the "build" command and didn't read till the end :D So yeah, leaving this here because I spent 5 minutes writing it. Sorry.

The simplest thing you can do to speed up your builds is double-cache your npm modules, i.e.:

# ...

# install from shadow .package.json (notice the dot)
ADD .package.json /tmp/package.json
RUN cd /tmp && npm install

# install from actual package.json
ADD package.json /tmp/package.json
RUN cd /tmp && npm install

# ...

Then, whenever your build gets slow again, just cp package.json .package.json and rebuild the whole thing once.

[PS] I had about 10 minute builds before using this approach; now most of the time I have ~1 minute builds, because most of what I was doing to my package.json before building an image was bumping versions. Also, now I know people don't bump their version when changing dependencies. Or their cache doesn't work.

@rlabrecque

rlabrecque Aug 26, 2016

So my workaround for this is to do it all at runtime rather than build time. I actually ditched my Dockerfile completely and went with docker-compose, using a huge command: that does everything my Dockerfile did. Startup is a little slower, obviously, and I probably can't use run anymore, but this works well for my needs.
Definitely not for everyone, though.
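A minimal sketch of that runtime-install approach (the service name and paths are hypothetical, not from the original comment):

```yaml
# docker-compose.yml: no Dockerfile; install and start happen at container start
version: "2"
services:
  app:
    image: node:6
    working_dir: /usr/src/app
    volumes:
      - .:/usr/src/app
    command: sh -c "npm install && npm start"
```

Because node_modules lives on the mounted volume, repeated starts can reuse previously installed packages instead of rebuilding an image layer.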

@steinbachr

steinbachr Sep 8, 2016

Also had this problem. For me, updating the image to use the most recent version of Node (v6.x) yielded an immediate speedup.

@agilgur5

agilgur5 Sep 21, 2016

In case anyone hasn't heard of it, Rockerfiles solve this problem with their MOUNT directive, which lets you mount a volume during your Docker build (in this case, your previous node_modules volume). It has a ton of other useful directives to deal with common problems in Docker as well.

@sanmai-NL

sanmai-NL Sep 30, 2016

You can also use Packer, which builds images inside a container, so you can mount volumes during the build as well.

@pallickal

pallickal Oct 14, 2016

I've managed a solution using Rocker and Yarn. Rocker allows mounting files and directories temporarily, just during build, and Yarn is a new npm alternative that has a cache directory and an offline mode.

The trick is using Rocker to mount a persistent yarn-cache sub-directory from the project. yarn.lock also needs to be mounted from the project's root in order for offline caching to work.

My goal was to be able to work totally offline, so I've also cached a download of Yarn in my project folder rather than relying on accessing the unofficial Debian repository through apt-get in the Rockerfile.

First, in my package.json scripts section -

  "scripts": {
    "update-dev-cache": "mkdir -p dev-cache  && touch yarn.lock && wget -O dev-cache/yarn-latest.tar.gz https://yarnpkg.com/latest.tar.gz && mkdir -p dev-cache/yarn-latest && tar zvxf dev-cache/yarn-latest.tar.gz -C dev-cache/yarn-latest --strip-components=1 && test -e dev-cache/yarn-latest/bin/yarn",
    "clear-dev-cache" : "sudo rm -rf dev-cache",
    "prebuild": "[ -d dev-cache ] && [ -w dev-cache ] || npm run update-dev-cache",
    "build": "rocker build  --build-arg YARN_OFFLINE=$DOCKER_YARN_OFFLINE"
  }

The less obvious parts of update-dev-cache are -

  1. It creates a blank yarn.lock if none exists, so that Rocker does not create a directory called yarn.lock during mount.
  2. It downloads the latest Yarn tarball into the dev-cache folder, and unpacks it into a consistent folder name so it can be copied during build.

The prebuild script tries to initialize the dev-cache using update-dev-cache unless the directory exists and is writable. The update-dev-cache script will fail if the directory is not writable, making the source of the error obvious.

A non-writable dev-cache can occur if rocker build is run without an existing dev-cache directory, as the MOUNT instruction will cause one to be created, only owned by root. The prebuild script prevents this by ensuring the dev-cache is initialized before attempting to build.

The clear-dev-cache script requires sudo because, although dev-cache and dev-cache/yarn-latest will be owned by the current user, dev-cache/yarn-cache and everything inside it will be owned by root, a consequence of being mounted and written to by the temporary container used to build the image.

Here are the relevant parts of my Rockerfile -

FROM node:latest

ARG YARN_OFFLINE

# Create app directory
RUN mkdir -p /usr/src/my-app-name
WORKDIR /usr/src/my-app-name

# Install yarn from cached copy
COPY ./dev-cache/yarn-latest/ /opt/yarn-latest/
RUN ln -s /opt/yarn-latest/bin/yarn /bin/

# Mount app dependency cache
# yarn.lock must already exist as a file, it can be empty
MOUNT ./yarn.lock:/usr/src/my-app-name/yarn.lock
MOUNT ./dev-cache/yarn-cache:/root/.yarn-cache

# Install app dependencies
COPY package.json ./package.json
RUN yarn ${YARN_OFFLINE:+--offline}

To work offline I run export DOCKER_YARN_OFFLINE=true in the shell before running the build script in my package.json. ${YARN_OFFLINE:+--offline} will only inject the --offline option to yarn if the YARN_OFFLINE build-arg is set to a non-empty value. The DOCKER_YARN_OFFLINE environment variable is passed in as this build-arg, in the build script. This way I don't have to edit package.json every time I want to work offline.
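The ${YARN_OFFLINE:+--offline} expansion can be checked in isolation; a quick shell demonstration of the :+ operator:

```shell
# ${VAR:+word} expands to "word" only when VAR is set and non-empty;
# otherwise it expands to nothing, so no flag is passed at all.
unset YARN_OFFLINE
printf 'unset: [%s]\n' "${YARN_OFFLINE:+--offline}"
YARN_OFFLINE=true
printf 'set:   [%s]\n' "${YARN_OFFLINE:+--offline}"
```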

There is also a --prefer-offline option for yarn that you may want to introduce as a separate build-arg. I don't know how it behaves yet, and documentation is sparse. Ideally it should only download when a package changes, and not introduce much delay when offline. If it works well enough, you could just use this option on every call to yarn and dispense with passing environment variables.

One caveat is that Yarn is still very new and may have bugs. For example, I still have to use npm run because of a bug that prevents a few of my scripts from running through Yarn. A fix has been committed but has not made it into an official release as of this writing. The plus side is that you can keep using npm where needed without much hassle, since Yarn takes its settings from the standard npm config files.

@olalonde

olalonde Oct 21, 2016

I just came up with this trick to speed up npm installs in docker (also plays well with CI/CD servers that don't cache image layers):

Dockerfile

# vim:set ft=dockerfile:
FROM node:6

# Create app directory
RUN mkdir -p /app
WORKDIR /app

# Create .npmrc (required for private registry)
RUN echo '//registry.npmjs.org/:_authToken=${NPM_TOKEN}' > .npmrc

# Re-use npm cache to speedup installs

# Usually, "npm install" only runs when package.json changes due to cached
# layers. However, with .node_modules.tar.gz the cache will be busted twice
# after package.json changes (instead of just once) because
# the next docker build will see a different .node_modules.tar.gz

# This still speeds things up in general, because npm install
# from scratch is slow. It also definitely speeds things up
# in environments that don't benefit from cached image layers,
# like CircleCI
COPY .node_modules.tar.gz /app/
# Only extract if file not empty
RUN test -s .node_modules.tar.gz \
  && tar xzf .node_modules.tar.gz -C /app \
  && echo "Extracted .node_modules.tar.gz to /app/node_modules" \
  || true

## Install app dependencies
COPY package.json /app/
ARG NPM_TOKEN
# make sure to remove unused packages in node_modules (in case we use a cache)
RUN npm prune
RUN npm install

# Bundle app source
COPY . /app

# Build
RUN npm run build

# If we don't remove this, npm start will fail if NPM_TOKEN is not set
# at run time
RUN rm .npmrc

EXPOSE 5000

CMD [ "npm", "start" ]

Basically, the Dockerfile copies and extracts .node_modules.tar.gz which (if it exists) should be the node_modules directory from a previous docker build. And I automatically generate .node_modules.tar.gz when I build my images:

.PHONY: docker-build
# IMAGE := someimage/sometag
docker-build:
    # create empty .node_modules.tar.gz if it does not exist; otherwise,
    # COPY will fail in the Dockerfile
    ls .node_modules.tar.gz || touch .node_modules.tar.gz
    docker build --pull \
        --build-arg NPM_TOKEN=${NPM_TOKEN} \
        -t ${IMAGE} .
    make docker-cache-node-modules

.PHONY: docker-cache-node-modules
# Extracts node_modules from the image so it is cached in subsequent builds
docker-cache-node-modules:
    rm -rf .node_modules.tar.gz
    docker create ${IMAGE} > .docker-id
    # -n flag required to make resulting file deterministic
    docker cp `cat .docker-id`:/app/node_modules - | gzip -n > .node_modules.tar.gz
    docker rm -v `cat .docker-id`
    rm .docker-id

I can share the whole Makefile if anyone's interested
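The `-n` flag noted in the Makefile above matters because gzip normally records the input file's name and modification time in the output header, so re-archiving an unchanged tree can still produce different bytes. A small demonstration (file names are illustrative):

```shell
# With -n the name/timestamp header fields are omitted, so compressing the
# same content twice, even after an mtime change, yields identical output.
cd "$(mktemp -d)"
echo "hello" > a.txt
gzip -cn a.txt > one.gz
touch a.txt                  # change the mtime, as a rebuild would
gzip -cn a.txt > two.gz
cmp -s one.gz two.gz && echo "deterministic"
```

That determinism is what keeps Docker's COPY layer cache valid across rebuilds of an unchanged node_modules.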

@cachemoney

cachemoney Jan 8, 2017

@olalonde +1 for the whole Makefile

@olalonde

olalonde Jan 9, 2017

# This is used for CI/CD deployment, not needed locally
# NPM_TOKEN must be set
# AWS_ACCOUNT_ID or PRIVATE_REGISTRY must be set
SHELL = /bin/bash

CIRCLE_PROJECT_REPONAME ?= $(shell basename "${PWD}")
IMAGE_PREFIX ?=
MUTABLE_VERSION ?= canary
VERSION ?= git-$(shell git rev-parse --short HEAD)
PRIVATE_REGISTRY ?= ${AWS_ACCOUNT_ID}.dkr.ecr.us-west-1.amazonaws.com
DOCKER_BUILD_RM ?= true

SHORT_NAME:= ${CIRCLE_PROJECT_REPONAME}
IMAGE := ${PRIVATE_REGISTRY}/${IMAGE_PREFIX}${SHORT_NAME}:${VERSION}
MUTABLE_IMAGE := ${PRIVATE_REGISTRY}/${IMAGE_PREFIX}${SHORT_NAME}:${MUTABLE_VERSION}

info:
	@echo "Build tag:       ${VERSION}"
	@echo "Immutable image: ${IMAGE}"
	@echo "Mutable image:   ${MUTABLE_IMAGE}"

build: docker-build
push: docker-push
release: docker-push-mutable

.PHONY: run
run:
	docker run -d -p 5000:5000 ${IMAGE}

.PHONY: run-interactive
run-interactive:
	docker run -it ${IMAGE} bash

.PHONY: run-test
run-test:
	npm run docker:bootstrap
	docker-compose run test

.PHONY: test
test: docker-build run-test

.PHONY: docker-build
docker-build:
	# create empty .node_modules.tar.gz if not exists, otherwise,
	# COPY will fail in Dockerfile
	ls .node_modules.tar.gz || touch .node_modules.tar.gz
	docker build --pull \
		--rm=${DOCKER_BUILD_RM} \
		--build-arg NPM_TOKEN=${NPM_TOKEN} \
		-t ${IMAGE} .
	rm .node_modules.tar.gz
	docker tag ${IMAGE} ${SHORT_NAME}:test

.PHONY: docker-cache-node-modules
# Extracts node_modules from the image so it is cached in subsequent builds
docker-cache-node-modules:
	rm .node_modules.tar.gz || true
	docker create ${IMAGE} > .docker-id
	# -n flag required to make resulting file deterministic
	docker cp `cat .docker-id`:/app/node_modules - \
		| gzip -n > .node_modules.tar.gz.part \
		&& mv .node_modules.tar.gz.part .node_modules.tar.gz
	docker rm -v `cat .docker-id` || true
	rm .docker-id

.PHONY: docker-push
docker-push:
	docker push ${IMAGE}

.PHONY: docker-push-mutable
docker-push-mutable:
	docker tag ${IMAGE} ${MUTABLE_IMAGE}
	docker push ${MUTABLE_IMAGE}
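The `.part`-then-`mv` step in docker-cache-node-modules writes the archive atomically; the same pattern can be sketched outside Docker (paths and contents here are illustrative):

```shell
# Write to a temp name first, then rename: a failed or interrupted gzip can
# only leave a stale .part file behind, never a truncated .node_modules.tar.gz.
cd "$(mktemp -d)"
mkdir node_modules && echo "pkg" > node_modules/index.js
tar -cf - node_modules | gzip -n > .node_modules.tar.gz.part \
  && mv .node_modules.tar.gz.part .node_modules.tar.gz
ls .node_modules.tar.gz
```

Since mv within a filesystem is atomic, any consumer of .node_modules.tar.gz sees either the old complete archive or the new one, never a partial write.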

@ORESoftware

ORESoftware May 4, 2017

A symbolic link is a smart solution; NODE_PATH might also work.

@npm-robot

npm-robot Jun 17, 2017

We're closing this support issue as it has gone three days without activity. The npm CLI team itself does not provide support via this issue tracker, but we are happy when users help each other here. In our experience once a support issue goes dormant it's unlikely to get further activity. If you're still having problems, you may be better served by joining package.community and asking your question there.

For more information about our new issue aging policies and why we've instituted them please see our blog post.

@npm-robot npm-robot closed this Jun 17, 2017
