Specify dependencies using relative/absolute paths #1558

felixge opened this Issue Oct 17, 2011 · 29 comments



It would be nice if npm allowed specifying a relative / absolute path for dependencies.

For example:

  "dependencies": {
    "my-module": "../my-module"

Given a few pointers, I would love to submit a patch myself this time if you're interested in having this added.

npm member

I'm unsure what this gives you that npm link doesn't?

npm member

@isaacs sudo is required to use npm link (if you don't have +w on your global node_modules).


I'm unsure what this gives you that npm link doesn't?

I'd really like to use a private git repository for this, but for corporate bullshit reasons I can't. npm link would work, but it would add another step to deployment for something that seems like it should be reasonable to do within package.json.

I can understand if you don't want to support this, but I feel like it would make sense to have relative/absolute paths be valid package location identifiers.

npm member

Right, but you can always have a global location in your home dir or something, or use sudo. If you set it up to install a local folder, then it means a) it won't work on any other machine, and b) it will just install whatever's in that folder right now.

Why not just have the name:version in your dependencies array, and npm install ../module to satisfy it?

npm member

@felixge Could you just bundle it with the thing that depends on it?

It wouldn't be hard to add support for it. Right now, the logic is:

for thing in deps
  if (deps[thing] is a semver range) add thing@{deps[thing]}
  else if (deps[thing] is a url) add deps[thing]

So, the logic would change to:

for thing in deps
  if (deps[thing] is a semver range) add thing@{deps[thing]}
  else add deps[thing]

since install already supports tarballs and folders.
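The pseudocode above might look something like this as plain JavaScript (a sketch of the idea, not npm's actual source; `isSemverRange` is a crude stand-in for npm's real semver parsing):

```javascript
// Sketch of the proposed dependency dispatch, not npm's actual code.
// isSemverRange is a hypothetical stand-in for npm's real semver parser.
function isSemverRange(spec) {
  // crude check: semver ranges start with a digit or a range operator
  return /^[\d~^<>=*xX]/.test(spec);
}

function resolveDeps(deps) {
  var targets = [];
  for (var thing in deps) {
    if (isSemverRange(deps[thing])) {
      targets.push(thing + '@' + deps[thing]); // name@range -> fetch from registry
    } else {
      targets.push(deps[thing]); // url, tarball, or folder -> install directly
    }
  }
  return targets;
}

console.log(resolveDeps({ express: '~3.0.0', 'my-module': '../my-module' }));
// -> [ 'express@~3.0.0', '../my-module' ]
```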

However, this is not supported on purpose, because of the fact that it would mean the package can't be installed on other machines, which opens up a whole pile of annoying deploy-time breakages.

Since you'd have to get the dependency on the machine first to make this work anyway, why not just put it there in the node_modules folder yourself, or submodule it, or install it and package it up as a bundledDependency, or install it globally and npm link it in?

npm member

If we were to add this (ie, if there's a use case that those other approaches actually don't support for whatever reason), then I'd like to maybe have lib/utils/read-json.js detect folder dependencies, and force private: true in those cases, so that we don't end up with locally-dependent packages getting published.


However, this is not supported on purpose, because of the fact that it would mean the package can't be installed on other machines, which opens up a whole pile of annoying deploy-time breakages.

People can already write environment-dependent package.json files by specifying wrong urls, private git repos, or forgetting to include something. I don't think relative / absolute paths will make this better or worse. So while forcing private: true would be no issue in my case, I don't quite see the point.

Since you'd have to get the dependency on the machine first to make this work anyway, why not just put it there in the node_modules folder yourself, or submodule it, or install it and package it up as a bundledDependency, or install it globally and npm link it in?

Well, I was planning to just put the package folder inside the same git repository the company is already using. Using git submodules is (besides the pain, oh the pain!) not an option as getting another repository would require physical paperwork and weeks of waiting time (I kid you not : /).

Anyway ... there are certainly workarounds available to me, so this is in no way critical. I was just surprised that relative/absolute paths are not working considering that they do work for npm install on the command line.


Ditto @felixge on his most recent comment...minus the physical paperwork, whoa.

Regardless of whether you use submodules or folders, the core of the problem I think is requiring sudo for a local install. For example, we use a project structure something like this:

  • ext
    • my-helper came from submodule at git://internalserver/my-helper.git
      • my-helper.js
      • package.json
  • my-app.js
  • package.json contains preinstall: "npm link ext/*"

It's pretty awesome to be able to just pull down the repo and submodule and run sudo npm install, but this is a local installation so it seems kind of strange to run it with sudo. The alternative for this example is using preinstall: "npm install ext/*", but then patching dependencies like "my-helper" requires a trip to the terminal to run npm uninstall my-helper && npm install.

My current workaround for that example is overriding the global prefix option in the preinstall script: preinstall: "npm link ext/* --prefix=node_modules". For now this seems to work okay, and this missing feature pretty much does not detract from my overall enjoyment of npm anyway :P But it would be pretty sweet to see npm support it more directly.
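In package.json form, that workaround looks roughly like this (a sketch; the ext/* layout is from the example above):

```json
{
  "scripts": {
    "preinstall": "npm link ext/* --prefix=node_modules"
  }
}
```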


@isaacs in our company we can not install git on production servers. The only thing we can do is prepare an independent archive (tar.gz) that can be exploded on production servers. gcc is not installed on production, so every node_modules folder has to be present and compiled in the archive.

Our packages are private and have many dependencies on other private packages, and it would be useful to describe those dependencies with relative paths.

why not just put it there in the node_modules folder yourself,

This is what we are doing. In the package.json we add a custom key, myCompanyDependencies, and a build script iterates over those dependencies to npm install them into node_modules.

or submodule it,

We can't use git

or install it and package it up as a bundledDependency,

This is what we did (item 1)

or install it globally and npm link it in?

The operations team does not want global installs for applications, and applications do not have root access.


do "npm link ext/* --prefix=node_modules"

It's great for development but for production our packages have to be independent.

So we have something working with a custom build script, but it would be easier for us to use relative paths in the package.json dependencies. It's not a problem for us to force private: true.

I am willing to do the pull-request if you are interested in this feature. (BTW you gave all the solutions to make it :) )

npm member

You don't need git on production servers in order to get benefit from checking your modules into git.

npm install blah
git add node_modules/blah
git commit -m "Update blah to version watever"
# dev some more...

# some time later...
# make a release
git tag -s v1.2.3
git archive --format=tar --prefix=my-app-v1.2.3/ v1.2.3 | gzip > my-app-v1.2.3.tar.gz
scp my-app-v1.2.3.tar.gz production-box.com:
ssh production-box.com "./deploy.sh my-app-v1.2.3.tar.gz" #or however you do it

This way, you have git's byte-level tracking of dep sources, and npm's convenience of fetching and managing deps, but without needing git or npm to be on the production machine.

npm member

You could also just drop your stuff in a node_modules folder somewhere above where your app is running, if you want to update deps piecemeal on an integration/staging machine.

The node module system is incredibly flexible. There's very little you can't do with it.


Basically what we do for production is:

cd $app
for service in `ls -d service-*`; do               # we have 10 services
    cd $app/$service
    rm -fr node_modules
    npm install

    # We read in package.json the private key ourCompanyDependencies
    for privateLib in `listPrivateDeps $app/$service/package.json`; do   # we have a total of 20 libs
        cd $app/$privateLib
        npm install

        # manually cp the lib into the node_modules folder of the service
        cp -r $app/$privateLib $app/$service/node_modules
    done

    tar cvzf $app/$service-`date -I`.tar.gz $app/$service
    scp ...
done


For dev we use npm link, it's easier.

But with relative/absolute implemented in npm, it would become :

cd $app
for service in `ls -d service-*`; do               # we have 10 services
    cd $app/$service
    rm -fr node_modules
    npm install

    tar cvzf $app/$service-`date -I`.tar.gz $app/$service
    scp ...
done

And I wrote you a simplified version of our real script :)

We can't use git archive because we want the node_modules to be present in the archive.


Manually adding libs to node_modules provokes errors when using shrinkwrap.

npm shrinkwrap
npm ERR! Error: Problems were encountered
npm ERR! Please correct and try again.
npm ERR! message extraneous: service-utils@0.0.0 /home/user/product/service-processing/node_modules/service-utils
npm ERR! message extraneous: service-errors@0.0.0 /home/user/product/service-processing/node_modules/service-errors
npm ERR! message extraneous: service-object-cleaner@0.1.0 /home/user/product/service-processing/node_modules/service-object-cleaner
npm ERR! message extraneous: service-models@1.0.0 /home/user/product/service-processing/node_modules/service-models
npm ERR! message extraneous: service-conf@0.0.0 /home/user/product/service-processing/node_modules/service-conf
npm ERR! message extraneous: service-logger@0.0.0 /home/user/product/service-processing/node_modules/service-logger

All the private libs that I manually copied in node_module are rejected.

npm member

Shrinkwrap only works on modules that can be installed from the registry. It doesn't bundle your deps for you automatically.

Do this:

  1. npm install /path/to/whatever -S to save it as a dependency
  2. In your package.json, add "bundleDependencies":["all", "the", "names", "of", "private", "things"] to bundle them in your package.tgz.

Then shrinkwrap will work.
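Concretely, after those two steps the package.json would contain something like this (hypothetical names taken from the error output above):

```json
{
  "name": "service-processing",
  "version": "1.0.0",
  "dependencies": {
    "service-utils": "0.0.0"
  },
  "bundleDependencies": ["service-utils"]
}
```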

@isaacs isaacs closed this May 1, 2012
@isaacs isaacs reopened this May 1, 2012

Ok thanks


I'm late to the party here, but saw this labeled isaacs-says-yes -- is that indeed the case? =D

I'd love to see this too, though I can share what we do to work around this today -- we just check symlinks into node_modules. E.g.:

- shared
- component1 (e.g. website)
  - node_modules/shared -> ../../shared
- component2 (e.g. worker)
  - node_modules/shared -> ../../shared

Then the code in both components can just require('shared').

I too can't npm link, because the server doesn't have root access. Platforms like Heroku also don't support deploying tarballs. But checking in the symlink lets me get the best of both worlds in this case -- "live" but also bundled dependencies.

But it'd be nicer if I could include the dependency in package.json (can I still, e.g. w/ bundledDependencies?). And it'd also be a bit more convenient if I could just rm -rf node_modules without having to then undelete the symlinks too.

At the end of the day, this isn't critical -- there are workarounds -- but it'd certainly be convenient. Thanks for the consideration!

tj commented May 11, 2012

Sounds similar to what I want in #2442. Without require.paths (or forcing NODE_PATH on people) we can't have private modules that don't live in a registry. Today I forgot to un-gitignore one in ./node_modules and did a git clean -fd, completely removing it, haha. So I really don't want these to live in ./node_modules, ideally.


I would also love a solution that doesn't require running an internal git server, internal npm repository, or adding tarball creation as a deployment step. We have a project layout for our internal code like this:


Where app1/2/3 all need to require('baselibrary'). Right now we're using relative paths for the require statements, i.e. require('../baselibrary/index'), but it's a pain to make sure each statement has the correct relative path. Being able to require('baselibrary') like a normal npm module would be a big win.


For posterity, I resolve this using the following technique:

Our directory structure:

/server/services/foo #separate processes
/server/lib #shared server code

The Problem:

Our code is riddled with very brittle ugly relative requires, like the following from a service's lib directory (/server/services/foo/lib/bar.js):

var UserModel = require('../../../lib/model/user')

Our goal:

  • Mimic either the functionality from this issue or a local equivalent to NODE_PATH (per project, using node_modules, non-global)
  • Don't require explicitly linking each module
  • Don't checkin node_modules: http://blog.nodejs.org/2012/02/27/managing-node-js-dependencies-with-shrinkwrap/
  • Don't require separate git (or npm) repos
  • No tarballs
  • Run server not as root
  • Don't require root to deploy (no sudo npm link, sudo npm update, etc...)
  • KISS: Avoid unnecessary deploy steps (i.e., tarballs). Our current deploy involves git pull & a recursive npm update.

Steps to success:

  1. In unix/linux, create a new group.
  2. Add server user to that group.
  3. Chown /usr/local/lib/node_modules recursively to that group.
  4. Run npm init or create a minimal package.json in the directories you want to add shortcuts for.
  5. Add npm link calls to root package.json's postinstall and postupdate scripts for all directory shortcuts:
// /server/package.json
"scripts": {
  "postinstall": "npm link ./lib/; npm link ../shared/;",
  "postupdate": "npm link ./lib/; npm link ../shared/;"
}

With those links in place, the brittle relative requires become:

var UserModel = require('lib/model/user'),
    SOME_CONST = require('shared/constant/someconst')
npm member

Fiddling with the permissions like that seems like a complicated and less sane version of npm set prefix ~/npm. Also, you should probably check out npm install --link.


@nathan7 Thanks for tip. We should probably be using prefix instead of chown'ing the install directory to avoid sudo.

Also, thanks for the tip on npm install --link. I think there are some issues that would prevent it from being the best solution in this case, but I was unaware of its existence and it will come in handy for some personal projects. I'm not so sure npm install --link accomplishes what we want, though it's hard to say, since the documentation is sparse and a little vague.


The --link argument will cause npm to link global installs into the local space in some cases.

It seems that npm install --link would link all available dependencies, which is undesirable, as we'd only really want the relative/local project-specific ones. At any rate, when it's all said and done, this is not so much about whether or not it can be done, but what the right way to do it is. Relative-require soup gets the job done, but is inelegant and ugly. Manually linking individual packages is a headache, even after you address the permissions issue.

One option listed here, a preinstall hook with npm link foo/* solves the individual manual linking issue, but replaces it with another, potential name collisions across multiple directories. For instance, in our case, we may have a /server/lib/foo and a /shared/lib/foo. While this could be addressed by renaming one, I find our solution of linking to parent directories to be more descriptive and less error prone, at least in this regard.

Ideally there would be a more elegant/declarative style to address internal dependencies other than manual management, globals or relative require soup.


FWIW... The way I solved this need was to add a links section to my package.json:

"links": [

Then, since I'm already using Cake, I changed my task for updating my module dependencies to be:

task 'update-modules', 'update dependencies from npm', ->
    shell [
        'rm -rf node_modules'
        'npm update'
    ]
    fs = require 'fs'
    links = JSON.parse(fs.readFileSync('package.json', {encoding:'utf-8'})).links
    for link in links
        shell "npm link #{link}"

(shell is a little function which runs shell scripts and displays their output)

The extra links section is ignored by npm, so it's no problem for running npm by itself.


I don't mind if private: true is forced.

Today we allow for a url to a tarball, and we allow publishing to npm even if that url is behind a firewall or 404's. So allowing local file urls won't be any worse than our current behavior.

This will help us with tests that require npm installing built-on-the-fly packages. Today we have to tarball those packages and serve them via a local connect http server just because npm won't allow us to give it a relative path in the package.json.


Why not give require support for paths like '//somewhere/from/root'? The "//" would mean from root (root being the directory where the process was started, the initial module). Then anywhere in an app, you could have the same base paths.

Alternative paths for these "basepaths", could be ".../lib/someModule".

This way, I could have a lib folder at the root directory of my project, and I could use require('//lib/mylib') from anywhere to get the same module.


So, good news: @dylang went ahead and implemented this feature, more or less, in #5629. Because it's new, it's still a little rough around the edges, but if you read the docs (npm help package.json, the section on local paths under dependencies – one note that isn't in there is that --save works with local paths, and prefixes them as file: URLs), it should be relatively straightforward to start playing with them. Any issues found with them should be treated as new bugs and filed as such.

Thanks to everyone for their input and (three years' worth of) patience!

@othiym23 othiym23 closed this Sep 23, 2014

Please provide an example how and where I can specify local dependencies:

this doesn't work for instance:

"devDependencies": {
  "foobar": "."
}

And this installs the latest version via npmjs.org:

"devDependencies": {
  "foobar": "./."
}

using npm 2.1.3

@othiym23 othiym23 added bug feature-request and removed bug labels Oct 16, 2014

@timaschew from the docs:

Local Paths

As of version 2.0.0 you can provide a path to a local directory that contains a package. Local paths can be in the form:

    ../foo/bar
    ~/foo/bar
    ./foo/bar
    /foo/bar
This feature is helpful for local offline development and creating tests that require npm installing where you don't want to hit an external server, but should not be used when publishing packages to the public registry.

There is one additional restriction, in that you should only use those paths at the prompt: npm install --save /foo/bar. In package.json, you should use the file: prefix instead: file:../../foo/bar. A patch to the docs clarifying this would be welcome (and in the absence of a patch, one of us will probably get around to it eventually).
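For example, after running npm install --save /foo/bar at the prompt (hypothetical path and name), the saved entry in package.json would look like:

```json
{
  "dependencies": {
    "bar": "file:../../foo/bar"
  }
}
```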


Not sure, if this topic is exactly what I need, but here is my problem:
My project directory structure looks like this:


Now the problem is that if I do npm install in the project directory, it will install node_modules into project, and I need it to install into project/server. How can I get this done, preferably only by editing package.json, and in a way that works on Windows and Linux as well?

npm member

@hdf You're describing an unrelated question. In the future, or if this reply isn't enough to answer the question, please go ahead and open a new issue.

The package.json file must always be parallel to the place where npm installs things. We're not going to change that. The good news is that if you let it install things into project/node_modules then the code in server/index.js will be able to load those things just fine.

If the issue is that project/server and project/www need different dependencies, then you actually have two modules trying to share a single home, and that's just a bad scene. If that's the case, I'd recommend splitting the thing into three modules: server, www, and project, where project lists server and www as dependencies. (You may need to workshop these names a bit, or use namespaced modules like @hdf/www and @hdf/server.)

If I'm misunderstanding your need, please do create a new issue by clicking the green "New Issue" button in the upper-right corner of this page. Thanks!

@isaacs isaacs locked and limited conversation to collaborators Apr 24, 2015