This repository has been archived by the owner. It is now read-only.

feature request: force install from cache. #2568

Closed
dominictarr opened this issue Jun 26, 2012 · 81 comments

Comments


@dominictarr dominictarr commented Jun 26, 2012

npm install module --cache

would behave as if there were no network connection.

this would be very useful when the connection is slow
and you know the modules you are installing well.

it would also create the possibility for a background process that tails npm
and keeps your cache up to date, making installs super fast!


@isaacs isaacs commented Jun 27, 2012

Hm. This used to work as npm install --no-registry, but apparently that's broken now.


@isaacs isaacs commented Jun 27, 2012

Oh, you can also set the cache-min config value to something greater than 0; then any cached entry younger than that number of seconds will never be re-fetched.

npm install --cache-min 999999 would effectively be what you're asking for.


@dominictarr dominictarr commented Jun 27, 2012

thanks!


@robashton robashton commented Jul 11, 2012

Is no-registry going to be fixed? cache-min still appears to ping the registry to ask for timestamps, which is undesirable.


@isaacs isaacs commented Jul 11, 2012

Ack, indeed, this was broken with the npm-registry-client refactor. Fixed on 81fa33a.


@robashton robashton commented Jul 11, 2012

Cool thanks :-)


@isaacs isaacs commented Jul 11, 2012

Also, --no-registry will be fixed on the next npm version (included in node 0.8.3 and 0.9.0)


@felixrabe felixrabe commented Jul 26, 2013

There is no mention of --no-registry in npm help install.


@luk- luk- commented Nov 13, 2013

This is still broken.


@digitalmaster digitalmaster commented Dec 11, 2013

Any word on the status of this flag?


@dominictarr dominictarr commented Dec 11, 2013

by the way, this works with npmd if you use npmd install foo --offline


@grahamlyons grahamlyons commented Dec 12, 2013

I've seen this fail using a command like this:

npm install --no-registry path/to/package.tgz

But this succeeds:

npm install --no-registry --cache path/to/cache/dir

That's perfect for my use case but I don't know if it's how it's intended to work.

This was with version 1.2.18.


@dominictarr dominictarr commented Dec 13, 2013

@grahamlyons you are using quite an old version of npm. yes, installing from a file path needs to point to a directory.


@mikermcneil mikermcneil commented Dec 16, 2013

@grahamlyons in npm v1.3.17, the following fails now:
npm install ejs --no-registry --cache /Users/mike/.npm

As does:
npm install ejs --no-registry
https://github.com/isaacs/npm/issues/3691#issuecomment-30626058


@domenic domenic commented Dec 16, 2013

This is in my notes from JSConf EU as something we are worse at than bower. We should make it more reliable to install while offline. But, probably as part of the big ol' cache refactor.


@mikermcneil mikermcneil commented Dec 16, 2013

@domenic I'll help if I can :) (I'm trying to refactor sails new to use what npm already provides and avoid reinventing the wheel; just ran into this along the way. Checked out npmd and it looks really promising; can't add it to our deps though :\ )

@isaacs Btw-- cache-min works great in 1.3.17, both from the CLI and programmatically (it's a little slower than doing copies, but presumably that's because it's properly checking on descendant dependencies in the modules)

So, for our friends who might be googling this in the future:

Install npm dependencies from cache (i.e. offline, no connection)

Tested in npm v1.3.17

CLI:
npm install ejs sails-disk optimist grunt --cache-min 999999999

Programmatically:
var npm = require('npm');

var modules = ['ejs', 'grunt', 'sails-disk', 'optimist'];

npm.load({
    loglevel: 'silent',
    'cache-min': 999999999999
}, function(err) {
    if (err) throw err;

    console.time('npm install');
    npm.commands.install(modules, function(err, data) {
        if (err) throw err;
        console.timeEnd('npm install');
    });
});

@mikermcneil mikermcneil commented Dec 16, 2013

posting this here in case it helps anyone else

So I ran into one more problem: including npm as a dependency is pretty expensive in terms of weight and install time for your module. What would be great is to use the user's local npm, which is already installed (probably achievable through some sort of npm trickery I couldn't figure out), but for the short term, I put this together:

https://npmjs.org/package/enpeem

It's a very thin wrapper that uses require('child_process').exec to access the globally installed npm on the system. It tries running npm -v first to make sure something version-esque gets printed to stdout; if it doesn't, it fails with a somewhat helpful message explaining that npm isn't available and that the dependencies will need to be installed manually.

Of course, the jackpot would be if npm could be require()'d without having to npm install itself-- but in the meantime, this gets the job done.


@timoxley timoxley commented Jan 7, 2014

Would like this feature for npm search as well npm/npm-registry-client#35


@ruyadorno ruyadorno commented Jan 23, 2014

An --offline and/or --cache flag should definitely be a thing; these are much more descriptive and intuitive than the hacky --cache-min 9999999 workaround.


@majgis majgis commented Mar 14, 2014

In my case, setting cache-min had no effect. A locally installed karma was causing npm install --no-registry to fail. Installing karma globally fixed the problem.


@addyosmani addyosmani commented Apr 20, 2014

I've tried to advocate the use of --cache-min 9999999, but since it isn't trivial to commit to memory or use often, I still see value in a proper --offline flag being baked in. --no-registry is still broken AFAIK.


@aramk aramk commented Jun 19, 2014

It would save a lot of time if npm checked the cache before making so many network requests when installing libraries. Adding --offline would be nice.

zkat added a commit that referenced this issue Apr 20, 2017
Fixes: #2568
Fixes: #2649
Fixes: #3141
Fixes: #4042
Fixes: #4652
Fixes: #5357
Fixes: #5509
Fixes: #5622
Fixes: #5941

All fetching-related networking is now done through pacote, and
the old cache has been entirely replaced by a cacache-based one.

Features:

* npm now supports a variety of hash algorithms for tarball storage. On registries that support it, npm is able to use sha512sum for verification.

* An `integrity` field has been added to `npm-shrinkwrap.json`.

* Package integrity will be fully verified on both cache insert and extraction -- if npm installs something, it's going to be exactly what you downloaded, byte-for-byte, or it will fail.

* If `npm-shrinkwrap.json` is used, npm will bypass checking package manifests and go straight to the tarball, fetching it by content address if locally cached.

* Checksum integrity failures will now retry downloading on error, instead of failing on a single check.

* A new npm command, `npm cache verify`, can now be used to verify and garbage collect your local cache.

* npm now supports arbitrarily large tarball downloads: tarballs will no longer be loaded entirely into memory before extraction.

* packages whose names only differ in casing, and packages from different sources/registries/etc will now correctly be cached separately from each other.

* Some performance improvements.

* Improved fetch retry logic will try harder to download your packages.

BREAKING CHANGE: many shrinkwrap and cache-related things have changed.

* Previously-created caches will no longer be used. They will be left in place, but data will need to be re-cached. There is no facility for rebuilding a cache based on an existing one.

* `npm cache ls` has been removed for now

* `npm cache rm` now always removes the entire cache. There is no granular removal available for now.

* git dependencies can now use semver resolution using `#semver:^1.2.3`

* `--cache-min` and `--cache-max` have been deprecated. Use `--offline`, `--prefer-offline`, and `--prefer-online` instead. `--cache-min=9999+` and `--cache-max=0` have been aliased to `--prefer-offline` and `--prefer-online`, respectively.

* npm will now obey HTTP caching headers sent from registries and other remote HTTP hosts, and will use standard HTTP caching rules for its local cache.

* `prepublishOnly` now runs *before* packing the tarball.

* npm no longer supports node@<4.
zkat added a commit that referenced this issue Apr 22, 2017
zkat added a commit that referenced this issue Apr 23, 2017
zkat added a commit that referenced this issue Apr 27, 2017
fix(doctor): updated doctor command and its tests

@zkat zkat commented Jun 12, 2017

npm5 supports this now through the --offline flag, which will error if the stuff you were trying to install isn't already in the cache.

You can also use --prefer-offline to use the cache as much as possible, and hit the network only when a dependency cannot otherwise be fulfilled.
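As a sketch, assuming npm@5+, these modes can be passed per-invocation or persisted via standard npm config in .npmrc (the flag names are from the comment above):

```ini
; .npmrc -- persist the npm@5 cache modes described above
; (equivalent per-invocation: npm install --offline
;                         or: npm install --prefer-offline)

; fail outright when a dependency is not already in the local cache
offline=true

; or instead: use the cache whenever possible, hitting the network
; only when a dependency cannot otherwise be fulfilled
; prefer-offline=true
```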

@zkat zkat closed this Jun 12, 2017

@ruyadorno ruyadorno commented Jun 13, 2017

that was a long and crazy ride, thanks for the update @zkat ❤️


@danshome danshome commented Jun 26, 2017


@zkat zkat commented Jul 1, 2017

@danshome if you have a package-lock.json, yes. npm will not touch the network at all if you have a package-lock.json and a warm cache, unless you have git dependencies (that's coming, too). You don't need to do anything special for this: it'll Just Work™ with no extra settings/flags.


@tommedema tommedema commented Sep 24, 2017

@zkat why is --prefer-offline not a default option? seems to make more sense than to redownload something that is already on disk


@zkat zkat commented Sep 25, 2017

@tommedema because I think you're misunderstanding how --prefer-offline works.

Again, if you've previously downloaded the package, and the previously downloaded version is also the latest version, you don't download anything at all.

What --prefer-offline does is mainly prevent the 304 checks for the latest versions of a package: that is, if you do npm i foo, you want to make sure the latest version of that is what's installed. Those are the semantics people are used to. With --prefer-offline, you'll never even check if there's an up to date version, so unless you specify npm i foo@5.4.3, you'll never get that version.

Tarballs are only ever downloaded once if they're already cached. We don't even do 304 checks for those if we have integrity information for them. And yes, this is the case even if you don't have a package-lock.json.

So to answer your question in bullet points:

  • because --prefer-offline breaks expected semantics for users by always giving them maximally-stale data.
  • because --prefer-offline does not have any effect on whether a tarball is re-downloaded.
  • because the only network access --prefer-offline prevents you from doing consistently is 304 checks for package metadata (NOT tarballs).
  • because the network access --prefer-offline limits is relatively negligible, and is only useful when you're in network situations where you have data caps (because the requests are so small, they're unlikely to have much of an effect on speed at all). Seriously, the only reason I put the option in is for users in places like Australia, or people using cellular routers.
  • because in 99% of cases for users installing with a package-lock.json, --prefer-offline is a NOOP and will do absolutely nothing whatsoever of value for you. At all. Nada. Zilch. Nothing.

@zkat zkat commented Sep 25, 2017

And for the sake of it, I want to clarify the point of --offline as well:

  • npm@5 will automatically install from cache if it doesn't detect network, or if your router lacks external access. So if you want to use npm on a plane, you can just use it. You don't need --offline. It'll Just Work™.
  • --offline will ONLY be useful when you want to guarantee that no network access is done, and you want npm to crash when it doesn't have something in cache (this does not include git dependencies right now). So, this is what you use if you've hit your data cap and want to make damn sure npm doesn't suck up more data. Or, like, if you're temporarily tethering off your phone.

@tommedema tommedema commented Sep 25, 2017

@zkat that makes perfect sense, thanks for the elaborate explanation. Much appreciated

rzr added a commit to TizenTeam/openembedded-core that referenced this issue Nov 2, 2017
Option --no-registry seems to have been deprecated or even unsupported for ages,
while --offline fixed the problem for the install task.

Issue can be reproduced using:

  devtool add "npm://registry.npmjs.org;name=epoll;version=latest"
  bitbake epoll

  | DEBUG: Executing shell function do_install
  (...)
  | npm ERR! argv ".../node" ".../npm" "install" (...) "--production" "--no-registry"
  | npm ERR! node v6.11.0
  | npm ERR! npm  v3.10.10
  | npm ERR! registry URL is required

And also from log file ".../epoll/1.0.0-r0/npmpkg/npm-debug.log":

   silly mapToRegistry using default registry
   41 silly mapToRegistry registry null
   42 verbose stack AssertionError: registry URL is required
   42 verbose stack     at Conf.getCredentialsByURI (.../get-credentials-by-uri.js:8:3)

More relevant insights:
npm/npm#2568

Signed-off-by: Philippe Coval <philippe.coval@osg.samsung.com>

@acklenx acklenx commented Apr 23, 2018

I love the explanations and the described behavior. I am of course here because it is not what I observe.

$ npm -v
5.6.0
$ npm install puppeteer --prefer-offline

puppeteer@1.3.0 install /home/acklenx/temp/node_modules/puppeteer
node install.js

Downloading Chromium r549031 - 97 Mb [====================] 100% 0.0s
Chromium downloaded to /home/acklenx/temp/node_modules/puppeteer/.local-chromium/linux-549031

  • puppeteer@1.3.0
    updated 1 package in 56.38s

And it does the same big, slow download when I repeat the command, with or without --prefer-offline.

Is there a setting that I missed to enable this awesome caching behavior that I have desperately wanted for years?

@jalcine jalcine commented Apr 23, 2018

Just highlighting, @acklenx, that this is an issue both of Puppeteer not respecting/checking the npm/Yarn offline settings, and of you possibly not checking the Puppeteer docs ;) https://github.com/GoogleChrome/puppeteer/blob/master/docs/api.md#environment-variables


@acklenx acklenx commented Apr 24, 2018

:) Thank you. I did not check the docs well enough. I really wanted the npm behavior for all things installed with npm ;) (where puppeteer would check for a new version of Chromium and pull it only if it's newer than the cached version). It doesn't look like the environment variable offers that kind of flexibility; it's all or nothing.

Terribly sorry to bother you with my own failure to read (on a closed issue no less).


@zkat zkat commented Apr 24, 2018

@acklenx puppeteer itself can hook into this behavior by checking for npm_config_prefer_offline, npm_config_prefer_online, and npm_config_offline environment variables (process.env), when running this install. That way, they can integrate their install script with npm's own behavior (on npm5 and later).
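A minimal sketch of what that hook could look like inside an install script (assuming, as is usual for npm_config_* variables, that boolean config values arrive as the string 'true'; the function name is hypothetical):

```javascript
// Detect npm's offline-related settings from the environment variables
// npm passes to lifecycle scripts (npm5+), as described above.
function offlineMode(env) {
  if (env.npm_config_offline === 'true') return 'offline';
  if (env.npm_config_prefer_offline === 'true') return 'prefer-offline';
  if (env.npm_config_prefer_online === 'true') return 'prefer-online';
  return 'default';
}

// An install script (e.g. one that downloads a browser binary) could then
// skip or defer its network fetch:
//   if (offlineMode(process.env) === 'offline') { /* use cached copy or bail */ }
module.exports = { offlineMode };
```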


@ORESoftware ORESoftware commented May 14, 2018

is there a way to configure npm to use some packages from the cache and some packages from the registry? I'm looking to use this for local development instead of npm link.
