Process out of memory - Webpack #1914

Closed
andrewhathaway opened this Issue Jan 22, 2016 · 111 comments

@andrewhathaway commented Jan 22, 2016

After a couple of builds running in watch mode, webpack crashes because it runs out of memory.

<--- Last few GCs --->

 9223585 ms: Scavenge 1390.0 (1454.7) -> 1390.0 (1454.7) MB, 9.0 / 0 ms (+ 0.7 ms in 1 steps since last GC) [allocation failure] [incremental marking delaying mark-sweep].
 9224727 ms: Mark-sweep 1390.0 (1454.7) -> 1388.3 (1452.7) MB, 1142.7 / 0 ms (+ 9.8 ms in 89 steps since start of marking, biggest step 2.5 ms) [last resort gc].
 9225694 ms: Mark-sweep 1388.3 (1452.7) -> 1388.8 (1454.7) MB, 966.6 / 0 ms [last resort gc].


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x35903c44a49 <JS Object>
    1: walkFunctionDeclaration [<PROJECT_DIR>/node_modules/webpack/lib/Parser.js:~443] [pc=0xa07a14ec8ee] (this=0x59f67991119 <a Parser with map 0x71f2d115d49>,statement=0x3507a80af661 <a Node with map 0x71f2d1157c9>)
    2: walkStatement [<PROJECT_DIR>/node_modules/webpack/lib/Parser.js:~348] [pc=0xa07a06dfc10] (this=0x59f6799111...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - process out of memory
Abort trap: 6

@bebraw added the bug label Jan 22, 2016

@SpaceK33z (Member) commented Jan 22, 2016

I'm having this too with webpack-dev-server on Debian, but I haven't had time to debug it further.
Do you have a minimal webpack config where you can reproduce this? It could also be an external loader or plugin.

@sokra (Member) commented Jan 23, 2016

Don't use hashes when watching, i.e. [chunkhash] causes all assets to be stored.
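For illustration, here is a minimal sketch of that advice (the NODE_ENV check and the exact filename patterns are assumptions, not something prescribed in this thread): hashed filenames only for production builds, so watch/dev-server rebuilds overwrite the same files instead of accumulating new ones.

// webpack.config.js (sketch)
const isProduction = process.env.NODE_ENV === 'production';

module.exports = {
  // ... entry, loaders, plugins as before
  output: {
    path: __dirname + '/dist',
    filename: isProduction ? '[name].[chunkhash].js' : '[name].js',
    chunkFilename: isProduction ? '[id].[chunkhash].js' : '[id].js'
  }
};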

@SimenB (Contributor) commented Jan 23, 2016

@sokra In the documentation for css-loader it says to use hash for easier debugging wrt css modules. Does that have the same problem?

You can configure the generated ident with the localIdentName query parameter (default [hash:base64]). Example: css-loader?localIdentName=[path][name]---[local]---[hash:base64:5] for easier debugging.

We've also started to see this problem in dev (out of memory), but our only hash in dev mode is the css one.

EDIT: Of course it uses hash by default, so I'm guessing it's not a problem.

@sokra (Member) commented Jan 23, 2016

Does that have the same problem?

No. Only filenames.

You can compare it with webpack --watch. If this generates many files into your output path, then webpack-dev-server generates many files in the memory-fs.

If this is not the issue, you can increase the node.js memory (it defaults to 1.7 GB, which can be too little for big builds).

@SimenB (Contributor) commented Jan 23, 2016

I'll test at work on Monday! If the output file has the same name as the old one, then it's overwritten, right?

@sokra (Member) commented Jan 23, 2016

jup.

You can navigate to http://localhost:8080/webpack-dev-server to see a list of all files in memory.

@SimenB (Contributor) commented Jan 23, 2016

Ah, didn't know that! Using a small project I have on my own computer, I notice that all of the hot-update files get added.

[screenshot of the in-memory file list showing the accumulated hot-update files]

Could they be a problem? Is there any way of clearing them out after they've been there for x amount of time? If we develop for a few hours, there will probably be quite a few of them in the end.

@sokra (Member) commented Jan 23, 2016

Could they be a problem?

They could, but they are usually pretty small.

@SpaceK33z (Member) commented Jan 23, 2016

I think clearing them out like @SimenB said would be a nice option, or maybe there is some way to delete all in-memory files that are no longer in use?

@sokra (Member) commented Jan 23, 2016

jup. Write a plugin...
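As a rough illustration of such a plugin, here is a minimal sketch (not an official plugin; the 'done' hook, the memory-fs calls, and the hot-update file matching are assumptions based on the webpack 1.x-era API, and pruning too aggressively can break clients that still need an older update):

// prune-hot-updates-plugin.js (sketch)
function PruneHotUpdatesPlugin() {}

PruneHotUpdatesPlugin.prototype.apply = function(compiler) {
  compiler.plugin('done', function(stats) {
    var fs = compiler.outputFileSystem; // memory-fs when run by webpack-dev-server
    var outputPath = compiler.options.output.path;
    var emitted = Object.keys(stats.compilation.assets);

    if (!fs || typeof fs.readdirSync !== 'function' || !fs.existsSync(outputPath)) return;

    fs.readdirSync(outputPath).forEach(function(file) {
      // drop hot-update files that were not emitted by the build that just finished
      if (/\.hot-update\.(js|json)$/.test(file) && emitted.indexOf(file) === -1) {
        fs.unlinkSync(outputPath + '/' + file);
      }
    });
  });
};

module.exports = PruneHotUpdatesPlugin;

It would be registered like any other plugin, e.g. plugins: [new PruneHotUpdatesPlugin()].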

@sokra (Member) commented Jan 23, 2016

But deleting them too early can break stuff...

@deepakmani commented Apr 26, 2016

We have the exact same memory leak problem when using webpack -w on Mac OS X. Can you please suggest the best way to fix or work around this issue? The ngOfficeUIFabric issue seems to deal with a different problem.

Thanks,
Deepak

@luchillo17 commented Apr 26, 2016

I have the same issue as @deepakmani on OS X El Capitan. However, I'm using Ionic 2, whose config doesn't use css-loader; instead it just uses ionic-gulp-sass-build, so I don't know if the error happens because of webpack or that gulp plugin. I'm also not using HMR:

const path = require('path')
const PrefetchPlugin = require('webpack/lib/PrefetchPlugin')
const ProvidePlugin = require('webpack/lib/ProvidePlugin')
const DedupePlugin = require('webpack/lib/optimize/DedupePlugin')
const UglifyJsPlugin = require('webpack/lib/optimize/UglifyJsPlugin')
const WebpackNotifierPlugin = require('webpack-notifier')


module.exports = {
  entry: {
    polyfills: './app/polyfills.ts',
    vendor: './app/vendor.ts',
    app: path.resolve('app/app')
  },
  output: {
    path: path.resolve('www/build/js'),
    filename: '[name].bundle.js',
    publicPath: "/",
    chunkFilename: "[id].bundle.js",
    pathinfo: false, // show module paths in the bundle, handy for debugging
  },
  devtool: 'eval',
  module: {
    loaders: [
      {
        test: /\.ts$/,
        loader: 'awesome-typescript',
        query: {
          doTypeCheck: true,
          resolveGlobs: false,
          externals: ['typings/main.d.ts']
        },
        include: path.resolve('app'),
        exclude: /node_modules/
      }
      ,{
        test: /\.js$/,
        include: path.resolve('node_modules/angular2'),
        loader: 'strip-sourcemap'
      }
    ],
    noParse: [
      /es6-shim/,
      /reflect-metadata/,
      /zone\.js(\/|\\)dist(\/|\\)zone/
    ]
  },
  plugins: [
    new PrefetchPlugin('rxjs'),
    new ProvidePlugin({
      Chart: 'chart.js/Chart.min.js'
    }),
    new DedupePlugin()
    // ,new UglifyJsPlugin({
    //    // to debug prod builds uncomment //debug lines and comment //prod lines
    //   // beautify: true,//debug
    //   // mangle: false,//debug
    //   // dead_code: false,//debug
    //   // unused: false,//debug
    //   // deadCode: false,//debug
    //   // compress : { screw_ie8 : true, keep_fnames: true, drop_debugger: false, dead_code: false, unused: false, }, // debug
    //   // comments: true,//debug

    //   beautify: false,//prod
    //   // disable mangling because of a bug in angular2 beta.1, beta.2 and beta.3
    //   // TODO(mastertinner): enable mangling as soon as angular2 beta.4 is out
    //   // mangle: { screw_ie8 : true },//prod
    //   mangle: false,
    //   // dead_code : true,
    //   compress : { 
    //     dead_code : true,
    //     screw_ie8 : true
    //   },//prod
    //   comments: false//prod
    // })
    ,new WebpackNotifierPlugin({
      title: 'tao_app_ionic',
      alwaysNotify: true
      // contentImage: path.join(appDir, 'images/notifier.png')
    })
  ],
  resolve: {
    alias: {
      'angular2': path.resolve('node_modules/angular2')
    },
    extensions: ['', '.js', '.ts']
  }
};
@guruprasanth5 commented May 26, 2016

I am having the same issue. Is there any hack that can fix this for now?

joschi added a commit to Graylog2/graylog2-server that referenced this issue Jun 29, 2016

Fix memory problems with webpack-dev-server in dev mode (#2433)
Before this webpack was using hashes in filenames for the generated
files which are stored in memory. That meant on every change new files
have been generated and stored in memory, quickly filling up the
available memory.

With this change we do not use hashes for filenames in development mode
to avoid filling up memory.

Refs webpack/webpack#1914

joschi added a commit to Graylog2/graylog2-server that referenced this issue Jun 29, 2016

Fix memory problems with webpack-dev-server in dev mode (#2433)
Before this webpack was using hashes in filenames for the generated
files which are stored in memory. That meant on every change new files
have been generated and stored in memory, quickly filling up the
available memory.

With this change we do not use hashes for filenames in development mode
to avoid filling up memory.

Refs webpack/webpack#1914
(cherry picked from commit 3ab5918)
@thunderkid commented Oct 4, 2016

Having the same problem running webpack via webpack-stream in gulp. I use gulp.watch to rerun on code changes, which then just runs

    return gulp.src(tsStartFile)
              .pipe(webpackStream(webPackConfig))
              .on('error', logError) 
              .pipe(gulp.dest(outputDir))

I'd have thought this would run webpack from scratch each time, but after 5 or so builds I get the out of memory error and have to rerun my gulp file.

@Pines-Cheng commented Oct 17, 2016

Has anyone resolved this problem?

ivanov added a commit to rgbkrk/nteract that referenced this issue Nov 10, 2016

@tdsmithATabc commented Nov 15, 2016

Another data point here; I have filename hashes disabled and my build isn't terribly large (it uses Angular 2, but few other dependencies), but after about an hour of dev work the watch dies with Abort trap: 6.

I've temporarily increased the memory limit, which gives me a lot more time, but I'd rather not be consuming gigabytes just to run a watch command.

FYI if anyone is curious about upping the memory limit, I did it like so in my package.json (obviously the additional flags are dependent on your needs):

{
  "scripts": {
    "start": "node --max_old_space_size=4096 node_modules/webpack-dev-server/bin/webpack-dev-server.js --inline --progress --port 3000 --open"
  }
}
@mikiest commented Nov 17, 2016

Having the same issue as @thunderkid here. Any update?

@rochdev commented Jan 11, 2017

Same issue for a very small project

@johntran commented Jun 28, 2018

If you're using ts-loader, adding a tsconfig.json to the root folder fixed it for me.

@athornton commented Jun 30, 2018

This has suddenly started happening to me during Jupyter Lab extension builds.

@saulshanabrook commented Jul 2, 2018

Does anyone have tips on how to profile what is using memory during a webpack build?

@bgever commented Jul 3, 2018

Just like you can set an explicit memory limit as indicated above, you can also activate the Node debugger. Then you can easily attach to it from Chrome DevTools and start profiling. This is also how I uncovered a circular loop issue (in our own code), by CPU profiling.
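For example (a sketch only; the script name and flags are placeholders modeled on the earlier package.json examples in this thread), running the dev server under the inspector so it shows up on Chrome's chrome://inspect page:

{
  "scripts": {
    "debug": "node --inspect node_modules/webpack-dev-server/bin/webpack-dev-server.js --port 3000"
  }
}

From DevTools you can then take heap snapshots between rebuilds to see which objects keep growing.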

@sibelius commented Jul 3, 2018

ESLint can detect circular dependencies.

Do circular dependencies cause trouble?

@firestar300 commented Jul 20, 2018

Same problem here. I use Webpack (Dev Server) with Pug templates and HTMLWebpackPlugin.

 94% asset optimization
<--- Last few GCs --->

[5239:0x102801e00]   213342 ms: Mark-sweep 1364.4 (1412.2) -> 1364.2 (1412.2) MB, 104.2 / 0.0 ms  allocation failure GC in oldspace requested
[5239:0x102801e00]   213448 ms: Mark-sweep 1364.2 (1412.2) -> 1364.2 (1379.2) MB, 106.5 / 0.0 ms  last resort GC in old space requested
[5239:0x102801e00]   213542 ms: Mark-sweep 1364.2 (1379.2) -> 1364.2 (1379.2) MB, 94.3 / 0.0 ms  last resort GC in old space requested


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x15d329da55e9 <JSObject>
    2: replace(this=0x15d3dcff1d09 <Very long string[21267994]>,0x15d3ddf66541 <String[32]: var HTML_WEBPACK_PLUGIN_RESULT =>,0x15d333582431 <String[0]: >)
    3: evaluateCompilationResult [/Volumes/APPLE SD Card/Projets/.../node_modules/html-webpack-plugin/index.js:240] [bytecode=0x15d3ddf668c1 offset=40](this=0x15d3c80275d1 <HtmlWebpackPlugin map = 0x...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
 1: node::Abort() [/Users/.../.nvm/versions/node/v9.11.2/bin/node]
 2: node::FatalTryCatch::~FatalTryCatch() [/Users/.../.nvm/versions/node/v9.11.2/bin/node]
 3: v8::internal::V8::FatalProcessOutOfMemory(char const*, bool) [/Users/.../.nvm/versions/node/v9.11.2/bin/node]
 4: v8::internal::Factory::NewRawTwoByteString(int, v8::internal::PretenureFlag) [/Users/.../.nvm/versions/node/v9.11.2/bin/node]
 5: v8::internal::String::SlowFlatten(v8::internal::Handle<v8::internal::ConsString>, v8::internal::PretenureFlag) [/Users/.../.nvm/versions/node/v9.11.2/bin/node]
 6: v8::internal::String::IndexOf(v8::internal::Isolate*, v8::internal::Handle<v8::internal::String>, v8::internal::Handle<v8::internal::String>, int) [/Users/.../.nvm/versions/node/v9.11.2/bin/node]
 7: v8::internal::Runtime_StringIndexOfUnchecked(int, v8::internal::Object**, v8::internal::Isolate*) [/Users/.../.nvm/versions/node/v9.11.2/bin/node]
 8: 0x1c0837c042fd
[1]    5238 abort      npm start
@liangwenzhong commented Aug 17, 2018

Any solution to this?

@WhoAteDaCake commented Aug 24, 2018

Still happening on webpack 4.17.1

@pangwa commented Aug 27, 2018

+1...

@buildbreakdo commented Aug 28, 2018

What pushes a build over the 1400 MB of RAM allocated by Node will be different for each project. There is no webpack bug here. However, there could be a bug in your build configuration (like compiling node_modules).

Or maybe, just maybe, the project is just that big and the build process requires more RAM than Node's default 1400 MB. If you would like to increase the RAM allocated to Node, see the instructions above: #1914 (comment)

@sibelius commented Aug 28, 2018

@buildbreakdo What if the build works great, and then after some hot module reloads it keeps using more memory and then crashes?

Is this a bug in hot module reloading?

@damassi commented Aug 28, 2018

I've been noticing a similar issue with HMR; eventually the process will time out.

@camsjams commented Aug 28, 2018

I had this issue a year ago (see above in July 2017).

Some things I've done to our project since then - this issue has gone away entirely, confirmed on Mac, Linux, and Windows 10 dev workstations:

  1. Upgraded to Node 8+
  2. Upgraded to webpack v4+ (version 3 was causing some issues for me)
  3. Upgraded to webpack-dev-middleware v3+
    • Ensure the config for this is { watchOptions: { poll: false }, stats: 'minimal' }
    • poll: false is important if you are using Docker 🐳 (for those who care, I'm also using an Alpine image, node:8.11.4-alpine); it solves some performance hiccups
  4. If you are using the ESLint plugin, make sure it is v4+; especially if it's part of the webpack config, ensure any non-source files are ignored:
    • node_modules
    • coverage
    • build
    • static

Also still using react-transform-hmr v1.0.4, no issues with this version.

Another final note: If you use WebStorm, I would disable "safe write" as it usually causes performance issues, most often when your hard drive is symlinked into a Docker container. Similar settings may exist in other IDEs.

For what it's worth, here is a snippet of the webpack config:

{
// ... other config options here
module: {
		rules: [
			{
				test: /\.jsx?$/,
				exclude: /node_modules/,
				enforce: 'pre',
				use: [
					{
						loader: 'eslint-loader'
					}
				]
			},
			{
				test: /\.jsx?$/,
				exclude: /node_modules/,
				use: [
					{
						loader: 'babel-loader',
						options: {
							cacheDirectory: !isProduction // set elsewhere in file
						}
					}
				]
			}
		]
	},
	resolve: {
		extensions: ['.json', '.js', '.jsx']
	},
	plugins: [
		new webpack.IgnorePlugin(/^\.\/locale$/, /moment$/), // only if you use moment timezone
		new webpack.HotModuleReplacementPlugin() // only for dev mode
	]
}
@Pauan commented Sep 5, 2018

@buildbreakdo A better option is to do this:

"scripts": {
    "build": "cross-env NODE_OPTIONS=--max_old_space_size=8192 webpack"
}

(It uses the cross-env package.)

The nice thing about using NODE_OPTIONS is that it will affect everything, including any sub-processes that Webpack might spawn.

(It's useful for non-Webpack things as well: we're using it for TypeDoc, which consumes a lot of RAM.)

@tomquirk commented Sep 12, 2018

+1. Is someone able to give a definitive solution outside of setting max_old_space_size?

@Jackson7683 commented Sep 17, 2018

@Pauan Your solution works for me :)

@garrettmaring commented Sep 27, 2018

Just adding my experience here. I did have issues with compilation on a t1.micro instance, but I did not on a t2.small.

@thunderkid commented Sep 27, 2018

My problems appear to have gone away completely after upgrading to webpack 4 and eliminating gulp.

Previously I was using webpack 3 via gulp and webpack-stream, and I had to restart manually after every 4 compilations. I am now using webpack 4 directly with plugins as required - but no gulp involved - and everything keeps running smoothly for 100+ compilation cycles. I haven't done anything special with configuring space-size etc., but just built up a completely clean environment starting with the webpack/typescript getting-started examples.

@dmljc commented Nov 15, 2018

Another data point here; I have filename hashes disabled and my build isn't terribly large (it uses Angular 2, but few other dependencies), but after about an hour of dev work the watch dies with Abort trap: 6.

I've temporarily increased the memory limit, which gives me a lot more time, but I'd rather not be consuming gigabytes just to run a watch command.

FYI if anyone is curious about upping the memory limit, I did it like so in my package.json (obviously the additional flags are dependent on your needs):

{
  "scripts": {
    "start": "node --max_old_space_size=4096 node_modules/webpack-dev-server/bin/webpack-dev-server.js --inline --progress --port 3000 --open"
  }
}

The key part: --max_old_space_size=4096

@aprilmintacpineda commented Jan 3, 2019

If you are using an SSD, you can use the writeToDisk option like so:

devServer: {
  host: process.env.HOST,
  port: process.env.PORT,
  overlay: true,
  writeToDisk: true,
  compress: true
}
@PlayMa256 (Member) commented Jan 3, 2019

If you are using an SSD, you can use the writeToDisk option like so:

devServer: {
  host: process.env.HOST,
  port: process.env.PORT,
  overlay: true,
  writeToDisk: true,
  compress: true
}

This makes the bundle go directly to disk, to escape the use of memory-fs, which is leaky and flaky. That is basically what webpack-plugin-serve does. But even when using webpack watch mode, the leak still persists in some of the cases that were already reported above.

@aprilmintacpineda commented Jan 4, 2019

Hmmm... I haven't experienced a leak that's directly related to webpack or webpack-dev-server after using that configuration. But I did experience a leak related to one of the webpack loaders I use, which was misconfigured.

@sibelius commented Jan 4, 2019

Which loader?

@idegree commented Jan 16, 2019

export NODE_OPTIONS=--max_old_space_size=4096

Works for me.

@Delagen commented Jan 16, 2019

@idegree I used this for a while, but it broke some programs on Windows. It took me about an hour to figure out that GeForce Experience would not start because of this environment variable.
I hope on Linux everything is OK :)
