
Typescript: JavaScript heap out of memory #48

Closed · VI-Lukas opened this issue Feb 23, 2018 · 9 comments

@VI-Lukas commented Feb 23, 2018

Start with gulp serve.

After webpack:ts has run a few times, each run takes longer and longer, and at some point it crashes with:

Starting 'webpack:ts'...
<--- Last few GCs --->

[13936:0000020A1C0B82A0] 2906109 ms: Mark-sweep 1410.9 (1461.0) -> 1410.9 (1461.0) MB, 1618.4 / 0.0 ms last resort GC in old space requested

<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 000003CDEB725EE1
2: /* anonymous */ (aka /* anonymous */) [C:\<projectPath>\node_modules\typescript\lib\typescript.js:20797] [bytecode=0000012474DB12A9 offset=3](this=000001CA07782311 ,ElementKind=0000012985E5CA71 )
4: bindObjectLiteralExpression(aka bindObjectLiteralExpression) [C:\Users...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
1: node_module_register
2: v8::internal::FatalProcessOutOfMemory
3: v8::internal::FatalProcessOutOfMemory
4: v8::internal::Factory::NewTransitionArray
5: v8::internal::EhFrameIterator::DecodeSLeb128
6: v8::internal::JSReceiver::GetOwnPropertyDescriptor
7: v8::internal::JSReceiver::GetOwnPropertyDescriptor
8: v8::internal::JSReceiver::GetOwnPropertyDescriptor
9: v8::internal::JSReceiver::class_name
10: v8::internal::JSReceiver::GetOwnPropertyDescriptor
11: v8::internal::LookupIterator::PrepareTransitionToDataProperty
12: std::vector<v8::internal::compiler::MoveOperands * __ptr64,v8::internal::ZoneAllocator<v8::internal::compiler::MoveOperands * __ptr64> >::_Reallocate
13: std::vector<v8::internal::compiler::MoveOperands * __ptr64,v8::internal::ZoneAllocator<v8::internal::compiler::MoveOperands * __ptr64> >::_Reallocate
14: std::vector<v8::internal::compiler::MoveOperands * __ptr64,v8::internal::ZoneAllocator<v8::internal::compiler::MoveOperands * __ptr64> >::_Reallocate
15: std::vector<v8::internal::compiler::MoveOperands * __ptr64,v8::internal::ZoneAllocator<v8::internal::compiler::MoveOperands * __ptr64> >::_Reallocate
16: 0000014AEA2847A1

@janrembold (Contributor) commented Mar 19, 2018

This is not a fix, but it helps reduce how often you have to restart the build framework.
Add the parameter --max_old_space_size=4096 (or an even higher value if you have enough RAM) to your "scripts" -> "start" property, see https://github.com/biotope/biotope/blob/master/package.json#L18
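
For example, a minimal sketch (the exact start script in your project may differ):

"scripts": {
  "start": "node --max_old_space_size=4096 node_modules/.bin/gulp serve"
}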

Another possibility would be to use pump, as proposed in #59.
Maybe this helps to better clean up the memory consumed by webpack-stream (or webpack itself), but this is just a guess...
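
A sketch of what a pump-based task could look like (task name, globs, and config path here are assumptions, not the actual biotope-build code):

const gulp = require('gulp');
const pump = require('pump');
const webpackStream = require('webpack-stream');

gulp.task('webpack:ts', (done) => {
  // pump destroys every stream in the pipeline on error and forwards
  // the first error to the callback, so gulp can end the task cleanly
  pump([
    gulp.src('src/**/*.ts'),
    webpackStream(require('./webpack.config.js')),
    gulp.dest('dist')
  ], done);
});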

@janrembold (Contributor)

I removed the manual error listener and added plumber as a global error listener in the webpack-stream task. Maybe this helps the task exit cleanly and de-allocate memory correctly. Please test #68, or use https://github.com/biotope/biotope-build.git#uglify-pump-test as the @biotope/build target path in package.json (this only works for projects > 5.2.x).
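
Roughly, the idea looks like this (a sketch; task name and globs are assumptions):

const plumber = require('gulp-plumber');

gulp.task('webpack:ts', () =>
  gulp.src('src/**/*.ts')
    .pipe(plumber()) // catches stream errors globally so the pipeline is not left half-open
    .pipe(webpackStream(require('./webpack.config.js')))
    .pipe(gulp.dest('dist'))
);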

@janrembold (Contributor)

@SheepFromHeaven Following our discussion about configuring webpack entry points, I created a PoC branch that injects TS files into webpack-stream via the webpack configuration instead of the gulp pipe. Could you please test the branch https://github.com/biotope/biotope-build/tree/webpack-config-entry-points on your project files for performance and, more importantly, for this memory leak? This branch depends on the latest 5.3.2 branch.
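
The idea, roughly (entry names and paths below are made up for illustration):

// webpack.config.js: entries live in the config...
module.exports = {
  entry: {
    app: './src/app.ts'
  }
  // ...loaders, output, etc.
};

// ...so the gulp task no longer pipes every TS source into webpack-stream
gulp.task('webpack:ts', () =>
  gulp.src('src/app.ts') // only kicks off the stream; webpack reads entries from the config
    .pipe(webpackStream(require('./webpack.config.js')))
    .pipe(gulp.dest('dist'))
);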

@janrembold (Contributor)

@SheepFromHeaven Small update: the entry-points task seems to work better in general, but still has memory issues.

The next two ideas to tackle this annoying bug are:

  1. Check if webpack-stream is creating issues when it is run in parallel (overlapping) processes
  2. Update to webpack 4

@janrembold (Contributor)

@SheepFromHeaven - Today I did a lot of research to hunt down this annoying bug, and all I found out was that ts-loader might (?!) be involved. So I replaced ts-loader with awesome-typescript-loader and enabled its cache in the local tsconfig.json:

"awesomeTypescriptLoaderOptions": {
  "useCache": true
},
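
On the webpack side, the loader swap looks roughly like this (a sketch; the actual biotope-build config may differ):

// webpack.config.js
module.exports = {
  // ...
  module: {
    rules: [
      // before: { test: /\.tsx?$/, loader: 'ts-loader' }
      { test: /\.tsx?$/, loader: 'awesome-typescript-loader' }
    ]
  }
};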

In the demo-5.x branch everything runs great and extremely fast; I wasn't able to reproduce the memory leak there. But that branch isn't ideal for testing. I usually use our web-components-future branch, because with that code the bug is really simple to reproduce. The downside is that I get strange redux 4 typing errors there. :(

Could you please switch your biotope-build to the branch https://github.com/biotope/biotope-build.git#atl and check this typescript error? Afterwards it would be nice to get some "oh yeah - no more memory leak found" feedback from you! :D

@janrembold (Contributor)

@SheepFromHeaven No worries. Adding another entry to tsconfig.json fixes this error. It should be handled by includes/excludes, but isn't. My webpack task now runs in 700 ms to 1.4 s without any memory issues! But no party before Windows is checked...

"awesomeTypescriptLoaderOptions": {
		"useCache": true,
		"reportFiles": [
			"src/**/*.{ts,tsx}"
		]
},
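
(For context: reportFiles restricts the set of files awesome-typescript-loader reports diagnostics for, which is presumably why the stray errors from outside src disappear.)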

@SheepFromHeaven (Member)

@janrembold checked it too. The redux errors went away after upgrading the version.
Runs perfectly smooth, and no memory leaks so far!

Love this solution. Fast and steady.

@timomayer (Member)

Sounds good, thanks so much @janrembold! Has someone checked it on Windows?

@janrembold (Contributor)

Fixed with #97
