Do you have a large angular application that takes too long to compile / serve / compile tests? This little thing helped me:
Increasing the node process memory
By default, a Node process can use roughly 1.76 GB of RAM for its heap. This limit comes from V8's original design, which assumed that a single thread (which Node has) would not need more than about 2 GB of RAM.
However, things have become more complex in recent years: JavaScript tooling has evolved and now composes many libraries into a single toolchain, in our case the Angular compile mechanism.
And if your compilation process reaches this limit, the Node garbage collector starts alternating with the compilation process, freeing up just enough space so that ngc can execute the next piece of work.
If you're like me, you might think: Why the hell should ngc ever exceed this limit?!?
The short answer is: Because it does many things for you :)
Okay, to be a bit more specific:
- it bundles assets into your final build
- it keeps several hundred files in memory to do tree-shaking
- it compiles scss-files
- it organizes chunks into module / vendor chunks
- it separates frequently used code into a common chunk for shared use
- ...and much much more :)
So how do we get rid of that "free memory / do a bit more work / free memory"-cycle?
Increasing the node memory limit with max_old_space_size
Node has a simple flag to raise the maximum RAM consumption before the garbage collector starts aggressively freeing up memory. You can use it like this:
```shell
node --max_old_space_size=<size in MB>
```
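To confirm the flag is actually being picked up, you can ask V8 for its configured heap limit (the 4096 MB value here is just an example):

```shell
# Print V8's heap size limit in MB with the limit raised to ~4 GB
node --max_old_space_size=4096 -e \
  "console.log(require('v8').getHeapStatistics().heap_size_limit / 1024 / 1024)"
```

The reported number may be slightly higher than the flag value, since the limit also includes space for the young generation.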
As I have 24 GB of RAM in my machine, I have no problem assigning 8 GB to the Node process. But I also don't want to lose the comfort of the ng CLI, so how do we automatically pass the parameter to ng commands?
The solution is simple, but maybe not too obvious:
We call the ng CLI directly from the node_modules folder using a Node script. I called this variant "nghm" (for "ng high memory") and built it like this:

```json
"nghm": "node --max_old_space_size=8096 ./node_modules/@angular/cli/bin/ng"
```
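In context, the scripts section of a package.json could look like this (the extra convenience scripts are illustrative, not part of the original setup):

```json
{
  "scripts": {
    "nghm": "node --max_old_space_size=8096 ./node_modules/@angular/cli/bin/ng",
    "serve:hm": "npm run nghm -- serve",
    "build:hm": "npm run nghm -- build --prod --progress=false --aot=true --vendor-chunk"
  }
}
```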
Having this script in place, we can transform `ng serve` into

```shell
npm run nghm -- serve
```

which will now consume up to 8 GB of RAM. A production build could look like

```shell
npm run nghm -- build --prod --progress=false --aot=true --vendor-chunk
```
And the Numbers?
Well, that will vary from project to project, but in my project this particular change reduced the compilation time from ~3:26 min to ~1:36 min (I picked two fairly average runs from our CI system, no science here :)).
The effect might be influenced by how large your assets are, how many files are compiled and so on, but if you struggle with long compilation times, just give it a try.
Originally published at marcel-cremer.de
Top comments (10)
I believe you can use an .npmrc file at the project level and avoid altering your scripts altogether
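For reference, npm supports a node-options setting, so a project-level .npmrc carrying the flag could look like this (the 8192 MB value is just an example):

```ini
node-options=--max-old-space-size=8192
```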
Thank you for your input about .npmrc and export, guys :)
I personally don't like the export solution as it's global, but it's definitely worth mentioning (for people who always want to alter the size).
About the .npmrc file: I honestly didn't know that it's also possible to have this at project level (only in the home folder). I'll give this one a try too.
If you want to check how much RAM Node is currently allocating, you can use this:
Does anyone know how I would get this to work in Gitlab CI?
I've tried using NODE_OPTIONS in my gitlab-ci.yml and added node-options=--max-old-space-size=2048 to the .npmrc file in the project root, but whenever I do npm run-script build-prod it just reverts back to 80000 and my build never finishes on my machine. I'm using my own gitlab-runner.
I'm really at a loss.
Any help would be appreciated. Thanks.
Hmm, will definitely check this out!
Thanks for the article <3
So simple yet so effective. I like it!
It definitely seems good, but something about this has me expecting my house to explode.
Man, you have a lot of RAM. I'm still playing with 6 GB RAM :)
I have 32 GB RAM; can't I increase max_old_space_size further (beyond 8096)?