On Sun, 9 Apr 2017, Markus Trippelsdorf wrote:

> The minimum size heuristic for the garbage collector's heap, before it
> starts collecting, was last updated over ten years ago.
> It currently has a hard upper limit of 128MB.
> This is too low for current machines where 8GB of RAM is normal.
> So, it seems to me, a new upper bound of 1GB would be appropriate.

While the amount of available RAM has grown, so has the number of available CPU
cores (which counteracts the RAM growth for parallel builds). Building in a
virtualized environment with less than the host's RAM has also become more
common, I think.
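For reference, the shape of the heuristic being discussed can be sketched as
follows. This is a minimal sketch of my reading of GCC's
ggc_min_heapsize_heuristic (take 1/8 of physical RAM, clamped to a range); the
exact divisor and clamp bounds here are assumptions for illustration, not a
definitive transcription of the source:

```shell
# Hypothetical sketch of the min-heapsize heuristic (values in kB).
phys_kb=$(( 8 * 1024 * 1024 ))   # example machine: 8GB of physical RAM
min_heap_kb=$(( phys_kb / 8 ))   # assumed fraction: 1/8 of physical RAM
cap_kb=$(( 128 * 1024 ))         # the hard upper limit under discussion: 128MB
floor_kb=$(( 4 * 1024 ))         # assumed lower bound: 4MB
if [ "$min_heap_kb" -gt "$cap_kb" ]; then min_heap_kb=$cap_kb; fi
if [ "$min_heap_kb" -lt "$floor_kb" ]; then min_heap_kb=$floor_kb; fi
echo "$min_heap_kb"              # an 8GB machine already hits the 128MB cap
```

Under these assumptions any machine with 1GB of RAM or more hits the cap, which
is why raising the cap (rather than the fraction) changes behavior on current
hardware. One can also experiment without patching the compiler via the
existing --param ggc-min-heapsize / --param ggc-min-expand knobs.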

Bumping it all the way up to 1GB seems excessive; how did you arrive at that
figure? E.g. my recollection from watching a Firefox build is that most
compiler instances need under 0.5GB (RSS).

> Compile times of large C++ projects improve by over 10% due to this
> change.

Can you explain a bit more which projects you've tested? A 10+% improvement
looks surprisingly high to me.
 
> What do you think?

I wonder if it's possible to reap most of the compile-time benefit with a
somewhat more modest GC threshold increase?

Thanks.
Alexander
