Not too long ago, I made a statement like "64 GB should be enough for anybody 
;-)" (when talking about a new Lenovo micro computer).

It turns out I was wrong.  Over on the "Libre-Soc General Development"
<libre-soc-...@lists.libre-soc.org> mailing list (where, if I understand
correctly, they are working on building a new, basically "open source",
microprocessor), they have been talking about the hardware requirements for
compiling the Verilog source for part of the project.  Here are some quotes:

<quotes from various emails>
<from the project lead:>
as an experiment i extracted the vhdl using yosys into a verilog file, over
1,000,000 lines long, for compilation with verilator.

from that 1,000,000 line file verilator has produced 3,000 sub-files.
compiling even the ones with a #include and nothing else takes 10 minutes
each.  the estimate for completion of compilation is therefore several weeks.

this is not reasonable unless we have access to a beowulf cluster with
distcc.

<next>

> > about 200 cores across dozens of machines, each with between 128 and
> > 512 GB of RAM.

<next -- this is for a local machine to link those binaries after compilation>

> > if you do not have a machine with absolutely mental amounts of RAM
> > (like, 64 GB or above) we may be able to arrange something.

do not under any circumstances try linking massive binaries once distcc
gets the object files onto your machine.  if you go into swap space
(during linking) by mistake, it will cause your machine to melt.  loadavg
120 or above is not uncommon.

</quotes from various emails>
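
For context, the flow being described -- extracting the VHDL to Verilog with
yosys, then compiling verilator's output across a distcc farm -- looks roughly
like the commands below.  This is my own sketch, not something from the list:
the file and module names (top.vhdl, top, sim_main.cpp) are invented, and I am
assuming the ghdl-yosys-plugin for the VHDL input, since plain yosys does not
read VHDL.

  # hypothetical names throughout; assumes ghdl-yosys-plugin is installed
  yosys -m ghdl -p 'ghdl top.vhdl -e top; write_verilog top.v'

  # verilator splits the million-line .v into thousands of C++ sub-files
  verilator --cc top.v --exe sim_main.cpp

  # distribute the C++ compiles (DISTCC_HOSTS must list the farm machines)
  make -C obj_dir -f Vtop.mk -j200 CXX="distcc g++" Vtop

Note that distcc only distributes the compile steps; the final link of the
generated Vtop binary still runs on the local machine, which is where the
64 GB-and-up RAM requirement comes in.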

I hope I have not caused anyone problems with my misleading statement.


