Niels Thykier <ni...@thykier.net> writes:

> The html_reports process itself consumes up to 2GB while processing
> templates. It is possible that there is nothing we can do about that
> as there *is* a lot of data in play. But even then, we can free it as
> soon as possible (so we do not keep it while running gnuplot at the
> end of the run).
I think the code currently takes a very naive approach and loads the
entire state of the world into memory, and Perl's memory allocation is
known to aggressively trade space for speed.  If it instead stored the
various things it cares about in a local SQLite database, it would be a
bit slower, but it would consume much less memory.  I bet the speed
difference wouldn't be too bad.

And this would have the possibly useful side effect of creating a
SQLite database full of interesting statistics that one could run rich
queries against.

-- 
Russ Allbery (r...@debian.org)             <http://www.eyrie.org/~eagle/>

-- 
To UNSUBSCRIBE, email to debian-bugs-dist-requ...@lists.debian.org
with a subject of "unsubscribe". Trouble? Contact listmas...@lists.debian.org
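For what it's worth, a minimal sketch of the idea above (written in Python
with the stdlib sqlite3 module for brevity rather than Perl/DBD::SQLite; the
table name, columns, and sample rows are all hypothetical, not lintian's
actual schema):

```python
import sqlite3

# Instead of accumulating one huge in-memory structure, stream each
# record into a local SQLite database and query it afterwards.
conn = sqlite3.connect(":memory:")  # a file path here would persist the stats
conn.execute("CREATE TABLE tags (package TEXT, tag TEXT, severity TEXT)")

# Hypothetical sample rows standing in for the real per-package data.
records = [
    ("foo", "spelling-error", "minor"),
    ("foo", "no-copyright-file", "serious"),
    ("bar", "spelling-error", "minor"),
]
conn.executemany("INSERT INTO tags VALUES (?, ?, ?)", records)
conn.commit()

# Rich ad-hoc queries then come for free, e.g. tag counts per severity.
for severity, count in conn.execute(
        "SELECT severity, COUNT(*) FROM tags "
        "GROUP BY severity ORDER BY severity"):
    print(severity, count)
```

Only the rows SQLite is currently working on need to live in memory, and the
resulting database file doubles as the queryable statistics dump mentioned
above.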