On Fri, 30 Jan 2015 17:38:40 -0800 Russ Allbery <r...@debian.org> wrote:
> Niels Thykier <ni...@thykier.net> writes:
>
> > The html_reports process itself consumes up to 2GB while processing
> > templates.  It is possible that there is nothing we can do about that
> > as there *is* a lot of data in play.  But even then, we can free it as
> > soon as possible (so we do not keep it while running gnuplot at the
> > end of the run).
>
> I think the code currently takes a very naive approach and loads the
> entire state of the world into memory, and Perl's memory allocation is
> known to aggressively trade space for speed.
>
> If instead it stored the various things it cared about in a local SQLite
> database, it would be a bit slower, but it would consume much less
> memory.  I bet the speed difference wouldn't be too bad.  And this would
> have the possibly useful side effect of creating a SQLite database full of
> interesting statistics that one could run rich queries against.
>
> --
> Russ Allbery (r...@debian.org)              <http://www.eyrie.org/~eagle/>
Hi Russ (and others),

I have been considering expanding the scope of the reporting framework
to include testing and to de-tangle suite-related data in the reports
(i.e. get a separate report for each suite).  If I am to add that, I
think a database might be the only realistic way forward with all the
bells and whistles we currently have.

However, I do not have much experience with Perl database frameworks,
so I could use some help here.

I suspect it would also make sense to obsolete the harness state cache
(the YAML file), so that most tools only need to deal with the database.

FTR, I would probably ask for a postgres database on lindsay.d.o.  That
said, SQLite support would be great for local testing.

Thanks,
~Niels
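
P.S. For concreteness, here is a minimal sketch of the kind of thing I
have in mind, using plain DBI so the same code can talk to SQLite for
local testing and to postgres on lindsay.d.o just by swapping the DSN.
The environment variable, table and column names are all made up for
illustration; the real schema would need actual design work.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Hypothetical DSN; switch to e.g. 'dbi:Pg:dbname=reports' on
    # lindsay.d.o without touching the rest of the code.
    my $dsn = $ENV{REPORTS_DB_DSN} // 'dbi:SQLite:dbname=reports.db';
    my $dbh = DBI->connect($dsn, '', '',
                           { RaiseError => 1, AutoCommit => 0 });

    # Made-up schema: one row per (suite, package, tag) occurrence.
    $dbh->do(q{
        CREATE TABLE IF NOT EXISTS tag_hits (
            suite   TEXT NOT NULL,
            package TEXT NOT NULL,
            tag     TEXT NOT NULL
        )
    });

    # Insert rows as the harness parses logs, instead of accumulating
    # everything in one giant in-memory hash.
    my $ins = $dbh->prepare(
        'INSERT INTO tag_hits (suite, package, tag) VALUES (?, ?, ?)');
    $ins->execute('sid', 'lintian', 'some-tag');
    $dbh->commit;

    # A per-suite report then becomes a plain query:
    my $rows = $dbh->selectall_arrayref(
        'SELECT package, COUNT(*) FROM tag_hits
          WHERE suite = ? GROUP BY package',
        undef, 'sid');
    printf("%s: %d tags\n", @$_) for @$rows;
    $dbh->disconnect;

That would also give us the queryable statistics database Russ
mentioned essentially for free.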