Hi, I think that checksumming might benefit some targets. It would be nice to be able to implement different "methods" for different targets, because not all methods work well in all circumstances.
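Concretely, one per-target "method" could checksum only the parts of an input that matter to that target, and remember the last-seen checksum in a marker file so that a rebuild is triggered only when the relevant content actually changes. A rough shell sketch of that idea (all file and macro names here are made up for illustration):

```shell
#!/bin/sh
# Sketch: content-based (md5) change detection with a marker file,
# plus a per-target "filter" so only relevant changes count.
# config.h, FEATURE_X, FEATURE_Y are hypothetical names.

header=config.h
marker=.config.h.md5

# Create a sample feature header so the sketch is self-contained.
printf '#define FEATURE_X 1\n#define FEATURE_Y 0\n' > "$header"

# The "filter": checksum only the macros this target cares about,
# so edits to unrelated features do not look like a change.
relevant_sum() {
    grep '^#define FEATURE_X ' "$header" | md5sum | cut -d' ' -f1
}

# The marker file remembers the last checksum; record and report
# whether the relevant content changed since the previous run.
content_changed() {
    new=$(relevant_sum)
    old=$(cat "$marker" 2>/dev/null)
    if [ "$new" != "$old" ]; then
        printf '%s\n' "$new" > "$marker"
        return 0          # changed: dependents need a rebuild
    fi
    return 1              # unchanged: skip the rebuild
}

content_changed && echo "rebuild"        # no marker yet -> rebuild
touch "$header"                          # timestamp changes...
content_changed || echo "up to date"     # ...but content did not

# Flip an *irrelevant* feature: still no rebuild for this target.
sed -i 's/FEATURE_Y 0/FEATURE_Y 1/' "$header"
content_changed || echo "up to date"

# Flip the feature this target depends on: now a rebuild is due.
sed -i 's/FEATURE_X 1/FEATURE_X 0/' "$header"
content_changed && echo "rebuild"
```

In a makefile, the marker file (rather than the header itself) would be listed as the prerequisite, regenerated by a rule that runs on every invocation but only rewrites the marker when the checksum differs — which is exactly the "ugly and complicated" emulation described below.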
I have one example where every single file in a huge build includes one particular header file. That header defines the macros for the features that are enabled or disabled in the build. We know which features each component uses, so in theory we could work out not to rebuild components that are not influenced by what happened to the header file; e.g. we could switch on a feature, or add a new one, without forcing a rebuild of the entire source base. This requires something like md5, but also some kind of "filter" to determine which kinds of changes are significant to the particular target whose dependency you are testing.

You can emulate md5 checksum dependencies in make, of course, using temporary marker files, but it's a bit ugly and complicated.

Regards,
Tim

2009/9/29 Giuseppe Scrivano <gscriv...@gnu.org>:
> Philip Guenther <guent...@gmail.com> writes:
>
>> (Have you measured how often this sort of thing would save
>> recompilation and/or relinking and how much time it would save then?
>> What's the comparison to how much time would be spent calculating the
>> checksums? If it saves a minute once every 100 compiles but costs a
>> second in each of those, then it's a net loss...)
>
> I don't have numbers but I think it can save a lot of time in the
> linking phase, that is *really* slow.
>
> Best,
> Giuseppe
>
> _______________________________________________
> Bug-make mailing list
> Bug-make@gnu.org
> http://lists.gnu.org/mailman/listinfo/bug-make