On Sun, Oct 18, 2015 at 8:37 PM, Boris Zbarsky <bzbar...@mit.edu> wrote:

> On 10/18/15 7:14 PM, Nicholas Nethercote wrote:
>
>> Eventually |mach build| should just do the right
>> thing, no matter what files you've touched...
>>
>
> The problem is that definitions of "right thing" differ depending on the
> goal, right?
>

Exactly. It's not clear to me that even the perfect build system would
solve this problem (unless it were perfectly fast, in which case I don't
care about any of this). The C++ compilation model requires that we rebuild
everything that #includes a header file that changed, which can take a long
time, so I use my wetware capabilities to short-circuit that heavy rebuild
when I know it isn't needed.

I know this may void my warranty in general, but (as Jonas points out) the
only thing I really care about is making sure that the stuff I rebuilt gets
linked, which IMO is something the build system should be able to
guarantee. This may be simpler than it used to be now that (most?)
everything is linked into libxul, but I think there are still various
corner cases in the tree (correct me if I'm wrong).

I've heard two compelling arguments against the current setup:
(1) It's inconsistent.
(2) It causes unnecessary overhead when you do |mach build foo bar| because
we perform a link step between building foo and building bar.

For (1), I would be satisfied with a stripped-down version of dumbmake that
doesn't do any building, but _does_ guarantee to link everything. (2) could
be solved with a flag to |mach build| to indicate whether or not the link
step should be performed (though I think linking should be the default).

bholley
_______________________________________________
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform
