On Monday 2014-08-25 10:27 -0700, Bill McCloskey wrote:
> Even if a full no-op build took no time, partial builds are still useful. 
> Often my workflow looks something like this:
> 
> change files in directory D
> rebuild only D, get a list of errors to fix
> ...iterate until no more errors
> try to rebuild a few related directories, fixing errors there
> then rebuild the entire tree, hopefully without errors

A similar problem that's been bugging me lately: if I run
"mach build binaries" and it hits a compiler error, then fix the
error and rerun "mach build binaries", I have no idea how long I
need to wait to find out whether I actually fixed the error, since
the rebuild compiles things in a different order.  The
mostly-deterministic order that the build system used to provide was
an advantage in that case.

I tend to work around this by building specific directories first,
because then I know when I'm past the part of the build that's
likely to produce the error.  I usually start with the directory
where I modified .cpp files, since that's likely to catch all of the
errors, even if the .h files I modified require rebuilding much of
the tree.

On the other hand, if the build system prioritized .cpp
dependencies ahead of .h dependencies and automatically built the
things most likely to break first, I wouldn't have to worry about
this at all.
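
Roughly, I'm imagining something like the toy sketch below (not the
real mozbuild backend; the Target class, the schedule() helper, and
the file names are made up for illustration): among the targets that
need rebuilding, compile the ones whose own .cpp was edited before
the ones that are only dirty because they include an edited .h.

    from dataclasses import dataclass
    from typing import List


    @dataclass
    class Target:
        # Hypothetical stand-in for whatever the build backend tracks.
        obj: str            # object file it produces
        source: str         # the .cpp that produces it
        headers: List[str]  # headers it depends on


    def schedule(dirty_targets: List[Target],
                 edited_files: List[str]) -> List[Target]:
        """Order dirty targets so directly-edited .cpp files compile first."""
        edited = set(edited_files)

        def priority(t: Target) -> int:
            # 0: its own .cpp was edited -- most likely to hit the error
            #    I'm trying to fix
            # 1: only dirty because one of its included .h files changed
            return 0 if t.source in edited else 1

        return sorted(dirty_targets, key=priority)


    if __name__ == "__main__":
        targets = [
            Target("a.o", "a.cpp", ["common.h"]),
            Target("b.o", "b.cpp", ["common.h"]),
        ]
        # common.h and b.cpp were edited: b.o is compiled before a.o, so a
        # compile error in b.cpp shows up right away rather than after a.o.
        for t in schedule(targets, ["common.h", "b.cpp"]):
            print(t.obj)

That way a typo in the .cpp I just touched surfaces in the first few
compile steps instead of at some unpredictable point in a tree-wide
rebuild.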

-David

-- 
𝄞   L. David Baron                         http://dbaron.org/   𝄂
𝄢   Mozilla                          https://www.mozilla.org/   𝄂
             Before I built a wall I'd ask to know
             What I was walling in or walling out,
             And to whom I was like to give offense.
               - Robert Frost, Mending Wall (1914)
