On Mon, Aug 25, 2014 at 8:27 PM, Bill McCloskey <wmcclos...@mozilla.com> wrote:
> Even if a full no-op build took no time, partial builds are still useful. 
> Often my workflow looks something like this:
>
> change files in directory D
> rebuild only D, get a list of errors to fix
> ...iterate until no more errors
> try to rebuild a few related directories, fixing errors there
> then rebuild the entire tree, hopefully without errors
>
> Often the changes to directory D are to header files that are included all 
> over the tree. If I could only do full rebuilds, I would have to wait for a 
> bunch of unrelated files to compile before I could see if the directory I'm 
> interested in works. Then, if fixing an error required touching one of
> the header files again, I would have to wait for a ton of files to
> recompile. A process that would have taken 20 minutes could be drawn
> out into an hour.
>
> This happens really often when changing the JS engine or xpconnect, since 
> files like jsapi.h, jsfriendapi.h, and xpcpublic.h are included almost 
> everywhere.

FWIW, I've often made changes like this when touching files like
nsCOMPtr.h or nsINode.h -- or when switching nsresult from a typedef to
an enum class! -- and I find that just running ./mach build binaries
works fine. It reports errors from all over the tree in no particular
order, but that hasn't been a problem for me. What's the advantage of
going directory by directory?
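
For concreteness, a minimal sketch of the two workflows in mach terms,
assuming mach build accepts a relative source directory as a build
target, as it does in mozilla-central (the js/xpconnect/src path below
is only an illustrative example):

  # Partial build: compile a single directory and iterate on its
  # errors; substitute whichever directory is being changed.
  ./mach build js/xpconnect/src

  # Full C/C++ rebuild: recompile and relink binaries across the
  # whole tree, e.g. after changing a widely included header.
  ./mach build binaries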