A clean build, i.e., after removing the MOZ_OBJ directory manually and
running
../mach configure; ../mach build
produced the TB binary successfully with the new 0.2.8-alpha.0.
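For reference, the clean rebuild was roughly the following (a sketch;
$MOZ_OBJ is just a placeholder for my local objdir path):

  # remove the object directory for a truly clean build
  rm -rf "$MOZ_OBJ"

  # then, from the usual invocation directory:
  ../mach configure
  ../mach build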
On the subsequent rebuild, AFTER I removed the MOZ_OBJ directory again,
sccache reported the following statistics, so presumably caching works.
(I have no idea what the ONE cache miss is about.)
++ sccache --show-stats
Compile requests                 227
Compile requests executed        171
Cache hits                       170
Cache misses                       1   <--- Hmm?
Cache timeouts                     0
Cache read errors                  0
Forced recaches                    0
Cache write errors                 0
Compilation failures               0
Cache errors                       0
Non-cacheable compilations         0
Non-cacheable calls               56
Non-compilation calls              0
Unsupported compiler calls         0
Average cache write            0.000 s
Average cache read miss        2.071 s
Average cache read hit         0.158 s
Cache location                 Local disk: "/KERNEL-SRC/sccache-dir"
Cache size                         3 GiB
Max cache size                     6 GiB
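By the way, to make the per-build numbers easier to read, the counters
can be reset before a rebuild; a minimal sketch using the documented
--zero-stats and --show-stats flags:

  sccache --zero-stats    # reset the statistics counters
  ../mach build           # rebuild
  sccache --show-stats    # hits/misses for this build only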
Now, what was the problem with the older 0.2.2?
I am not entirely sure, since the error messages from the old sccache
were not reliable.
Sometimes it simply said it failed due to a communication failure, as
noted in the original post.
But sometimes it mentioned that clang failed to write to the output
stream (as if the file system were filling up).
Well, I had several hundred megabytes free for the object file
directories, so I doubt that.
But I relented and moved the cache to a different place, just in case,
as an experiment. (That file system is used for storing the Linux
kernel source and had about 25 GB free before the sccache cache was
moved there. I set the upper limit to 6 GB to see whether it had any
bearing on this problem, but as you can see the used cache is only
3 GiB, so the size limit is not the issue as far as I can tell.)
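Moving the cache amounted to pointing SCCACHE_DIR elsewhere and capping
its size with SCCACHE_CACHE_SIZE (both documented environment
variables); roughly:

  # stop the running server so it picks up the new settings
  sccache --stop-server

  # relocate the cache and cap its size
  export SCCACHE_DIR=/KERNEL-SRC/sccache-dir
  export SCCACHE_CACHE_SIZE=6G

  # the next compile (or this) restarts the server with the new values
  sccache --start-server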
But what I think is relevant is the following message, which I saw
during my transition from 0.2.2 to 0.2.8-alpha.0.
I saw it from 0.2.8-alpha.0 but not from 0.2.2; I only show the initial
portion since the whole message line is very long.
Note the phrase "CannotCache" near the beginning.
DEBUG:sccache::server: parse_arguments: CannotCache(incremental):
["--crate-name", "style", "servo/components/style/lib.rs", "--color",
"always", "--crate-type", "lib", "--emit=dep-info,link", "-C",
"opt-level=1", "-C", "panic=abort", "-C", "debuginfo=2", "-C",
"debug-assertions=on", "--cfg", "feature=\"bindgen\"", "--cfg",
"feature=\"fallible\"", "--cfg", "feature=\"gecko\"", "--cfg",
"feature=\"gecko_debug\"", "--cfg", "feature=\"nsstring\"", "--cfg",
"feature=\"num_cpus\"", "--cfg", "feature=\"regex\"", "--cfg",
"feature=\"style_traits\"", "--cfg", "feature=\"toml\"", "--cfg",
"feature=\"use_bindgen\"", "-C", "metadata=fe694db6e1bf8757", "-C",
"extra-filename=-fe694db6e1bf8757", "--out-dir",
"/NREF-COMM-CENTRAL/mozilla/objdir-tb3/toolkit/library/x86_64-unknown-linux-gnu/debug/deps",
"--target", "x86_64-unknown-linux-gnu", "-C",
"linker=/NREF-COMM-CENTRAL/mozilla/build/cargo-linker", "-C",
"incremental=/NREF-COMM-CENTRAL/mozilla/objdir-tb3/toolkit/library/x86_64-unknown-linux-gnu/debug/incremental",
"-L",
"dependency=/NREF-COMM-CENTRAL/mozilla/objdir-tb3/toolkit/library/x86_64-unknown-linux-gnu/debug/deps",
"-L",
"dependency=/NREF-COMM-CENTRAL/mozilla/objdir-tb3/toolkit/library/debug/deps",
"--extern",
"unicode_bidi=/NREF-COMM-CENTRAL/mozilla/objdir-tb3/toolkit/library/x86_64-unknown-linux-gnu/debug/deps/libunicode_bidi-4b3e26d03dd62274.rlib",
"--extern",
"fxhash=/NREF-COMM-CENTRAL/mozilla/objdir-tb3/toolkit/library/x86_64-unknown-linux-gnu/debug/deps/libfxhash-368fd167e6509fc3.rlib",
"--extern",
"owning_ref=/NREF-COMM-CENTRAL/mozilla/objdir-tb3/toolkit/library/x86_64-unknown-linux-gnu/debug/deps/libowning_ref-3730d01829ed25c1.rlib",
"--extern",
"unicode_segmentation=/NREF-COMM-CENTRAL/mozilla/objdir-tb3/toolkit/library/x86_64-unknown-linux-gnu/debug/deps/libunicode_segmentation-2d97389911c8b825.rlib",
...
So presumably, when the rustc compiler does something for incremental
compilation (after I updated the source a day ago), it was not handled
very well by the 0.2.2 sccache, and it may even have tried to store
unnecessary blobs in an incorrect manner, which overflowed the
partition?
(Yes, I am using a separate file system for the cache precisely for
these unfortunate overflow cases caused by an inappropriate setup or a
bug in the caching software. It has served me well over the years.)
I have NOT had the chance to test this incremental compilation feature
extensively, because I cleaned out the cache completely before finally
retrying the tests.
But presumably 0.2.8-alpha.0 should be OK, since it seems to understand
incremental compilation.
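For anyone stuck on an older sccache, a possible workaround (my
assumption, not something I tested here; the mozilla build may also set
its own flags) is to turn off rustc's incremental compilation entirely
so that the requests become cacheable:

  # cargo honors this and stops passing -C incremental=... to rustc
  export CARGO_INCREMENTAL=0
  ../mach build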
In any case, sccache 0.2.8-alpha.0 DID build TB successfully after the
cache was cleared.
Moral of the story: update your binary tools as often as possible (!)
It is a pity that sccache does not print a warning about an available
update when, say, it is invoked to print statistics, but I may be
crying for the moon.
TIA
On 2018/08/04 23:11, ISHIKAWA,chiaki wrote:
I configured my local setup NOT to use sccache.
TB built successfully.
So it seems that sccache had a problem.
I noticed that my sccache was old: 0.2.2.
I upgraded it to 0.2.8-alpha.0 from GitHub, and I am now re-invoking
the local build with sccache enabled to see if the build succeeds.
Also, I enabled logging by setting
SCCACHE_ERROR_LOG=/tmp/t.err
RUST_LOG=debug
before starting the sccache server, and I am monitoring the log output
(here /tmp/t.err) to see whether anything suspicious is occurring.
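Concretely, the sequence was roughly (a sketch of my setup):

  # stop any running server, enable logging, then restart
  sccache --stop-server
  export SCCACHE_ERROR_LOG=/tmp/t.err
  export RUST_LOG=debug
  sccache --start-server

  # watch the log while the build runs
  tail -f /tmp/t.err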
I did so with 0.2.2, but other than the obscure failure I did not get a
clear picture of what went wrong.
I will report my findings on whether the build succeeds with the
0.2.8-alpha.0 sccache, and on anything strange I may notice in the log.
TIA
On 2018/08/04 13:51, ISHIKAWA,chiaki wrote:
Hi,
I have been building TB under linux on a local PC.
Has anything changed in the way the Rust libraries are compiled in the
last 5-10 days (or since mid-April)?
I have seen the following error for the first time on my PC:
error: failed to execute compile
caused by: error reading compile response from server
caused by: Failed to read response header
caused by: failed to fill whole buffer
error: Could not compile `style`.
To learn more, run the command again with --verbose.
make[4]: *** [/NREF-COMM-CENTRAL/mozilla/config/rules.mk:1006:
force-cargo-library-build] Error 101
make[3]: *** [/NREF-COMM-CENTRAL/mozilla/config/recurse.mk:74:
toolkit/library/rust/target] Error 2
make[3]: *** Waiting for unfinished jobs....
I am not sure what the server is.
Maybe it is because I refreshed the sccache cache, so some Rust source
files got recompiled for the first time in weeks, etc. So I am not sure
whether a possible cause was introduced into the source tree in the
last few days (I refreshed my source this morning; the last time was a
few days ago.)
So the change that triggers this bug might have happened in the last
several weeks or even earlier, since I did not update the TB source
tree together with M-C for a couple of months while I was adapting a
large patch set to the latest mid-April source tree.
Anyway, I suspect some kind of timeout error is to blame.
Is there a way to make the timeout longer for this particular server?
I am not sure exactly what the server is, though.
I am on a relatively powerful Xeon PC, so CPU power may not be a big
problem (it has not been for the last few years), but I may be running
out of memory due to other activities on the said PC (browsing with FF
seems to occupy more memory now that it uses separate processes for
different tabs.)
So I suspect the excessive paging I notice sometimes may have caused an
unusual delay in the server, presumably the sccache server. Or does
rustc support some type of compilation farm on its own?
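To check whether paging is actually the culprit, something like the
following can run alongside the build (generic Linux tools, nothing
sccache-specific):

  # report memory and swap-in/out (si/so columns) every 5 seconds
  vmstat 5

  # or a one-shot summary of free memory and swap
  free -h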
Anyway, being able to extend the timeout value should fix the issue,
I hope.
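The only timeout knob I know of is SCCACHE_IDLE_TIMEOUT, which controls
how long the server stays alive while idle, not how long a request may
take, so it probably does not help here; shown only for completeness:

  # 0 = keep the sccache server alive indefinitely when idle
  export SCCACHE_IDLE_TIMEOUT=0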
I have tried to invoke the Rust compilation time and time again, but my
local script gets past the compilation of the C++ files and then stops
repeatedly at this:
error: Could not compile `style`.
Not good :-(
TIA
_______________________________________________
dev-builds mailing list
dev-builds@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-builds