Except that host tools (fastjar mostly) are made with the *new* GCC
rather than the old one.
And the reason is what? I don't see any theoretical merit in the
whole staging thing:
1. Bugs can theoretically cancel themselves out.
2. The compiler doesn't stress itself any more than the target
library code does, and that gets built anyway.
3. Why on earth should I insist on working with a self-compiled gcc
when the native compiler generates better and faster code?
4. Release builds for platforms are self-hosted anyway, so what?
5. In the case of cross-compilation the whole reasoning behind it
falls apart like a house of cards.
But there is one thing I see for certain: it takes insane amounts of
time, since the compiler usually
builds itself with the EXTRA SLOW version of itself, the one
containing runtime assertions. And thus:
6. Longer build cycles (especially when they overlap with the sleep
cycle) mean less productive time, and
thus likely *less* actual functional testing of the resulting
compiler when focusing on one particular
property of it - which I assume is how 99% of the people here
work.
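For illustration, here is roughly what skipping the staged build looks like
(a sketch only; the object-directory layout and `-j` count are my
assumptions, but `--disable-bootstrap` and `--enable-checking=release` are
the configure switches relevant to the two complaints above):

```shell
# Build GCC once with the installed system compiler, instead of
# letting it rebuild itself through three stages.
mkdir objdir && cd objdir
../gcc/configure --disable-bootstrap \
                 --enable-checking=release   # drop the slow runtime assertions
make -j4   # one pass: compiler plus target libraries
```

With that, the compiler is compiled exactly once, by whatever native
compiler you already trust.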
Testing the resulting compiler already has a name:
make test
Everything else is just adding to confusion.
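(In a GCC tree that name is actually spelled via the DejaGnu harness; a
sketch, with `dg.exp` standing in for whichever test file you care about:)

```shell
# Run the testsuite against the freshly built compiler; -k keeps
# going past individual failures so you see the whole picture.
make -k check

# Or restrict the run to the part you are focusing on.
make check-gcc RUNTESTFLAGS="dg.exp"
```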