https://gcc.gnu.org/bugzilla/show_bug.cgi?id=95582
--- Comment #7 from Richard Biener <rguenth at gcc dot gnu.org> ---
So Ada does
/* In Ada, we use an unsigned 8-bit type for the default boolean type. */
boolean_type_node = make_unsigned_type (8);
TREE_SET_CODE (boolean_type_node, BOOLEAN_TYPE);
but somehow in lto1 (or via pulling C code into the LTRANS function?)
the boolean_type_node is the 1-bit one. So that's probably what introduces
the mismatch (and eventually the missed optimizations with -flto and Ada).
We're correctly exempting the boolean type from special streaming:
/* Skip boolean type and constants, they are frontend dependent. */
if (i != TI_BOOLEAN_TYPE
&& i != TI_BOOLEAN_FALSE
&& i != TI_BOOLEAN_TRUE
but of course all expressions with boolean type generated by the middle-end
(at lto1 time) then use lto1's boolean_type_node, which matches that of
the C frontend.
So a testcase will need both Ada and LTO to trigger the issue.