http://gcc.gnu.org/bugzilla/show_bug.cgi?id=52119
--- Comment #11 from Jason Merrill <jason at gcc dot gnu.org> ---
(In reply to ajf from comment #10)
> My apologies if I am missing something obvious, but what exactly makes
> shifting a 1 into the sign bit undefined behavior?

You're right, it isn't. 5.8/2:

... if E1 has a signed type and non-negative value, and E1 × 2^E2 is
representable in the corresponding unsigned type of the result type, then
that value, converted to the result type, is the resulting value ....
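
A small sketch (not part of the bug report, assuming a 32-bit int as on
typical GCC targets) of why the shift is not undefined under that wording:
1 × 2^31 is representable in unsigned int, so 1 << 31 has a defined value;
converting that value back to int is implementation-defined before C++20,
and GCC defines it as the wrapped (modular) result, i.e. INT_MIN.

  #include <climits>
  #include <iostream>

  int main() {
      static_assert(sizeof(int) * CHAR_BIT == 32, "sketch assumes 32-bit int");

      int e = 31;            // runtime shift amount, to sidestep
                             // constant-folding diagnostics
      int x = 1 << e;        // defined by 5.8/2: 2^31 fits in unsigned int;
                             // converting it to int is implementation-defined
                             // (GCC yields INT_MIN)
      unsigned u = 1u << e;  // the unsigned form is unambiguous: 0x80000000

      std::cout << x << ' ' << u << '\n';  // -2147483648 2147483648 on GCC
      return 0;
  }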