https://gcc.gnu.org/bugzilla/show_bug.cgi?id=101590
--- Comment #8 from Andrew Pinski <pinskia at gcc dot gnu.org> ---
So the test in the end should be:

  INTEGRAL_TYPE_P (TREE_TYPE (@0)) && tree_expr_nonnegative_p (@0)

as it does not matter whether @1 has the sign bit set or not: nonnegative & XYZ is still nonnegative, and in this case we just need to know that `a < b` does not depend on a sign change happening.
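
To illustrate why the sign bit of @1 is irrelevant, here is a minimal sketch in plain C (not the actual match.pd change; the variables a, b, m are made up for this example). ANDing with a value whose sign bit is clear can never produce a result with the sign bit set, so the signed and unsigned orderings of the result against a agree:

  #include <assert.h>
  #include <limits.h>

  int main (void)
  {
    int a = 42;           /* nonnegative: sign bit is clear */
    int b = INT_MIN | 7;  /* sign bit set; its value does not matter */
    int m = a & b;        /* AND cannot set a bit that is clear in a */
    assert (m >= 0);      /* so the result stays nonnegative */
    /* both operands nonnegative => signed and unsigned compares agree */
    assert ((m < a) == ((unsigned) m < (unsigned) a));
    return 0;
  }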