On Wed, Nov 21, 2012 at 9:38 PM, Jakub Jelinek wrote:
> Hi!
>
> If a type has 2 * HWI precision, sizem1 is maximum double_int (all ones)
> and thus size = sizem1 + double_int_one overflows into 0. If either min0
> or min1 is also zero, we might wrongly canonicalize the range into a signed
> one.  Fixed thusly, bootstrapped/regtested on x86_64-linux and i686-linux.
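
To make the overflow concrete, here is a minimal, hypothetical C sketch (not
the patch itself), assuming a 64-bit host where unsigned __int128 has
2 * HOST_WIDE_INT precision.  The names sizem1, size and min0 mirror the
description above, but the comparison at the end is only illustrative, not
the actual condition in tree-vrp.c:

#include <stdio.h>

int
main (void)
{
  /* With a type twice as wide as HOST_WIDE_INT (e.g. unsigned __int128
     on a 64-bit host), the maximum value is all ones, so adding one to
     it wraps the computed type size around to zero.  */
  unsigned __int128 sizem1 = ~(unsigned __int128) 0;  /* all ones */
  unsigned __int128 size = sizem1 + 1;                /* wraps to 0 */

  /* If min0 (or min1) is also zero, a comparison against the wrapped
     size succeeds spuriously, which is how the range can end up
     canonicalized as if it were signed.  */
  unsigned __int128 min0 = 0;

  printf ("size == 0: %d\n", (int) (size == 0));        /* prints 1 */
  printf ("min0 == size: %d\n", (int) (min0 == size));  /* prints 1 */
  return 0;
}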