J Decker <d3c...@gmail.com> writes:

> Can the severity of signed/unsigned comparisons be raised, since GCC
> does not properly handle the comparisons.

GCC properly handles the comparisons according to the rules laid down in
the C/C++ language standards.


> int main()
>
> {
>       int s = -2;
>       unsigned int u = 0xFFFFFFFDU;
>
>       if( s < u )
>               printf( "okay\n" );
>       else
>               printf( "incorrect handling\n" ); // gets here

The language standards say that s is converted from int to unsigned int
by the usual arithmetic conversions (C99 6.3.1.8).  The conversion from
int to unsigned int produces 0xfffffffe (assuming int is 32 bits) (C99
6.3.1.3).  0xfffffffe < 0xfffffffd is false.


>       {
>               int i = -2;
>               unsigned u = 2;
>               if (i < u) {
>                       // Does GCC get here? no, it doesn't
>                       printf( "Ya, of course\n" );
>               }
>               else
>                       printf( "gcc puked\n" ); // gets here
>       }

Here the comparison is between 0xfffffffe and 2.


>    {
>
>       unsigned int i = -3;
>        if( i < 0 )
>                printf( "this can't happen, it's unsigned.\n" );
>        else
>               printf( "this is actually a correct result here\n" ); // does get this
>       }

An unsigned integer is always >= 0, by definition.


>    {
>
>     int i = -3;
> // visual studio does not warn on this one either... just the first two comparisons
>     if( i < (unsigned)0 )
>                printf( "-3 is < 0 \n" );
>        else
>               printf( "-3 is more than 0?\n" ); // gets here
>       }
>
>       return 0;
>
> }

Same here.


> I don't know why standards left this open, other than there isn't a
> single-instruction translation from code to CPU for the comparison;

The standards did not leave this open.  They define precisely what is
supposed to happen.


> But if it's not fixed, this warning should definatly be issued at
> default warning level.  This should be more like 'if this comparison
> can be wrong, it will be wrong'.

There is no problem comparing values of signed type with values of
unsigned type if the signed values are known to be nonnegative.  Of
course it is sometimes hard to know that; hence the warning.  But
enabling the warning by default does not make sense.

Ian
