http://gcc.gnu.org/bugzilla/show_bug.cgi?id=47538
Jakub Jelinek <jakub at gcc dot gnu.org> changed:
           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |jakub at gcc dot gnu.org
          Component|c                           |tree-optimization
--- Comment #8 from Jakub Jelinek <jakub at gcc dot gnu.org> 2011-01-31 09:30:30 UTC ---
Ugh. The problem seems to be that the middle-end considers sizetype and size_t
to be compatible types, so we have x = y + 1; and similar stmts where x and 1
have sizetype, but y has size_t. tree-ssa-ccp.c (and similarly other places,
e.g. in tree.c) ignores TYPE_UNSIGNED on sizetype, assuming it is 0, so my fix
in tree-ssa-ccp.c results in a different value of uns on such stmts.
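For context, a standalone illustration (plain C, not GCC internals) of why an
inconsistent uns is dangerous: the same bit pattern widens to different values
depending on whether the operand is treated as signed or unsigned, so deriving
the signedness from one operand's type and applying it to another that is
merely "compatible" can change what gets propagated.

#include <stdint.h>
#include <stdio.h>

/* Illustration only, not GCC code: the 32-bit pattern 0xffffffff
   widens to -1 when treated as signed (sign-extension) but to
   4294967295 when treated as unsigned (zero-extension).  An
   inconsistent signedness flag across operands of one stmt can
   introduce exactly this kind of discrepancy during propagation.  */
int main (void)
{
  int32_t pattern = -1;                       /* bit pattern 0xffffffff */

  int64_t  as_signed   = (int64_t) pattern;   /* sign-extended: -1 */
  uint64_t as_unsigned = (uint32_t) pattern;  /* zero-extended: 4294967295 */

  printf ("treated as signed:   %lld\n", (long long) as_signed);
  printf ("treated as unsigned: %llu\n", (unsigned long long) as_unsigned);
  return 0;
}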
--- gcc/tree-ssa.c.jj 2011-01-15 11:26:42.000000000 +0100
+++ gcc/tree-ssa.c 2011-01-31 10:16:30.319433380 +0100
@@ -1276,6 +1276,12 @@ useless_type_conversion_p (tree outer_ty
|| TYPE_PRECISION (inner_type) != TYPE_PRECISION (outer_type))
return false;
+ if ((TREE_CODE (inner_type) == INTEGER_TYPE
+ && TYPE_IS_SIZETYPE (inner_type))
+ != (TREE_CODE (outer_type) == INTEGER_TYPE
+ && TYPE_IS_SIZETYPE (outer_type)))
+ return false;
+
/* We don't need to preserve changes in the types minimum or
maximum value in general as these do not generate code
unless the types precisions are different. */
seems to fix the miscompilation, but I'm afraid it could have a pretty
substantial effect on IL size, since conversions between sizetype and other
same-precision integer types would no longer be removed as useless. Why do we
need something as ill-designed as sizetype? Just for Ada?