On Mon, Nov 03, 2014 at 09:47:05PM +0100, Marek Polacek wrote:
> @@ -1456,6 +1457,8 @@ instrument_object_size (gimple_stmt_iterator *gsi, bool is_lhs)
>          }
>        break;
>      case ARRAY_REF:
> +      index = TREE_OPERAND (t, 1);
> +      break;
>      case INDIRECT_REF:
>      case MEM_REF:
>      case VAR_DECL:
In theory, perhaps you could check the offset from get_inner_reference
and look for an SSA_NAME set by a BIT_AND_EXPR there instead, to handle
cases like
struct S { int a, b; } a[1024];
...
int j = a[i & 1023].b;
etc.  You already have the size of the base computed; if the constant
part of the offset and the bit offset together are smaller than that
size, and the variable part is an SSA_NAME set by a BIT_AND_EXPR with a
non-negative constant smaller than size - (constant part of offset
+ (bitoffset + BITS_PER_UNIT - 1) / BITS_PER_UNIT + access size), the
check can be optimized away.  But perhaps that can wait for later.
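A completely untested sketch of that idea; t is the original access
expression and sizet the size computed for its base, as in
instrument_object_size, and for simplicity it only handles the case
where the variable part of the offset is directly an SSA_NAME (i.e.
element size 1) - the general case would also have to look through the
scaling by the element size:

  HOST_WIDE_INT bitsize, bitpos;
  tree off;
  enum machine_mode mode;
  int unsignedp = 0, volatilep = 0;
  get_inner_reference (t, &bitsize, &bitpos, &off, &mode,
                       &unsignedp, &volatilep, false);
  /* Bytes covered by the constant part of the access: constant bit
     offset plus access size, rounded up to whole bytes.  */
  HOST_WIDE_INT const_bytes
    = (bitpos + bitsize + BITS_PER_UNIT - 1) / BITS_PER_UNIT;
  if (off
      && bitpos >= 0
      && TREE_CODE (off) == SSA_NAME
      && TREE_CODE (sizet) == INTEGER_CST
      && tree_fits_uhwi_p (sizet)
      && (unsigned HOST_WIDE_INT) const_bytes <= tree_to_uhwi (sizet))
    {
      gimple def = SSA_NAME_DEF_STMT (off);
      /* tree_fits_uhwi_p on the mask also guarantees it is
         non-negative, and x & mask is then at most mask, so the
         access cannot reach past the end of the object.  */
      if (is_gimple_assign (def)
          && gimple_assign_rhs_code (def) == BIT_AND_EXPR
          && tree_fits_uhwi_p (gimple_assign_rhs2 (def))
          && (tree_to_uhwi (gimple_assign_rhs2 (def))
              < tree_to_uhwi (sizet)
                - (unsigned HOST_WIDE_INT) const_bytes))
        /* Provably in bounds; drop the check.  */
        return;
    }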
> @@ -1537,6 +1540,24 @@ instrument_object_size (gimple_stmt_iterator *gsi, bool is_lhs)
>        && tree_int_cst_le (t, sizet))
>      return;
>
> +  if (index != NULL_TREE
> +      && TREE_CODE (index) == SSA_NAME
> +      && TREE_CODE (sizet) == INTEGER_CST)
> +    {
> +      gimple def = SSA_NAME_DEF_STMT (index);
> +      if (is_gimple_assign (def)
> +          && gimple_assign_rhs_code (def) == BIT_AND_EXPR
> +          && TREE_CODE (gimple_assign_rhs2 (def)) == INTEGER_CST)
> +        {
> +          tree cst = gimple_assign_rhs2 (def);
> +          tree sz = fold_build2 (EXACT_DIV_EXPR, sizetype, sizet,
> +                                 TYPE_SIZE_UNIT (type));
> +          if (tree_int_cst_sgn (cst) >= 0
> +              && tree_int_cst_lt (cst, sz))
> +            return;
> +        }
> +    }
> +
>    /* Nope.  Emit the check.  */
>    t = force_gimple_operand_gsi (gsi, t, true, NULL_TREE, true,
>                                  GSI_SAME_STMT);
>
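For reference, the kind of access the patch now avoids instrumenting,
if I'm reading it right (compiled with -fsanitize=object-size):

int a[1024];

int
foo (int i)
{
  /* index is the SSA_NAME holding i & 1023, sizet is 4096 and
     sz = 4096 / 4 = 1024; the mask 1023 is non-negative and smaller
     than that, so no UBSAN_OBJECT_SIZE check is emitted here.  */
  return a[i & 1023];
}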
Ok, thanks.
Jakub