https://gcc.gnu.org/bugzilla/show_bug.cgi?id=71372

--- Comment #7 from rguenther at suse dot de <rguenther at suse dot de> ---
On Wed, 1 Jun 2016, jakub at gcc dot gnu.org wrote:

> https://gcc.gnu.org/bugzilla/show_bug.cgi?id=71372
> 
> Jakub Jelinek <jakub at gcc dot gnu.org> changed:
> 
>            What    |Removed                     |Added
> ----------------------------------------------------------------------------
>                  CC|                            |jakub at gcc dot gnu.org
> 
> --- Comment #6 from Jakub Jelinek <jakub at gcc dot gnu.org> ---
> Isn't
> this generally a problem of the whole folder, not just cp_fold? I mean, 
> if you have say MEM[&MEM[p, CST1], CST2] and the outer MEM_REF has e.g. 
> TREE_THIS_VOLATILE, TREE_THIS_NOTRAP, TREE_SIDE_EFFECTS, TREE_READONLY, 
> TREE_CONSTANT set on it, and you call fold on it, then I don't see what 
> would preserve those bits (not sure if all of them are applicable).

True, but that's a bug of that specific folder then.

> I see just
>       switch (TREE_CODE_LENGTH (code))
>         {
>         case 1:
>           op0 = TREE_OPERAND (t, 0);
>           tem = fold_unary_loc (loc, code, type, op0);
>           return tem ? tem : expr;
>         case 2:
>           op0 = TREE_OPERAND (t, 0);
>           op1 = TREE_OPERAND (t, 1);
>           tem = fold_binary_loc (loc, code, type, op0, op1);
>           return tem ? tem : expr;
>         case 3:
>           op0 = TREE_OPERAND (t, 0);
>           op1 = TREE_OPERAND (t, 1);
>           op2 = TREE_OPERAND (t, 2);
>           tem = fold_ternary_loc (loc, code, type, op0, op1, op2);
>           return tem ? tem : expr;
> without really trying to preserve anything.  cp_fold has a similar
> problem; c_fully_fold* will work more often than cp_fold, because it
> always goes through build* instead of fold_build*, then copies over
> TREE_THIS_VOLATILE and various other flags, and only then calls fold on
> the whole result, so it can drop the flags only in the tem != NULL cases
> above, whereas cp_fold always calls fold_build* without copying the
> flags over.
> 
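As a purely illustrative sketch (not the actual c_fully_fold* or cp_fold
code; 't' stands for the original expression whose flags matter), the
build-first ordering described above amounts to roughly:

    /* Hypothetical sketch of the build-then-copy-then-fold ordering.  */
    tree t2 = build2_loc (loc, code, type, op0, op1);
    TREE_THIS_VOLATILE (t2) = TREE_THIS_VOLATILE (t);
    TREE_SIDE_EFFECTS (t2) = TREE_SIDE_EFFECTS (t);
    /* fold can still hand back a different tree without the flags,
       which is the remaining hole mentioned above.  */
    return fold (t2);
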
> Now, the question is whether what we should do here and in cp_fold is to
> not fold at all for TREE_THIS_VOLATILE (in cp_fold then just build and
> copy over the flags), or to fold, check whether the result of the folding
> is still e.g. tcc_reference, and just set the flag on the result if it
> has been set before.

Definitely set the flag on the result if it has been set before.  Not
folding the dereference itself would work as well, I guess.
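
A minimal sketch of that approach, assuming a hypothetical helper around
fold_build2_loc (the helper name and its exact placement in cp_fold are
made up for illustration, this is not the actual patch):

    /* Hypothetical helper: fold, then transfer the volatile bit (and the
       side-effects bit that goes with it) back to the result if the
       original reference had it and the result is still a reference.  */
    static tree
    fold_keep_volatile (location_t loc, enum tree_code code, tree type,
                        tree orig, tree op0, tree op1)
    {
      tree res = fold_build2_loc (loc, code, type, op0, op1);
      if (TREE_THIS_VOLATILE (orig)
          && TREE_CODE_CLASS (TREE_CODE (res)) == tcc_reference)
        {
          TREE_THIS_VOLATILE (res) = 1;
          TREE_SIDE_EFFECTS (res) = 1;
        }
      return res;
    }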

Note the following folding:

    case MEM_REF:
      /* MEM[&MEM[p, CST1], CST2] -> MEM[p, CST1 + CST2].  */
      if (TREE_CODE (arg0) == ADDR_EXPR
          && TREE_CODE (TREE_OPERAND (arg0, 0)) == MEM_REF)
        {
          tree iref = TREE_OPERAND (arg0, 0);
          return fold_build2 (MEM_REF, type,
                              TREE_OPERAND (iref, 0),
                              int_const_binop (PLUS_EXPR, arg1,
                                               TREE_OPERAND (iref, 1)));
        }

This is one of the foldings that miss the volatile flag; it is a
canonicalization, turning invalid MEM_REFs into valid ones.  I'll fix
those foldings up.  They also fail to copy TREE_THIS_NOTRAP, etc. - see
tree-ssa-forwprop.c, which does this correctly, for example.
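
A rough sketch of that style of fix for the case quoted above (not the
committed change; the flag copying mimics what tree-ssa-forwprop.c does
when it rewrites MEM_REFs) would be:

      /* Sketch only: rebuild MEM[&MEM[p, CST1], CST2] as
         MEM[p, CST1 + CST2] while carrying over the flags of the
         inner MEM_REF instead of dropping them via fold_build2.  */
      if (TREE_CODE (arg0) == ADDR_EXPR
          && TREE_CODE (TREE_OPERAND (arg0, 0)) == MEM_REF)
        {
          tree iref = TREE_OPERAND (arg0, 0);
          tree off = int_const_binop (PLUS_EXPR, arg1,
                                      TREE_OPERAND (iref, 1));
          tree res = build2 (MEM_REF, type, TREE_OPERAND (iref, 0), off);
          TREE_THIS_VOLATILE (res) = TREE_THIS_VOLATILE (iref);
          TREE_THIS_NOTRAP (res) = TREE_THIS_NOTRAP (iref);
          TREE_SIDE_EFFECTS (res) = TREE_SIDE_EFFECTS (iref);
          return res;
        }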
