https://gcc.gnu.org/bugzilla/show_bug.cgi?id=85315

Jakub Jelinek <jakub at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |jakub at gcc dot gnu.org

--- Comment #13 from Jakub Jelinek <jakub at gcc dot gnu.org> ---
(In reply to Andrew Macleod from comment #12)
> Maybe I'm a little dense.
> 
> if we are presuming that  
>   &x + (a + b) 
> implies a + b == 0, then we also should assume that

&x + (a + b) for scalar x doesn't imply a + b == 0, it implies a + b <= 1: C allows forming the one-past-the-end pointer &x + 1, it just can't be dereferenced. Only when the pointer is dereferenced, i.e. when (&x)[a + b] is accessed, does a + b have to be 0.
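
As an illustration (a minimal sketch of the rule above, not code from the report; the variable values are made up):

#include <stdio.h>

int
main (void)
{
  int x = 42;
  int a = 0, b = 1;

  int *p = &x + (a + b);  /* OK: a + b == 1, so p is the one-past-the-end
                             pointer; it may be formed and compared, but
                             dereferencing it (*p) would be undefined.  */

  int c = 0, d = 0;
  printf ("%d\n", (&x)[c + d]);  /* OK: c + d == 0, so this reads x, the
                                    only dereferenceable offset.  */

  (void) p;
  return 0;
}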
