Sent from my iPhone
On Feb 16, 2010, at 9:37 AM, "0xe2 dot 0x9a dot 0x9b at gmail dot com" <gcc-bugzi...@gcc.gnu.org> wrote:
------- Comment #5 from 0xe2 dot 0x9a dot 0x9b at gmail dot com  2010-02-16 17:37 -------
(In reply to comment #4)
> There is nothing to fix.  Your program triggers undefined behavior.
> It can do anything, which can include something you'd expect, or
> something completely different, and it can depend on compiler
> options, position of stars, etc.
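
To make that concrete, here is a small example (illustrative only, not
the test-case from this report) whose observable result depends on the
optimization level.  With GCC on x86 it typically prints 0 at -O0,
where the addition simply wraps, and 1 at -O2, where the comparison is
folded on the assumption that signed overflow cannot happen.  Run it
with the argument 2147483647:

    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        if (argc < 2)
            return 1;
        int x = atoi(argv[1]);      /* pass 2147483647 (INT_MAX) */
        /* Signed overflow is undefined (C99 6.5p5), so GCC is free
           to fold "x + 1 > x" to 1 at -O2, while at -O0 the
           addition wraps and the comparison yields 0.  */
        printf("%d\n", x + 1 > x);
        return 0;
    }
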
I understand what you are saying, but I do not agree with that.  In my
opinion, an *optimization* option should never result in any change of
a program's behavior for this particular kind of undefined behavior.
I mean, there are basically two different kinds of undefined behavior:
1. Where the compiler has to choose a *particular* implementation.
Huh, this is the opposite of the effect of undefined behavior.  In
fact, for signed integer overflow, GCC sometimes optimizes it as
wrapping and other times as clamping.  In this case it is clamping.
It is sometimes hard to optimize undefined behavior consistently,
because inlining and other optimizations can change the IR before the
undefined behavior is optimized.
2. Where the compiler does not choose anything or cannot choose
anything particular.  (For example, what happens when accessing
deallocated memory.)
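
A minimal sketch of this second kind, for contrast (illustrative
only):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        int *p = malloc(sizeof *p);
        if (p == NULL)
            return 1;
        *p = 42;
        free(p);
        /* Use after free: no particular behavior is chosen or even
           choosable here; the read may yield 42, garbage, or crash,
           depending on the allocator and what ran in between.  */
        printf("%d\n", *p);
        return 0;
    }
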
The conversion test-case is of the 1st kind, not of the 2nd kind.
GCC at -O0 chooses to generate a particular sequence of instructions
to implement the undefined behavior.  GCC at -O2 does not respect the
choice made at -O0 (or vice versa).
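
Concretely, the divergence looks something like this (a guess at the
shape of the test-case, not the exact code from the report):

    #include <stdio.h>

    int main(void)
    {
        double d = 1e30;    /* far outside the range of int */
        /* Out-of-range floating-point-to-integer conversion is
           undefined (C99 6.3.1.4p1).  Performed at run time on
           x86-64, cvttsd2si returns the "integer indefinite" value
           0x80000000 (INT_MIN); constant-folded by GCC at -O2, the
           result instead saturates to INT_MAX (the clamping
           mentioned above).  */
        int i = (int)d;
        printf("%d\n", i);
        return 0;
    }

On x86-64 this typically prints -2147483648 when built with gcc -O0
and 2147483647 when built with gcc -O2: the same operation in the
source, two different operations in the output.
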
So, my question is: if it is possible for the problematic code to be
implemented in all contexts by the same operation, and in this case it
indeed is possible, why is GCC using two different operations?  How do
you justify that?
--
0xe2 dot 0x9a dot 0x9b at gmail dot com changed:
           What    |Removed                     |Added
----------------------------------------------------------------------
             Status|RESOLVED                    |UNCONFIRMED
         Resolution|INVALID                     |
http://gcc.gnu.org/bugzilla/show_bug.cgi?id=43089