https://gcc.gnu.org/bugzilla/show_bug.cgi?id=120704

--- Comment #4 from congli <congli at smail dot nju.edu.cn> ---
Per the C standard, "The behavior is defined only if both the original pointer
and the result pointer are pointing at elements of the same array or one past
the end of that array". So yes, "The only defined value for l is 0 or 1" is
correct: c is a single object rather than an array, and a pointer to a
non-array object behaves like a pointer to the first element of a one-element
array, so &c + l is defined only for l == 0 and l == 1.

So does GCC actually make UB-based optimizations on the assumption that
"&c + l" is definitely undefined after seeing that l is neither 0 nor 1?

However, when you change "k + l + 2147634" to "k + (l + 2147634)" in j(), the
issue is still there. This time, (l + 2147634) is definitely 0.
