https://gcc.gnu.org/bugzilla/show_bug.cgi?id=90248

--- Comment #10 from Andrew Pinski <pinskia at gcc dot gnu.org> ---
So I am trying to understand the semantics here.
Does HONOR_SIGNED_ZEROS mean that -0.0 won't exist, or that the signs of -0.0
and 0.0 don't matter? And what are the semantics if -0.0 does show up?

If we treat -0.0 as 0.0, then all of the optimizations here should be removed.
If we treat -0.0 as undefined behavior if it shows up, then only two of the
lines need to be removed.

Which is the correct semantics for HONOR_SIGNED_ZEROS?