https://gcc.gnu.org/bugzilla/show_bug.cgi?id=90248

--- Comment #12 from rguenther at suse dot de <rguenther at suse dot de> ---
On Tue, 14 May 2019, pinskia at gcc dot gnu.org wrote:

> https://gcc.gnu.org/bugzilla/show_bug.cgi?id=90248
> 
> --- Comment #10 from Andrew Pinski <pinskia at gcc dot gnu.org> ---
> So I am trying to understand the semantics here.
> Does HONOR_SIGNED_ZEROS say that -0.0 won't exist, or that the sign of
> -0.0 and 0.0 doesn't matter?  And what are the semantics if -0.0 shows up?
> 
> If we treat -0.0 as 0.0, then all of the optimizations here should be removed.
> If we treat -0.0 showing up as undefined behavior, then only two of the
> lines need to be removed.
> 
> Which are the correct semantics for HONOR_SIGNED_ZEROS?

The sign of a zero _result_ is undetermined.  It says nothing about the
signs of zero _input_ values.  That's in contrast to HONOR_NANS/INFS
(and -ffinite-math-only), which assume that NaNs and infinities do not
appear as operands or results at all.
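
To make the distinction concrete, a minimal sketch (my own illustration,
not part of the report; the function name add_zero is hypothetical): with
-fno-signed-zeros GCC is permitted to simplify x + 0.0 to x, which can
change the sign of a zero result when x is -0.0, while -0.0 remains a
perfectly valid input value.

/* Illustrative sketch only -- compile e.g. with:
 *   gcc -O2 -fno-signed-zeros demo.c
 * Under !HONOR_SIGNED_ZEROS the compiler may fold x + 0.0 into x.
 * For x == -0.0 strict IEEE arithmetic yields +0.0, the folded form
 * yields -0.0: the sign of the zero *result* is not guaranteed.  */
#include <stdio.h>

double add_zero(double x)
{
    return x + 0.0;   /* may or may not be folded to just x */
}

int main(void)
{
    double neg_zero = -0.0;              /* -0.0 as an *input* stays valid */
    printf("%g\n", add_zero(neg_zero));  /* prints 0 or -0 depending on the fold */
    return 0;
}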
