http://gcc.gnu.org/bugzilla/show_bug.cgi?id=59778

--- Comment #3 from joseph at codesourcery dot com <joseph at codesourcery dot com> ---
On Mon, 14 Apr 2014, danglin at gcc dot gnu.org wrote:

> I tend to think the failure of this test on hppa is mainly a
> testsuite issue.  Specifically, the test performs operations that may
> raise exceptions, and the results of these operations are tested:

The test expects IEEE semantics for arithmetic results and exceptions (in 
round-to-nearest).

> The PA-RISC 2.0 Architecture states that the destination is undefined
> except in the case when it is one of the source registers for invalid
> and divide by zero.  It also appears, based on my testing, that the
> results are not what the test expects for overflow, underflow and
> inexact operations.

An undefined result hardly makes sense for "inexact", at least, even if 
the processor isn't claiming IEEE floating point....

First, the TARGET_ATOMIC_ASSIGN_EXPAND_FENV hook needs implementing; this 
test is always expected to fail on platforms that do not define that hook.  
Without it, unexpected results can arise simply because the compound 
assignment is not properly atomic with regard to floating-point exceptions.

Then, if failures remain with that hook defined: if there is an option such 
as -mieee that enables IEEE semantics (e.g. by enabling traps to the kernel, 
with kernel emulation for the problem cases), it would be appropriate to use 
it via dg-additional-options; otherwise, the fenv_exceptions 
effective-target may need adjusting to exclude the problem targets.  But the 
first step is defining the hook.
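If such an option exists, the testsuite change would be a directive along 
these lines (the target triplet and option here are illustrative, not a 
proposed patch):

```c
/* { dg-do run } */
/* { dg-require-effective-target fenv_exceptions } */
/* { dg-additional-options "-mieee" { target hppa*-*-* } } */
```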
