http://gcc.gnu.org/bugzilla/show_bug.cgi?id=59128

ktkachov at gcc dot gnu.org changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
             Status|UNCONFIRMED                 |RESOLVED
                 CC|                            |ktkachov at gcc dot gnu.org
         Resolution|---                         |INVALID

--- Comment #1 from ktkachov at gcc dot gnu.org ---
That's expected behaviour.
ALPHA2 expands to 10.*10.

f47/ALPHA2 is then 100.0 / 10.0 * 10.0

The * and / operators have the same precedence and associate left to right, so this is evaluated as
(100.0 / 10.0) * 10.0 = 100.0

That's why it's usually good practice to put parentheses in your #defines:

#define ALPHA2 ((ALPHA) * (ALPHA))
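For illustration, here is a minimal self-contained sketch of the difference (ALPHA and f47 are names from the report; the value 100.0 for f47 is assumed from the numbers above):

#include <stdio.h>

#define ALPHA 10.
#define ALPHA2_BAD   ALPHA*ALPHA            /* expands with no grouping: 10.*10. */
#define ALPHA2_FIXED ((ALPHA) * (ALPHA))    /* parentheses force the intended grouping */

int main(void)
{
    double f47 = 100.0;    /* assumed value, consistent with the numbers above */

    /* f47/ALPHA2_BAD expands to 100.0/10.*10., parsed as (100.0/10.)*10. */
    printf("%f\n", f47/ALPHA2_BAD);      /* prints 100.000000 */

    /* f47/ALPHA2_FIXED expands to 100.0/((10.)*(10.)) */
    printf("%f\n", f47/ALPHA2_FIXED);    /* prints 1.000000 */

    return 0;
}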

--- Comment #2 from Marc Glisse <glisse at gcc dot gnu.org> ---
> #define ALPHA = 10.

No = there.

> #define ALPHA2 ALPHA*ALPHA

You forgot parentheses.

This has nothing to do with gcc. Look at the output of gcc ZED3.c -E and try to
understand why your code is wrong.
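For illustration, with the (corrected) defines

#define ALPHA 10.
#define ALPHA2 ALPHA*ALPHA

a hypothetical line such as

double r = f47/ALPHA2;

comes out of gcc ZED3.c -E as

double r = f47/10.*10.;

which the compiler parses as (f47/10.)*10. (The line itself is a reconstruction for illustration; the reporter's ZED3.c is not shown in this report.)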
