https://gcc.gnu.org/bugzilla/show_bug.cgi?id=103881

Eric Gallager <egallager at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
           See Also|                            |https://gcc.gnu.org/bugzilla/show_bug.cgi?id=40752,
                   |                            |https://gcc.gnu.org/bugzilla/show_bug.cgi?id=12411
                 CC|                            |egallager at gcc dot gnu.org

--- Comment #5 from Eric Gallager <egallager at gcc dot gnu.org> ---
(In reply to thomas from comment #3)
> Interesting.
> 
> So the difference between "x |= a & a" and "x |= f() & f()" is that the
> latter has passed a somewhat arbitrary level of complexity after which GCC
> is no longer able to prove that it's safe, and therefore warns that it
> could potentially lose precision?
> 
> It's understandable, but unfortunate. It means that I have no hope of having
> real world programs be free of false positives for conversion warnings.

The latter looks like something that ought to get a -Wsequence-point warning
anyways, at least per bug 12411... but then again that one was closed as
WONTFIX, so never mind...
