https://gcc.gnu.org/bugzilla/show_bug.cgi?id=94899
Andrew Pinski changed:
What|Removed |Added
CC||davidfromonline at gmail dot com
Andrew Pinski changed:
What|Removed |Added
Target Milestone|--- |13.0
Resolution|---
--- Comment #7 from Gabriel Ravier ---
I don't know if I've missed something obvious, but this still appears to be fixed.
--- Comment #6 from Gabriel Ravier ---
Can confirm that this appears to be fixed.
--- Comment #5 from CVS Commits ---
The master branch has been updated by Jakub Jelinek:
https://gcc.gnu.org/g:ab981aab92cbc71918fbaadcf6fa64bdb2b69be7
commit r13-1187-gab981aab92cbc71918fbaadcf6fa64bdb2b69be7
Author: Arjun Shankar
Date: Tu
rsandifo at gcc dot gnu.org changed:
What|Removed |Added
Keywords||easyhack
--- Comment #3 from Andrew Pinski ---
If I use (int)(0x8000) instead, I get the optimization, which means GCC is correct.
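A minimal sketch of the distinction comment #3 seems to draw (the constant quoted above is truncated, so INT_MIN is used symbolically here; both functions are invented illustrations, not the testcase from this PR): with a signed addend, overflow is undefined behaviour, which gives GCC extra latitude to fold, whereas the unsigned addend wraps and has to be cancelled by an explicit simplification rule.

#include <limits.h>
#include <stdbool.h>

/* Signed addend: x + INT_MIN overflows whenever x is negative, and
   signed overflow is undefined, so the compiler may reason as if it
   never happens and fold the comparison.  */
bool signed_addend (int x, int y)
{
  return x + INT_MIN == y + INT_MIN;
}

/* Unsigned addend: the addition wraps modulo 2^32 and is always well
   defined; dropping it from both sides needs a cancellation rule of
   its own, which is roughly the missed optimization discussed here.  */
bool unsigned_addend (unsigned x, unsigned y)
{
  return x + 0x80000000u == y + 0x80000000u;
}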
Andrew Pinski changed:
What|Removed |Added
Status|UNCONFIRMED |RESOLVED
Resolution|---
--- Comment #1 from Andrew Pinski ---
The problem is only with INT_MIN.
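As a small, self-contained illustration of why INT_MIN is the odd one out (this program is not from the report and assumes 32-bit int; it only demonstrates the arithmetic): INT_MIN is the one int whose negation is not representable, and modulo 2^32 it is its own negation, which is presumably why simplifications that negate the constant or move it to the other side of the comparison have to treat it specially.

#include <limits.h>
#include <stdio.h>

int main (void)
{
  /* INT_MIN cannot be negated in int: -(-2147483648) would be
     2147483648, which does not fit.  Modulo 2^32 it maps back to
     itself, i.e. 0x80000000 is its own two's complement.  */
  printf ("INT_MIN as unsigned: 0x%x\n", (unsigned) INT_MIN);
  printf ("negated modulo 2^32: 0x%x\n", 0u - (unsigned) INT_MIN);
  return 0;
}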