https://gcc.gnu.org/bugzilla/show_bug.cgi?id=115829

            Bug ID: 115829
           Summary: ((x & 1345) ^ 2048) - 2048 is not optimized to just (x
                    & 1345)
           Product: gcc
           Version: 15.0
            Status: UNCONFIRMED
          Keywords: missed-optimization
          Severity: enhancement
          Priority: P3
         Component: tree-optimization
          Assignee: unassigned at gcc dot gnu.org
          Reporter: pinskia at gcc dot gnu.org
  Target Milestone: ---

Take:
```
unsigned f0(unsigned x)
{
 return ((x & 1345u) ^ 2048u) - 2048u;
}
unsigned f1(unsigned x)
{
 return ((x & 1345u) | 2048u) - 2048u;
}
unsigned f2(unsigned x)
{
 return ((x & 1345u) + 2048u) - 2048u;
}
```

All three of these should be optimized to just `(x & 1345u)`, but f0 currently is not.

This was forwarded from
https://hachyderm.io/@cfbolz@mastodon.social/112723957472456010.

In the first case, rewriting the xor as an ior might fix it; the two are equivalent here because `x & 1345u` and `2048u` have no bits in common. There might already be another bug about handling that ...
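
A quick brute-force check of the claimed equivalence (just a standalone test harness, not GCC code): since `1345u & 2048u == 0`, the masked value never has bit 11 set, so xor, ior and plus with `2048u` all merely set that bit and the trailing `- 2048u` clears it again.

```
#include <assert.h>
#include <stdio.h>

static unsigned f0(unsigned x) { return ((x & 1345u) ^ 2048u) - 2048u; }
static unsigned f1(unsigned x) { return ((x & 1345u) | 2048u) - 2048u; }
static unsigned f2(unsigned x) { return ((x & 1345u) + 2048u) - 2048u; }

int main(void)
{
    /* 1345u only uses bits 0..10, so values 0..2047 cover every possible
       result of x & 1345u; higher bits of x are masked away anyway.  */
    for (unsigned x = 0; x < 2048u; x++) {
        unsigned expect = x & 1345u;
        assert(f0(x) == expect);  /* xor sets bit 11, - 2048u clears it */
        assert(f1(x) == expect);  /* ior likewise, since bit 11 is clear */
        assert(f2(x) == expect);  /* plus cannot carry out of bit 11 */
    }
    puts("all three forms equal x & 1345u");
    return 0;
}
```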
