https://gcc.gnu.org/bugzilla/show_bug.cgi?id=98556
--- Comment #8 from CVS Commits <cvs-commit at gcc dot gnu.org> ---
The releases/gcc-10 branch has been updated by Jakub Jelinek <ja...@gcc.gnu.org>:

https://gcc.gnu.org/g:0188eab844eacda5edc6257771edb771844ae069

commit r10-9242-g0188eab844eacda5edc6257771edb771844ae069
Author: Jakub Jelinek <ja...@redhat.com>
Date:   Sat Jan 9 10:49:38 2021 +0100

    tree-cfg: Allow enum types as result of POINTER_DIFF_EXPR [PR98556]

    As conversions between signed integers and signed enums with the same
    precision are useless in GIMPLE, it seems strange that we require that
    the POINTER_DIFF_EXPR result must be INTEGER_TYPE.  If we really wanted
    to require that, we'd need to change the gimplifier to ensure it, which
    is not the case on the following testcase.  What is going on during
    gimplification is that when we have the (enum T) (p - q) cast, it is
    stripped through
      /* Strip away as many useless type conversions as possible
         at the toplevel.  */
      STRIP_USELESS_TYPE_CONVERSION (*expr_p);
    and when the MODIFY_EXPR is gimplified, the *to_p has enum T type,
    while *from_p has intptr_t type, and as there is no conversion in
    between, we just create a GIMPLE_ASSIGN from that.

    2021-01-09  Jakub Jelinek  <ja...@redhat.com>

            PR c++/98556
            * tree-cfg.c (verify_gimple_assign_binary): Allow lhs of
            POINTER_DIFF_EXPR to be any integral type.

            * c-c++-common/pr98556.c: New test.

(cherry picked from commit 991656092f78eeab2a48fdbacf4e1f08567badaf)