https://gcc.gnu.org/bugzilla/show_bug.cgi?id=91191
Andrew Macleod <amacleod at redhat dot com> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |aldyh at redhat dot com,
                   |                            |jason at redhat dot com,
                   |                            |law at redhat dot com

--- Comment #4 from Andrew Macleod <amacleod at redhat dot com> ---
It's unclear to me what should be done here.  The documentation for
VIEW_CONVERT in tree.def is quite clear:

  "The only operand is the value to be viewed as being of another type.
   It is undefined if the type of the input and of the expression have
   different sizes."

I realized range-ops isn't currently doing anything with VIEW_CONVERT, but
when I tried changing this test case to use similar precisions, the
VIEW_CONVERT goes away.  My knowledge of VIEW_CONVERT is, well, poor to
non-existent.  So:

1) Why are we issuing a VIEW_CONVERT in the first place?  The types have
   different precisions, and that appears to break the very definition.

2) When they are the same precision, there wouldn't need to be a sign
   extension, so how does it differ from a cast?

1) and 2) seem incongruent to me.  Either you confirm they are the same size
and do a cast, or they are different precisions and you want to avoid the
normal cast behaviour.

Or has this got something to do with a difference in precision where the
underlying storage is actually the same size?  So we only issue a
VIEW_CONVERT when the precision is different but the storage is the same
size, and therefore you can make certain assumptions about the value?  But
that seems fraught with issues too...
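For what it's worth, here is a minimal C sketch (not the testcase from this
PR, and only an illustration of my understanding) of the distinction I take
VIEW_CONVERT_EXPR to express when the sizes match: a cast converts the
value, while a view-convert reinterprets the same bits as the new type.  The
memcpy here is just the portable C-level stand-in for that bit
reinterpretation.

#include <stdint.h>
#include <string.h>
#include <stdio.h>

int
main (void)
{
  float f = 1.0f;

  /* Value conversion, like a NOP/convert expression: yields the
     integer 1.  */
  int32_t as_cast = (int32_t) f;

  /* Bit reinterpretation of same-size storage, roughly what
     VIEW_CONVERT_EXPR expresses: yields 0x3f800000, the IEEE
     encoding of 1.0f.  */
  int32_t as_view;
  memcpy (&as_view, &f, sizeof as_view);

  printf ("cast: %d  view: 0x%x\n", as_cast, (unsigned) as_view);
  return 0;
}

For same-width signed/unsigned integer types the two forms happen to produce
the same bit pattern, which is presumably why a plain cast would suffice when
the precisions really do match; the interesting (and confusing) cases are the
ones where they don't.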