https://gcc.gnu.org/bugzilla/show_bug.cgi?id=118837

Jakub Jelinek <jakub at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |jakub at gcc dot gnu.org,
                   |                            |jason at gcc dot gnu.org,
                   |                            |mark at gcc dot gnu.org

--- Comment #3 from Jakub Jelinek <jakub at gcc dot gnu.org> ---
What GCC does right now: at least when emitting the attributes from RTL, we
know neither the sign nor the precision; all we know is the value as a 64-bit
signed integer.
Now, normally in RTL values are sign extended.  If the 64-bit value is
negative, we emit DW_FORM_sdata; otherwise we emit the smallest
representation of the positive value (DW_FORM_data{1,2,4,8}, could perhaps use
DW_FORM_udata).
So, if we want to change anything, we'd need to track, at least for the
dw_val_class_unsigned_const case, not just the value but also the precision
and whether it is signed.  Now, if the precision is the same as
constant_size (AT_unsigned (a)) * BITS_PER_UNIT, we don't really care about
extension and can keep doing what we do now; similarly if the MSB in
whatever constant_size we choose is clear, or if we know that the
corresponding type is unsigned.  We know the precision and signedness when we
see trees, but not when we just see RTL.
And perhaps we could also try to optimize the DW_FORM_sdata cases if there is
no ambiguity (e.g. for an 8-bit signed context with a negative value,
DW_FORM_data1 could be unambiguous).
