https://gcc.gnu.org/bugzilla/show_bug.cgi?id=103628

HaoChen Gui <guihaoc at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
           Assignee|unassigned at gcc dot gnu.org      |guihaoc at gcc dot gnu.org
                 CC|                            |guihaoc at gcc dot gnu.org

--- Comment #5 from HaoChen Gui <guihaoc at gcc dot gnu.org> ---
The memory representation of IBM long double is not unique. The value is
actually the sum of two 64-bit doubles, so the same value can be stored as
different pairs.
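
For illustration, here is a minimal C sketch (not taken from the PR's
testcase) of why the representation is not unique: the 16-byte object holds a
pair of 64-bit doubles whose sum is the value, and two different pairs can
sum to the same value.

/* Minimal sketch: an IBM long double stores a pair of 64-bit doubles
   whose sum is the value, so one value can have several byte patterns.  */
#include <stdio.h>
#include <string.h>

int
main (void)
{
  double a[2] = { 3.0, 0.0 };   /* canonical split: |lo| <= 1/2 ulp of hi */
  double b[2] = { 4.0, -1.0 };  /* non-canonical pair for the same value  */

  printf ("value of pair a: %g\n", a[0] + a[1]);              /* prints 3 */
  printf ("value of pair b: %g\n", b[0] + b[1]);              /* prints 3 */
  printf ("identical bytes: %d\n", !memcmp (a, b, sizeof a)); /* prints 0 */
  return 0;
}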

During decoding, the real variable b can be 
b = {cl = 1, decimal = 0, sign = 0, signalling = 0, canonical = 0,
  uexp = 67108357, sig = {0, 0, 9295712554570040320}}
which is the sum of the following two doubles:
u = {cl = 1, decimal = 0, sign = 0, signalling = 0, canonical = 0,
  uexp = 67108356, sig = {0, 0, 9295712899447228416}}
v = {cl = 1, decimal = 0, sign = 0, signalling = 0, canonical = 0,
  uexp = 67108356, sig = {0, 0, 9295712209692852224}}

During encoding, the real variable b can be
b = {cl = 1, decimal = 0, sign = 0, signalling = 0, canonical = 0,
  uexp = 67108357, sig = {0, 0, 9295712554570040320}}
which is split into the following two doubles:
u = {cl = 1, decimal = 0, sign = 0, signalling = 0, canonical = 0,
  uexp = 67108357, sig = {0, 0, 9295712554570039296}}
v = {cl = 1, decimal = 0, sign = 0, signalling = 0, canonical = 0,
  uexp = 67108304, sig = {0, 0, 9223372036854775808}}

After decoding and encoding, the memory representation changes. Since PR95450
added a decoding/encoding verification check, native_interpret_expr returns a
NULL tree for this case, which causes the ICE.
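
For illustration, a minimal C sketch of the failing round trip (an assumed
model of the decode-sum / encode-resplit behaviour, not the actual GCC code):
decoding takes the sum of the stored pair, encoding re-splits the value
canonically, so a non-canonical input pair does not come back byte-for-byte
and the verification fails.

/* Assumed model of the round trip, not the actual GCC code.  */
#include <stdio.h>
#include <string.h>

int
main (void)
{
  double in[2] = { 4.0, -1.0 };  /* non-canonical pair read from memory   */
  double value = in[0] + in[1];  /* decode: the value is the sum, 3.0     */
  double hi = value;             /* encode: high part, value rounded      */
  double lo = value - hi;        /* encode: low part, the remainder (0.0) */
  double out[2] = { hi, lo };

  /* The verification compares the re-encoded bytes with the original
     bytes; they differ, so the check fails.  */
  printf ("bytes preserved: %d\n", !memcmp (in, out, sizeof in)); /* prints 0 */
  return 0;
}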

Shall we disable Hollerith constants for IBM long double (-mabi=ibmlongdouble)?
Or just pass it up to the upper layer and let the parser report an error?
Please advise.
