Hi,
We're having a problem with a simple switch statement, in which cilly
seems to turn a char literal into a signed value, even though cilly is
compiled with char being unsigned.
Included are the original program, the preprocessor output and the output
after cilly.
As you can see, cilly has turned the char literal into a signed value.
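For readers without the attachments, a minimal sketch of the kind of code
involved (the function name and the literal below are my own, not taken
from the attached program):

/* Hypothetical reduction: char is unsigned on this target,
   e.g. the code is built with -funsigned-char. */
int classify(char c)
{
    switch (c) {
    case '\x80':    /* with unsigned char this label has the value 128;
                       the complaint is that cilly emits it as a signed
                       (negative) value instead */
        return 1;
    default:
        return 0;
    }
}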
Hi,
We have a piece of code with a global variable looking like:
tSiaTableField tmnxRadProxSrvTableLeafOvr[] =
{
    { .leafId = LEAF_tmnxRadProxSrvRowStatus,  .node.ranges = ROWSTATUS_RESTRICTED, },
    { .leafId = LEAF_tmnxRadProxSrvAdminState, .node.ranges = ADMINSTATE_NO_NOOP, },
    { }
};
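A stand-alone version of the same initializer style, in case it helps to
pin down the construct (the struct layout below is an assumption, not the
real tSiaTableField):

/* Designated initializers with a nested member designator, as in the
   table above.  The types and values are placeholders. */
struct inner { int ranges; };
struct field { int leafId; struct inner node; };

struct field table[] =
{
    { .leafId = 1, .node.ranges = 2, },   /* nested designator .node.ranges */
    { }                                   /* zero-filled terminator entry */
};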
Maybe it's because we're still on 1.3.6?
I didn't see this mentioned in the 1.3.6->1.3.7 change log.
Thanks,
Dany
Christoph Spiel wrote:
Hi Dany!
On Tue, Apr 27, 2010 at 08:42:05AM +0200, Dany Vereertbrugghen wrote:
We're currently running into a problem where ...
Hi All,
We're currently running into a problem where CIL seems to transform
(mainValue < (0xULL)/1000)
to
(mainValue < 0ULL)
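A minimal reproducer along those lines (the function name and the constant
value below are placeholders of mine) would be something like:

/* CIL is reported to fold the whole right-hand side, a compile-time
   unsigned long long division, down to 0ULL, which changes the result
   of the comparison. */
int below_limit(unsigned long long mainValue)
{
    return mainValue < (0xFFFFFFFFFFFFFFFFULL) / 1000;
}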
This problem seems to be tracked in bug 2815129.
Does anyone have a fix for this?
Thanks,
Dany
Hi all,
I'm not sure if this behavior is undefined, or if it is a bug in CIL,
but CIL seems to behave differently from gcc.
For the following function (which might be the result of a macro
substitution), i seems to be 5 with gcc in the inner block, while running
it through CIL it would not compile.
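Since the function is not quoted above, here is a guess (my own code, not
the original) at a construct that would match the description: an inner
declaration whose initializer names a shadowed outer variable.

int f(void)
{
    int i = 5;
    {
        /* The initializer mentions i while a new i is being declared.
           Whether this reads the outer i (5) or the fresh, still
           uninitialized inner i is exactly where a gcc build and a
           CIL-processed build could disagree. */
        int i = i;
        return i;
    }
}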