Anybody care to explain to me why it behaves in such a way?
Thank you in advance.

Regards,
Ang Way Chuang
Andrew Pinski wrote:
On Tue, Apr 29, 2008 at 8:50 PM, Ang Way Chuang <[EMAIL PROTECTED]> wrote:
> abc.a = abc.a++ % abc.b;
You are assigning to abc.a twice without a sequence point in between, so
this code is undefined: the order of evaluation of expressions without an
intervening sequence point is unspecified.
Andrew Pinski wrote:
On Tue, Apr 29, 2008 at 9:08 PM, Ang Way Chuang <[EMAIL PROTECTED]> wrote:
> Thanks for the speedy reply. But why does this code:
>     int a = 17, b = 16;
>     a = a++ % 16;
> result in a = 2 then? I think I need to learn what a sequence point is.
> I'll google the term.
Paolo Bonzini wrote:
>> Thanks for the speedy reply. But why this code:
>>     int a = 17, b = 16;
>>     a = a++ % 16;
> Huh? Now you've got me confused. Since it is undefined behaviour, gcc
> is free to do whatever it likes.
Sure, but if you ask gcc to signal a warning, it is supposed to do so.