https://gcc.gnu.org/bugzilla/show_bug.cgi?id=108695
--- Comment #15 from Martin Liška <marxin at gcc dot gnu.org> ---
(In reply to Jakub Jelinek from comment #14)
> (In reply to Martin Liška from comment #10)
> > > where the XOR16 is implemented as:
> > >
> > > #define XORN(in1,in2,out,len) \
> > >   do { \
> > >     uint _i; \
> > >     for (_i = 0; _i < len/sizeof(ulong); ++_i) \
> > >       *((ulong*)(out)+_i) = *((ulong*)(in1)+_i) ^ *((ulong*)(in2)+_i); \
> > >   } while(0)
> >
> > I can confirm that changing that to:
> >
> > #define XORN(in1, in2, out, len) \
> >   do \
> >   { \
> >     uint _i; \
> >     for (_i = 0; _i < len; ++_i) \
> >       *(out + _i) = *(in1 + _i) ^ *(in2 + _i); \
> >   } while (0)
> >
> > fixes the problem. It seems very close to what I saw here:
> > https://gcc.gnu.org/bugzilla/show_bug.cgi?id=83201#c13
>
> It depends on whether those arrays were stored as ulong, or will later be
> read as ulong or as something else.
Yes, that's what happens: the arrays are stored as ulong by the aforementioned
XORN macro, and are later read as u32 with
#define GETU32(p) ntohl(*((u32*)(p)))
That's the strict-aliasing violation.