I noticed that there are a number of testcases in gcc.dg/tree-ssa
(slsr-27.c, for example) that assume that an int is 4 bytes wide.
For instance, slsr-27.c has
struct x
{
int a[16];
int b[16];
int c[16];
};
and
void
f (struct x *p, unsigned int n)
{
foo (p->a[n], p->c[n], p->b[n]);
}
and expects a "* 4" to be present in the dump, assuming an int is 4 bytes
(n * 4 being the array offset in bytes).
What is the right way to fix these? I saw one testcase that did
typedef int int32_t __attribute__ ((__mode__ (__SI__)));
and used int32_t everywhere a 32-bit int is assumed. Is this the right
way to go? Or perhaps some preprocessor magic that replaces the "int" token
with one that has the SI mode attribute? Or should the test assertions be
branched for differing sizes of int?
Regards
Senthil