https://gcc.gnu.org/bugzilla/show_bug.cgi?id=94911
--- Comment #5 from Gabriel Ravier ---
Also, as an extra note w.r.t.
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=94911#c3: I've just noticed that I
had indeed made a separate bug report at https://gcc.gnu.org/PR94912 (which
ended up being closed).
Andrew Pinski changed:
           What             |Removed |Added
           -----------------+--------+-----------
           Last reconfirmed |        |2023-05-13
           Ever confirmed   |0       |1
--- Comment #3 from Marc Glisse ---
Since VLA support in C++ is an extension for compatibility with C, it is
strange that it behaves differently (does one front end use the value of n at
the time of the typedef and the other at the time of the declaration?). This
bug is
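
A minimal sketch of the C side of that question, assuming a 4-byte int: per
the C standard's own typedef example (C11 6.7.8), the size expression of a
variably modified typedef is evaluated when the typedef is reached, so later
changes to n do not affect the type. Whether g++'s VLA extension matches this
is exactly what the comment asks.

#include <stdio.h>

int main(void)
{
    int n = 4;
    typedef int vla[n];   /* size expression evaluated here: 4 ints */
    n = 8;                /* later changes to n do not affect the type */
    vla a;
    printf("%zu\n", sizeof a);   /* prints 16 with 4-byte int, not 32 */
    return 0;
}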
--- Comment #2 from Andrew Pinski ---
(In reply to Marc Glisse from comment #1)
> gcc computes sizeof(a) as 4ul*(size_t)n, and unsigned types don't provide
> nice overflow guarantees, so that complicates things.
While the C++ front-end does:
(si
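
One way to see the two lowerings side by side is to dump the GENERIC each
front end produces for the same input. This is a sketch with an invented file
name and function name, not the PR's testcase; per comment #1 the C front end
computes sizeof a as 4ul*(size_t)n, while comment #2's quote of the C++ form
is truncated above, so the dump is the way to see it:

/* Compile the same file as C and as C++, then compare the .original dumps:
     gcc -fdump-tree-original -c vla.c
     g++ -fdump-tree-original -c vla.c   (g++ compiles .c as C++)  */
#include <stddef.h>

size_t size_of_vla(int n)
{
    int a[n];         /* VLA; a GNU extension in C++ */
    return sizeof a;  /* lowered by the front end; visible in the dump */
}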
Andrew Pinski changed:
           What     |Removed |Added
           ---------+--------+------------
           Severity |normal  |enhancement
--- Comment #1 from Marc Glisse ---
gcc computes sizeof(a) as 4ul*(size_t)n, and unsigned types don't provide nice
overflow guarantees, so that complicates things.
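
A hypothetical reduced example of the kind of fold this complicates (the
function name is invented; this is not the PR's testcase): folding the
comparison below to true requires proving that 4ul*(size_t)n did not wrap,
and unsigned wraparound is well defined rather than undefined, even though an
int[n] too large for size_t could never be a valid object in the first place.

#include <stdbool.h>
#include <stddef.h>

bool has_n_elements(int n)
{
    int a[n];
    /* sizeof a is lowered to 4ul * (size_t) n; dividing back by
       sizeof a[0] only yields n again if the multiplication did not
       wrap, which the unsigned representation does not guarantee. */
    return sizeof a / sizeof a[0] == (size_t) n;
}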