https://gcc.gnu.org/bugzilla/show_bug.cgi?id=68065
--- Comment #10 from Alexander Cherepanov <ch3root at openwall dot com> ---
On 2015-10-27 20:09, joseph at codesourcery dot com wrote:
> I think it's undefined at the point where a type exceeds the limit on the
> size of an object

This would probably be the most reasonable approach, but it's not clear
whether the text of the standard supports it. E.g., the list of UB (C11,
J.2p1) includes this one:

  - The size expression in an array declaration is not a constant expression
    and evaluates at program execution time to a nonpositive value (6.7.6.2).

but I don't see anything like what you described. Perhaps I'm missing
something?

> (half the address space minus one byte),

And this particular value for the limit on the size of an object is
troublesome because it's completely undocumented (AFAICT) and goes against
what the standard hints at (the whole idea of size_t becomes useless,
right?). It is also not supported by glibc (malloc), can lead to
vulnerabilities, etc., but that discussion belongs in
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=67999 .

> whether or not
> sizeof is used or any object with that type is constructed - that is, as
> soon as the language semantics involve evaluation of the array sizes for
> the VLA type in question. (If the sizes are neither evaluated nor
> required, e.g. sizeof (int (*)[size]), or when replaced by [*] at function
> prototype scope, I don't consider that undefined; if required but not
> evaluated, as in certain obscure cases of conditional expressions, that's
> a different case of undefined behavior.)

This is also a very nice approach. (Although it seems to differ from the
approach to non-VLA arrays in
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=68107#c1 .) But, again, I don't
see how to tie it to the standard. That doesn't mean this approach is wrong;
the standard probably just lacks the necessary rules. But it would be nice to
make it clear which rules come from the standard and which are GCC's own.
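
For concreteness, a small self-contained sketch of the three cases discussed
above (the names and the SIZE_MAX-based value are illustrative, not taken
from the report or from any documented GCC limit):

/* Illustrative sketch only: shows where each of the cases above arises.
   The variable names and the oversized value are made up for the example. */
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* A size whose VLA type would exceed any "half the address space" limit. */
    size_t n_big = SIZE_MAX / sizeof(int);

    /* Case listed in C11 J.2: the size expression evaluates at run time to a
       nonpositive value (6.7.6.2) -- explicitly undefined if the declaration
       is reached:
           int neg = -1;
           int a[neg];
    */

    /* The case this bug is about: the size is positive, but the resulting
       VLA type exceeds the implementation's limit on object size; the
       standard's text does not obviously say where this becomes undefined:
           int b[n_big];
    */

    /* The case Joseph considers well-defined: the size is neither evaluated
       nor required, because only a pointer to the VLA type is used as the
       operand of sizeof (6.7.6.2p5 leaves evaluation of n_big unspecified
       here, since it cannot affect the result). */
    printf("%zu\n", sizeof(int (*)[n_big]));

    return 0;
}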