https://gcc.gnu.org/bugzilla/show_bug.cgi?id=102314
Jakub Jelinek <jakub at gcc dot gnu.org> changed:

           What    |Removed |Added
----------------------------------------------------------------------------
                 CC|        |burnus at gcc dot gnu.org

--- Comment #3 from Jakub Jelinek <jakub at gcc dot gnu.org> ---

I think those types:

  character(kind=1)[1:.y] * y;

don't really make sense for the character(:) allocatables; it would be much better to leave the TYPE_MAX_VALUE out of the TYPE_DOMAIN, making it [1:].

[1:.y] nicely expresses the intent: in most places y (if it is not NULL) should point to a [1:.y] array. But if we emit a DECL_EXPR for it, it will just be wrong. The middle-end expects the C/C++ VLA behavior: the size is computed once (when the DECL_EXPR is encountered), the VLA is allocated with that size, and that is the size it has from that point on.

The deferred-length (is that the right term?) arrays work differently, though: the length can be changed at any time. So the VLA length is always determined from the current value of the .y variable (except in the short windows where .y has been updated but the variable has not yet been reallocated).
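To illustrate the VLA behavior the middle-end assumes (a minimal C sketch, not gfortran output; the function name is hypothetical): the size of a C VLA is captured when its declaration is reached, and later changes to the bound variable do not resize it. This is exactly the property that clashes with deferred-length strings, whose length can change after the declaration point.

```c
#include <stddef.h>

/* C/C++ VLA semantics: the size of buf is fixed when the declaration
   (the DECL_EXPR, in GIMPLE terms) is reached.  Modifying n afterwards
   has no effect on sizeof buf. */
static size_t vla_size_after_change(int n)
{
    char buf[n];   /* size of buf captured here, once */
    n = n * 2;     /* does NOT grow buf */
    return sizeof buf;
}
```

A deferred-length allocatable, by contrast, must re-read the current length (the .y variable) at every use, which is why a DECL_EXPR with a [1:.y] domain would encode the wrong semantics.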