https://gcc.gnu.org/bugzilla/show_bug.cgi?id=88993
Martin Sebor <msebor at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
           Keywords|                            |diagnostic

--- Comment #2 from Martin Sebor <msebor at gcc dot gnu.org> ---
Quoting my response in a private discussion:

The warning was prompted by a limitation in Glibc(*) where its sprintf would
fail for some directives that produced more output than 4k.  I don't know
whether any other implementations have similar limitations, and the warning
doesn't try to adjust to specific implementations.

The warning is handled by a pass that both detects possible bugs and optimizes
calls to formatted I/O functions.  Under these conditions GCC has to avoid
optimizing the results of calls to these functions on the assumption that the
library calls succeed.  So the warning is both a reminder that the code may be
unportable (i.e., it could fail and result in unexpected truncation of output)
and less efficient.  This is more relevant to the string I/O functions like
sprintf than to file I/O, where output truncation may happen under all sorts
of conditions and which GCC for the most part doesn't attempt to optimize.

At the same time, level 2 of the -Wformat-overflow warning is designed to be
strict even at the cost of some false positives.  When the warning doesn't
know whether a call is safe, at level 2 it errs on the side of caution and
triggers.  There are many other cases where the warning is issued even for
safe code, so this one doesn't seem any worse to me.  But if it is causing
hardship (e.g., too many instances) we might want to think about adjusting it
somehow.  (For instance, we could limit the warning to the string formatting
functions and avoid issuing it for printf/fprintf; or we could only issue it
when -Wpedantic is also specified; or we could add a new warning, say
-Wportability, for these kinds of problems.)

The Glibc limitation/bug still exists in current versions:

  https://bugzilla.redhat.com/show_bug.cgi?id=441945
  https://sourceware.org/bugzilla/show_bug.cgi?id=21127

It affects directives that produce lots of output, usually because of a very
large width or precision.  In very simple tests the threshold where it starts
to dynamically allocate memory is around 64k, as in this call:

  snprintf (0, 0, "%.65505i", 1);

or this one:

  snprintf (0, 0, "%65505s", "");

Depending on when malloc fails, the call either fails or crashes.
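
Below is a self-contained sketch of the failure mode.  It assumes, per the
comment above, that a single directive producing on the order of 64k bytes
forces glibc to allocate memory dynamically, so the call can fail or crash
when that allocation fails; the exact threshold is not guaranteed by the
standard or documented by glibc.

  #include <stdio.h>

  int main (void)
  {
    /* Each directive asks for roughly 64k bytes of output from a single
       conversion.  As described above, glibc may have to allocate memory
       dynamically to format it, so the call can fail (returning a
       negative value) or crash if that allocation fails.  */
    int n1 = snprintf (0, 0, "%.65505i", 1);   /* very large precision */
    int n2 = snprintf (0, 0, "%65505s", "");   /* very large width */

    /* On success each call returns the number of bytes the directive
       would produce (65505 here); portable code should check for a
       negative return instead of assuming the calls succeed.  */
    printf ("n1 = %d, n2 = %d\n", n1, n2);
    return 0;
  }

Writing the same directives into an actual fixed-size buffer (e.g. with
sprintf or a bounded snprintf) is the kind of call that level 2 of the
warning may diagnose, for the reasons given above.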