------- Additional Comments From bangerth at dealii dot org 2005-01-09 22:48 -------
I also wonder what semantics you are expecting. You are trying to allocate an array so large that its individual bytes cannot be addressed with a size_t, in other words one larger than the address space the OS provides to your program. That clearly doesn't make any sense.

That being said, I understand that this behavior is security relevant, and that any attempt to allocate more memory than is available necessarily has to fail. Our present implementation is therefore not standards conforming, since it returns a seemingly reasonable pointer even in cases where the allocation should have failed. I agree that this is a problem, and given that memory allocation is an expensive operation anyway, one additional overflow check isn't really time critical.

Unfortunately, the problem cannot be fixed inside libstdc++'s implementation of operator new[]: that operator only receives the total size of the memory to be allocated, not the size per element and the number of elements. By the time we get into the implementation of this operator, the multiplication has already been performed, so it is already too late. In other words, the overflow check has to happen in compiler-generated code, not in the libstdc++ implementation. I would support the introduction of such code, if necessary guarded by some flag, or unconditionally, as a matter of quality of implementation.

W.
--
What                  |Removed              |Added
----------------------------------------------------------------------------
Status                |UNCONFIRMED          |NEW
Ever Confirmed        |                     |1
Last reconfirmed date |0000-00-00 00:00:00  |2005-01-09 22:48:34

http://gcc.gnu.org/bugzilla/show_bug.cgi?id=19351