I'm using the special GCC that Apple supplies with
Mac OS X 10.4.7 (PowerPC) and XCode 2.4.  I just ran
a test case with code like:

std::deque<int>  d;
assert( d.max_size() <= d.get_allocator().max_size() );
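
For completeness, here is a standalone version of that test (my
reconstruction; the exact file I compiled may have differed slightly):

#include <deque>
#include <cassert>

int main()
{
    std::deque<int>  d;
    // The container should not claim it can hold more elements than
    // its allocator says it can allocate.
    assert( d.max_size() <= d.get_allocator().max_size() );
    return 0;
}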

And it failed.  I looked at the definition of deque::max_size
and saw that it blindly returns size_type(-1).  This is actually
greater than what the allocator itself returns (hence the
error).  Then I looked again and saw that std::allocator does
the same thing, except that it divides the max value by
sizeof(value_type).
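
To make the difference concrete, this is what the two definitions boil
down to as I read them (paraphrased, not copied from the headers):

#include <cstddef>
#include <iostream>

int main()
{
    typedef std::size_t size_type;

    // What deque<int>::max_size() effectively returns (per my reading):
    size_type deque_max = size_type(-1);

    // What std::allocator<int>::max_size() effectively returns:
    size_type alloc_max = size_type(-1) / sizeof(int);

    // On this system the first value is sizeof(int) times the second,
    // so the container claims a larger maximum than its allocator.
    std::cout << deque_max << " vs. " << alloc_max << '\n';
    return 0;
}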

Is there a legitimate reason for a deque to return a larger
max_size than its allocator?  Even if so, did you guys consider
that, or did you just blindly put in the maximum value?  Does
max_size() for the standard allocator have the same issue?

Could some thought be put into what std::allocator::max_size()
should return?  Maybe it could be based on how much memory the
program can actually access.  Even if you don't do that, all the
standard container types/templates should base their max_size()
method on their this->get_allocator().max_size() return value.
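
In other words, something along these lines in each container.  This is
only a sketch of the idea (the bounded_deque wrapper below is a
hypothetical illustration, not a tested patch against libstdc++):

#include <deque>

// Hypothetical wrapper showing the idea: max_size() defers to the
// allocator instead of returning a hard-coded size_type(-1).
template<typename T, typename Alloc = std::allocator<T> >
struct bounded_deque : std::deque<T, Alloc>
{
    typedef typename std::deque<T, Alloc>::size_type size_type;

    size_type max_size() const
    { return this->get_allocator().max_size(); }
};

int main()
{
    bounded_deque<int> d;
    // With this definition the original assertion can no longer fail.
    return d.max_size() <= d.get_allocator().max_size() ? 0 : 1;
}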


-- 
           Summary: Has there been a serious attempt to define the
                    max_size() member functions?
           Product: gcc
           Version: 4.0.1
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: libstdc++
        AssignedTo: unassigned at gcc dot gnu dot org
        ReportedBy: dwalker07 at snet dot net


http://gcc.gnu.org/bugzilla/show_bug.cgi?id=29134
