As it turns out, this bug depended on vector allocation size.
xref_basetypes asks for space for a certain number of bases, then fills
up the vector, and assumes that if there's any space left we must have
hit a shared virtual base, i.e. diamond-shaped inheritance. But since
my vec.c patch for PR 14179, allocating a vector returns however many
slots fit in the memory block we get from the allocator, rather than
exactly the number of slots we asked for. So when those two numbers
don't match, the compiler wrongly concludes that we're sharing a base.
Fixed by directly comparing the number of slots filled to the number of
slots expected. Tested x86_64-pc-linux-gnu, applying to trunk.
commit 8860243111f961b8f1d37238cd28ed82516b1927
Author: Jason Merrill <ja...@redhat.com>
Date: Tue Jan 24 22:05:52 2012 -0500
PR c++/51917
* decl.c (xref_basetypes): Check VEC_length instead of VEC_space.
diff --git a/gcc/cp/decl.c b/gcc/cp/decl.c
index ef43dbf..7fba04a 100644
--- a/gcc/cp/decl.c
+++ b/gcc/cp/decl.c
@@ -11916,8 +11916,8 @@ xref_basetypes (tree ref, tree base_list)
BINFO_BASE_ACCESS_APPEND (binfo, access);
}
- if (VEC_space (tree, CLASSTYPE_VBASECLASSES (ref), 1))
- /* If we have space in the vbase vector, we must have shared at
+ if (VEC_length (tree, CLASSTYPE_VBASECLASSES (ref)) < max_vbases)
+ /* If we didn't get max_vbases vbases, we must have shared at
least one of them, and are therefore diamond shaped. */
CLASSTYPE_DIAMOND_SHAPED_P (ref) = 1;