Sample source:

public class t
{
  public static final int[] x = { 5, 7, 9, 11 };
}

In gcc 3.3 (and in 3.4, and in releases earlier than 3.3), this code
compiled to something like:

.LJv0.0:
        .long   _Jv_intVTable
        .long   4
        .long   5
        .long   7
        .long   9
        .long   11

Now, on the 4.0 branch, we instead emit runtime initialization code for this.
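To illustrate the regression: the runtime initialization the compiler now
generates is roughly equivalent in behavior to the following Java sketch
(this is a hand-written approximation for illustration, not the literal
gcj output — the actual generated code works at a lower level):

```java
// Sketch: what the 4.0-era code effectively does at class
// initialization time, instead of referencing a preinitialized
// .long table in the object file.
public class t
{
  public static final int[] x;

  static
  {
    // Allocate the array and store each element at runtime,
    // rather than pointing at static data emitted by the compiler.
    int[] tmp = new int[4];
    tmp[0] = 5;
    tmp[1] = 7;
    tmp[2] = 9;
    tmp[3] = 11;
    x = tmp;
  }
}
```

The observable semantics are the same either way; the regression is that
the static-data form avoided the allocation and stores entirely.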

The culprits are a combination of this patch to gcj:

2004-07-08  Richard Henderson  <[EMAIL PROTECTED]>

        * expr.c (case_identity, get_primitive_array_vtable,
        java_expand_expr, emit_init_test_initialization): Remove.
        * java-tree.h (java_expand_expr): Remove.
        * lang.c (LANG_HOOKS_EXPAND_EXPR): Remove.

and this patch to libgcj (which removed the primitive vtables):

2004-07-23  Bryce McKinlay  <[EMAIL PROTECTED]>

        * prims.cc (_Jv_InitPrimClass): Don't create an array class.
        (_Jv_CreateJavaVM): Don't pass array vtable parameter to
        _Jv_InitPrimClass.
        (DECLARE_PRIM_TYPE): Don't declare array vtables.
        * include/jvm.h (struct _Jv_ArrayVTable): Removed.
        * java/lang/Class.h (_Jv_InitPrimClass): Update friend declaration.

-- 
           Summary: [4.0 regression] primitive array optimization is gone
           Product: gcc
           Version: 4.0.0
            Status: UNCONFIRMED
          Severity: normal
          Priority: P2
         Component: java
        AssignedTo: unassigned at gcc dot gnu dot org
        ReportedBy: tromey at gcc dot gnu dot org
                CC: gcc-bugs at gcc dot gnu dot org,java-prs at gcc dot gnu
                    dot org


http://gcc.gnu.org/bugzilla/show_bug.cgi?id=18190
