https://gcc.gnu.org/bugzilla/show_bug.cgi?id=111537

--- Comment #5 from David Malcolm <dmalcolm at gcc dot gnu.org> ---
The analyzer is complaining about the read from the string literal.

If I change the string in the reproducer from "hello world" to "foo", I see:

(gdb) pt string_cst
 <string_cst 0x7fffea76f738
    type <array_type 0x7fffea78a150
        type <integer_type 0x7fffea772930 char readonly unsigned QI
            size <integer_cst 0x7fffea644eb8 constant 8>
            unit-size <integer_cst 0x7fffea644ed0 constant 1>
            align:8 warn_if_not_align:0 symtab:0 alias-set -1 canonical-type
0x7fffea772930 precision:8 min <integer_cst 0x7fffea6624c8 0> max <integer_cst
0x7fffea662450 255>
            pointer_to_this <pointer_type 0x7fffea78a000>>
        SI
        size <integer_cst 0x7fffea662018 constant 32>
        unit-size <integer_cst 0x7fffea662030 constant 4>
        align:8 warn_if_not_align:0 symtab:0 alias-set -1 canonical-type
0x7fffea78a150
        domain <integer_type 0x7fffea672150 type <integer_type 0x7fffea65e000
sizetype>
            DI
            size <integer_cst 0x7fffea644dc8 constant 64>
            unit-size <integer_cst 0x7fffea644de0 constant 8>
            align:64 warn_if_not_align:0 symtab:0 alias-set -1 canonical-type
0x7fffea672150 precision:64 min <integer_cst 0x7fffea644df8 0> max <integer_cst
0x7fffea662558 3>>
        pointer_to_this <p

Note that the array type is char[4] (domain 0..3, size 32 bits), i.e. it has
room for a trailing NUL.

Looking at TREE_STRING_LENGTH:

/* In a STRING_CST */
/* In C terms, this is sizeof, not strlen.  */
#define TREE_STRING_LENGTH(NODE) (STRING_CST_CHECK (NODE)->string.length)
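
So for a C-style literal this counts the trailing NUL; e.g. the C/C++ front
ends give the STRING_CST for "foo" a length of 4, matching sizeof at the
source level:

/* A NUL-terminated "foo" occupies 4 bytes: 'f', 'o', 'o', '\0'.  */
const char s[] = "foo";
static_assert (sizeof s == 4, "sizeof counts the trailing NUL");

Here, though, the node has length 3: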

(gdb) p string_cst->string.length
$36 = 3

The analyzer uses this to determine the validly accessible size of the
string, which it takes to be 3 bytes:

(gdb) call valid_bits.dump(true)
bytes 0-2

whereas the read is of 4 bytes:
(gdb) call actual_bits.dump(true)
bytes 0-3
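
Put another way, the STRING_CST's payload doesn't cover its own array type:
the type is char[4] but TREE_STRING_LENGTH is 3.  Roughly this invariant,
as an untested sketch against tree.h (string_cst_covers_type_p is a made-up
name, not existing GCC/analyzer code):

/* Made-up helper: does the STRING_CST's payload cover the whole of its
   array type?  For the node above this returns false: TYPE_SIZE_UNIT of
   the type is 4, but TREE_STRING_LENGTH is only 3.  */
static bool
string_cst_covers_type_p (tree string_cst)
{
  tree size = TYPE_SIZE_UNIT (TREE_TYPE (string_cst));
  return (size
	  && tree_fits_uhwi_p (size)
	  && tree_to_uhwi (size)
	     <= (unsigned HOST_WIDE_INT) TREE_STRING_LENGTH (string_cst));
}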

Is the D front end correctly building that string_cst?  Are D strings
0-terminated?
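
For comparison, here's an untested sketch of building "foo" with the NUL
included in the payload, so that it matches a char[4] type
(build_foo_string_cst is just an illustrative name; I'm not saying this is
what the D front end ought to do):

/* Untested sketch: include the trailing NUL in the payload, so that
   TREE_STRING_LENGTH (4) covers the whole char[4] type.
   char_type_node stands in for whatever the front end's char type is.  */
static tree
build_foo_string_cst (void)
{
  tree str = build_string (4, "foo");	/* 4 == strlen ("foo") + 1.  */
  tree elt = build_qualified_type (char_type_node, TYPE_QUAL_CONST);
  TREE_TYPE (str) = build_array_type_nelts (elt, 4);
  TREE_READONLY (str) = 1;
  TREE_CONSTANT (str) = 1;
  return str;
}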
