https://gcc.gnu.org/bugzilla/show_bug.cgi?id=98863
--- Comment #12 from Richard Biener <rguenth at gcc dot gnu.org> ---
Wow, so the MIR problem starts with
static void
df_mir_alloc (bitmap all_blocks)
{
...
  EXECUTE_IF_SET_IN_BITMAP (all_blocks, 0, bb_index, bi)
    {
...
      bitmap_set_range (&bb_info->in, 0, DF_REG_SIZE (df));
      bitmap_set_range (&bb_info->out, 0, DF_REG_SIZE (df));
so every block's IN and OUT bitmap is seeded up front with the full
0 .. DF_REG_SIZE (df) - 1 register range.  I'll see if I can apply the
same trick here that I applied to tree PRE back when it had its
"maximum set".
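
To make the idea concrete, here is a rough standalone sketch (plain C,
not GCC code and not what an actual patch would look like; all names
below are made up for illustration): represent "contains everything"
implicitly and only allocate a real bitmap once a set actually shrinks,
so blocks whose IN/OUT never lose an element never pay for a
full-register-range bitmap.

#include <stdbool.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Toy implicit-universe set.  */
struct lazy_set
{
  bool is_universe;      /* True: contains every element, no storage.  */
  unsigned nbits;
  unsigned char *bits;   /* Allocated only once the set shrinks.  */
};

static void
lazy_set_init_universe (struct lazy_set *s, unsigned nbits)
{
  s->is_universe = true;
  s->nbits = nbits;
  s->bits = NULL;
}

/* Allocate and fill the explicit bitmap on the first real change.  */
static void
lazy_set_materialize (struct lazy_set *s)
{
  if (!s->is_universe)
    return;
  s->bits = malloc ((s->nbits + 7) / 8);
  memset (s->bits, 0xff, (s->nbits + 7) / 8);
  s->is_universe = false;
}

static void
lazy_set_clear_bit (struct lazy_set *s, unsigned i)
{
  lazy_set_materialize (s);
  s->bits[i / 8] &= (unsigned char) ~(1u << (i % 8));
}

static bool
lazy_set_test_bit (const struct lazy_set *s, unsigned i)
{
  if (s->is_universe)
    return true;
  return (s->bits[i / 8] >> (i % 8)) & 1;
}

int
main (void)
{
  /* Many blocks whose IN sets all start as "all registers": nothing is
     allocated until a particular block's set actually loses an element.  */
  enum { N_BLOCKS = 10000, N_REGS = 100000 };
  struct lazy_set *in = calloc (N_BLOCKS, sizeof *in);
  for (unsigned b = 0; b < N_BLOCKS; b++)
    lazy_set_init_universe (&in[b], N_REGS);

  lazy_set_clear_bit (&in[42], 7);   /* Only block 42 pays for real bits.  */
  printf ("%d %d\n", lazy_set_test_bit (&in[0], 7),
          lazy_set_test_bit (&in[42], 7));
  return 0;
}

Whether that maps cleanly onto the df framework's bitmap usage is a
separate question, of course.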
