https://gcc.gnu.org/bugzilla/show_bug.cgi?id=93092
Nicholas Krause <xerofoify at gmail dot com> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |xerofoify at gmail dot com

--- Comment #4 from Nicholas Krause <xerofoify at gmail dot com> ---
(In reply to Richard Biener from comment #2)
> On trunk:
>
> (gdb) p cfun->cfg->x_last_basic_block
> $1 = 8940
> (gdb) p cfun->cfg->x_n_edges
> $2 = 14897
>
> so we miss
>
>   if (n_basic_blocks_for_fn (cfun) > 500
>       && n_edges_for_fn (cfun) / n_basic_blocks_for_fn (cfun) >= 20)
>     {
>       vt_debug_insns_local (true);
>       return 0;
>     }
>
> this isn't really a good absolute limit, it's just singling out very
> insane cases of very many edges (but allowing scaling with the number
> of BBs).
>
> There's another limit for the dataflow problem size that is also not hit
> (--param max-vartrack-size).
>
> There are plenty of similar bug reports (unfortunately).

Richard,

I'm not sure whether this would be considered a regression fix for stage 4,
but we should update these numbers to a saner limit. Based on the existing
ratio of 25 to 1, I'm assuming we should just make it something like:

  if (n_basic_blocks_for_fn (cfun) > 50000
      && n_edges_for_fn (cfun) / n_basic_blocks_for_fn (cfun) >= 2500)
    {
      vt_debug_insns_local (true);
      return 0;
    }

I'm not sure we can do this in stage 4 of GCC 10 based on the comments, but
we should update these numbers at some point in stage 1 of GCC 11 at the
latest. I'll send a patch either way.
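
For anyone skimming: below is a minimal standalone sketch (not GCC source;
the thresholds and counts are just the ones quoted from comment #2) showing
why the current guard never fires on this testcase -- with integer division
14897 / 8940 is 1, far below the ratio threshold of 20, even though the BB
count easily clears 500.

#include <stdio.h>

int
main (void)
{
  /* Values reported for this PR in comment #2.  */
  int n_basic_blocks = 8940;   /* cfun->cfg->x_last_basic_block */
  int n_edges = 14897;         /* cfun->cfg->x_n_edges */

  /* Current hard-coded thresholds in the var-tracking guard.  */
  int bb_limit = 500;
  int ratio_limit = 20;

  /* Integer division, as in the guard itself: 14897 / 8940 == 1.  */
  int ratio = n_edges / n_basic_blocks;

  if (n_basic_blocks > bb_limit && ratio >= ratio_limit)
    printf ("guard fires: var-tracking would bail out early\n");
  else
    printf ("guard does not fire (edges/BBs = %d < %d)\n",
            ratio, ratio_limit);

  return 0;
}

Running it prints "guard does not fire (edges/BBs = 1 < 20)", i.e. only the
BB-count half of the condition is satisfied here.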