On Wed, 23 Apr 2014, Jan Hubicka wrote:

+  /* Now optimize (x != 0) ? x + y : y to just x + y.
+     The following condition is too restrictive, there can easily be another
+     stmt in middle_bb, for instance a CONVERT_EXPR for the second argument.  */
+  gimple assign = last_and_only_stmt (middle_bb);
+  if (!assign || gimple_code (assign) != GIMPLE_ASSIGN
+      || gimple_assign_rhs_class (assign) != GIMPLE_BINARY_RHS
+      || (!INTEGRAL_TYPE_P (TREE_TYPE (arg0))
+         && !POINTER_TYPE_P (TREE_TYPE (arg0))))
+    return 0;
+
+  /* assign may call a libgcc routine, which is slow.  */
+  if (!is_cheap_stmt (assign) && is_cheap_stmt (cond))
+    return 0;
+
+  /* If the special case has a high probability, keep it.  */
+  if (EDGE_PRED (middle_bb, 0)->probability < PROB_EVEN)
+    return 0;

Well, I would expect this transformation to always be a win - the +
operation is virtually free.
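
For reference, the source-level shape of the pattern is something like
this minimal sketch (names are illustrative):

int f (int x, int y)
{
  /* When x == 0, x + y == y, so both arms compute x + y and the
     branch is redundant.  */
  return x != 0 ? x + y : y;
}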

Concerning the profile - you must consider two cases.  If the profile is
guessed, then the probability is most probably 71% for the non-zero case
unless the user used an expect (since I would expect only
PRED_OPCODE_NONEQUAL to match).  This is a data-dependent branch unless
it sits in a loop, and those are badly predicted statically.

If the probability is read, then probabilities over (say) 10% and less than
90% mean that the branch is likely not very well predictable (it still may
be on advanced architectures if it is predictable from context), and getting
rid of it is a good idea.

So if you want to disable the optimization for x being likely zero, I guess
testing that the probability is over PROB_LIKELY is a reasonable threshold -
it will handle both builtin_expect and the (very) badly predictable
branches.  Maybe for FDO it should be higher, but we would need to do some
research on that, which is probably not worth the effort.

The division transformation is a bit of a different story, since the cost
of a division is higher than the cost of a branch, and the 50% threshold is
the limit for a one-value counter to be reliable.
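
For reference, the division form of the pattern (via the neutral element
1) would look something like this sketch, again with illustrative names:

int g (int x, int y)
{
  /* y / 1 == y, so this always computes y / x, but doing the
     division unconditionally trades a branch for a costly op.  */
  return x != 1 ? y / x : y;
}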

Thank you for the comments. If I understand correctly:

- it is always ok to look at edge->probability (no need to check that the
  probabilities are available as Richard feared)
- PROB_EVEN is an ok threshold for division (not sure I understood your last
  sentence)
- for cheaper operations like addition, I should be less conservative and
  either always do the transformation or use a threshold of PROB_UNLIKELY.

Are there operations other than division (among those listed in
neutral_element_p or absorbing_element_p) that fall in the "expensive"
category? I guess there are some platforms where multiplication is
expensive, but few, and querying the target for the cost of operations
seems exaggerated. So I would go with:

[move definition of code_def]
if ((code_def == TRUNC_DIV_EXPR || code_def == CEIL_DIV_EXPR
     || code_def == FLOOR_DIV_EXPR || code_def == ROUND_DIV_EXPR
     || code_def == EXACT_DIV_EXPR)
    && EDGE_PRED (middle_bb, 0)->probability < PROB_EVEN)
  return 0;

(and change the testsuite example with __builtin_expect to use division; a
sketch follows the second variant below)

or (my favorite):

[move definition of code_def]
int threshold = PROB_UNLIKELY;
if (code_def == TRUNC_DIV_EXPR || code_def == CEIL_DIV_EXPR
    || code_def == FLOOR_DIV_EXPR || code_def == ROUND_DIV_EXPR
    || code_def == EXACT_DIV_EXPR)
  threshold = PROB_EVEN;
if (EDGE_PRED (middle_bb, 0)->probability < threshold)
  return 0;
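
Something like this is what I have in mind for the adjusted testcase (a
sketch only - the options, dump name and scan pattern are illustrative,
not the actual testsuite file):

/* { dg-do compile } */
/* { dg-options "-O2 -fdump-tree-phiopt1" } */

int f (int x, int y)
{
  /* x is expected to be 1, so the division sits on the unlikely path
     and the branch should be kept under the PROB_EVEN threshold.  */
  if (__builtin_expect (x != 1, 0))
    y /= x;
  return y;
}

/* { dg-final { scan-tree-dump "if " "phiopt1" } } */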


Is that ok, after re-testing?

--
Marc Glisse
