On 13-12-13 8:36 AM, Justin Talbot wrote:

It would have those benefits, but it would be harder to prototype
changes by actually replacing the `if` function.  Implementations that
want to optimize the calls have other ways to do it, e.g. the sorts of
things the compiler does.


Does anyone actually prototype changes to the `if` function?

I don't know of any examples of that, but I can easily imagine someone wanting to. For example, some conditions take a long time to evaluate. Maybe I would want to evaluate both the TRUE and FALSE branches in parallel, in anticipation of the result, if I had cores to spare. That's pretty tricky to get right because of side effects, so prototyping it in R code could make a lot of sense.
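
As a rough illustration of what such a prototype could look like (the names and the logging behaviour here are just mine, not a proposal), one can shadow the builtin with an ordinary closure in the global environment and delegate back to the base primitive:

    ## Toy replacement for `if`, defined in the global environment.  It forces
    ## the condition, reports it, and then delegates to the base primitive so
    ## that the unchosen branch is never evaluated.  For brevity it ignores the
    ## one-armed `if (cond) expr` form, which a real prototype would also need
    ## to handle.
    `if` <- function(cond, yes, no) {
      value <- cond                      # force the condition promise
      message("if: condition is ", value)
      base::`if`(value, yes, no)         # only the selected branch gets forced
    }

    x <- 2
    if (x > 1) "big" else "small"        # interpreted code now routes through the closure
    rm(`if`)                             # remove the shadowing definition to restore normal `if`

Delegating via base::`if` (rather than writing a bare `if` in the body) avoids recursing into the shadowing definition and keeps the usual lazy evaluation of the branches. Of course this only affects interpreted evaluation; whether compiled code would also honour it is exactly the question below.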


Allowing users to replace the definitions of reserved keywords and
builtins is horribly expensive performance-wise with or without
compilation.  If you look at the compiler package, the way it optimizes
these function calls is by breaking the language spec. See the
beginnings of sections 5 and 6 of Luke's write-up
(http://homepage.stat.uiowa.edu/~luke/R/compiler/compiler.pdf), noting
that the *default* optimization level is 2, at which level, "In
addition to the inlining permitted by Level 1, functions that are
syntactically special or are considered core language functions and
are found via the global environment at compile time may be inlined."

This is an area where a small change to the language spec would impact
essentially no users and would result in a language that could be
executed much more efficiently.

That only breaks the language spec if the compiler fails to detect cases where the optimization is invalid. That may currently be the case (I haven't checked), but it needn't always be. I would much prefer that the compiler code be made smarter about detecting this rather than adding exceptions to the language design.
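
For what it's worth, checking is easy enough to sketch (again, just an illustration; I haven't run this against the current compiler, which is the point):

    ## Quick test of whether byte-compiled code bypasses a user-level
    ## redefinition of `if`.  Whether the second call prints the message
    ## depends on what the compiler's inlining actually does.
    library(compiler)
    old_jit <- enableJIT(0)              # make sure f() itself stays interpreted

    `if` <- function(cond, yes, no) {
      message("user-level `if` was called")
      base::`if`(cond, yes, no)
    }

    f  <- function(x) if (x > 0) "pos" else "neg"
    cf <- cmpfun(f)                      # compiled at the default optimization level (2)

    f(1)                                 # interpreted call: goes through the replacement
    cf(1)                                # compiled call: prints the message only if `if` was not inlined

    rm(`if`)
    enableJIT(old_jit)

If cf(1) stays silent, the compiler inlined `if` despite the global redefinition, which is the case a smarter check would have to detect (including rebindings that happen after compilation, which a purely compile-time check can't see).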

Duncan Murdoch

