Brian,

> If we make functions generic, we rely on package writers implementing the
> documented semantics (and that is not easy to check). That was deemed to be
> too easy to get wrong for var().
Hard to argue with a considered decision, but the alternative facing
increasing numbers of package developers seems to me to be pretty bad too ...

There are two ways a package developer can currently get a function tailored
to their own new class. One is to rely on an existing generic function to
dispatch to their class-specific method, and to write only that method. That
may indeed be hard to check, though I would be inclined to think that is the
package developer's problem, not the core team's. But it has (as far as I
know today ...?) no wider impact.

The other option, when no generic exists, is to mask the original function by
writing a new generic function that respects the original syntax exactly, and
then to implement a fun.default that replicates the original non-generic
function's behaviour, hopefully by calling it directly. As an example,
library(circular) masks stats::var, though I'm fairly sure it's not the only
case. This has obvious disadvantages, including potentially system-wide
(R-wide, at least!) impact and unfavourable interactions between packages
masking each other's generics and defaults. (A rough sketch of both routes is
appended below my signature.)

I will use masking if I have to, at least for my own local use where it's only
me that suffers if (when?) I get it wrong. But the idea makes me very nervous,
especially if I imagine folk who _don't_ get as nervous at the idea. Hence the
feeling that wider use of generics for fundamental and common functions might
make for a safer world.

Steve E
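P.S. For concreteness, a minimal sketch of the two routes in R. The class name
"myclass" and its methods are made up for illustration; this is not the actual
code from circular, just the general shape of each approach.

    ## Route 1: an existing generic -- mean() is already generic in base R --
    ## so the package only needs to supply a class-specific method.
    mean.myclass <- function(x, ...) {
        ## class-specific calculation would go here; placeholder only
        mean(unclass(x), ...)
    }

    ## Route 2: var() is not generic, so the package masks it with a new
    ## generic plus a default method that hands everything straight back to
    ## stats::var, so ordinary data should behave as before.
    var <- function(x, ...) UseMethod("var")

    var.default <- function(x, ...) stats::var(x, ...)

    var.myclass <- function(x, ...) {
        ## class-specific variance would go here; placeholder only
        stats::var(unclass(x), ...)
    }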