On 10/28/2013 06:00 AM, r-devel-requ...@r-project.org wrote:
On 13-10-26 9:49 PM, Simon Urbanek wrote:
> On Oct 25, 2013, at 12:12 PM, Yihui Xie wrote:
>
>> This has been asked so many times that I think it may be a good
>> idea for R CMD check to just stop when the user passes a direct
was deprecated in S-Plus, I think even before R rose to prominence. I vaguely
remember a time when its usage generated a warning.
The fact that I've never noticed this unused routine is somewhat embarrassing. Perhaps I
need a "not documented, never called" addition to R
An unwanted side effect of the new restrictions on abbreviated names.
The anova.coxph command, in a slavish copy of anova.lm etc., returns a data frame with
column labels of
loglik Chisq Df Pr(>|Chi|)
If one tries to extract the final column of the table, errors result since it is not a
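For illustration, a minimal sketch (not part of the original message): a column whose name is not a syntactic R name, such as Pr(>|Chi|), has to be extracted with backticks or a quoted string.

    tab <- data.frame(loglik = -100.2, Chisq = 5.3, Df = 1, check.names = FALSE)
    tab[["Pr(>|Chi|)"]] <- 0.021
    tab$`Pr(>|Chi|)`       # works: backticks quote the non-syntactic name
    tab[, "Pr(>|Chi|)"]    # works: character subscript
    ## tab$Pr(>|Chi|)      # without the backticks this is a parse error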
Another solution is the one used for a long time in the rpart code.
The R code calls "rpart1", which does the work, keeps a static pointer to the object,
does NOT release its memory, and returns the size of the object.
Then the R code allocates appropriate vectors and calls "rpart2", which f
Martin,
1. I'd vote for replacement.
2. The "Sig" argument was completely opaque to me. I'd vote for something
like this
bracket = "fixed" the old behavior, treat the upper and lower as fixed
values
= "search" expand the limits if needed
= "usearch" only the
Martin,
Your suggestion below did the trick. The issue was obvious once this pointed
me to the correct bit of code.
Thanks much.
Terry T.
begin included text ---
trace(loadNamespace, quote(if (package == "survival") recover()))
will break into ?recover when survival is being loaded
I have a debugging environment for the survival package, perhaps unique to me, but I find
it works very well.
To wit, a separate directory with copies of the source code but none of the package
accoutrements of DESCRIPTION, NAMESPACE, etc. This separate space does NOT contain a copy of
src/init.c
I'll be the "anybody" to argue that
} else {
is an ugly kludge which you will never find in my source code. Yes, it's necessary at the
command line because the parser needs help in guessing when an expression is finished, but
is only needed in that case. Since I can hardly imagine using
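A runnable illustration of the parsing point (a sketch, not from the original message): inside a braced function body the parser knows more input is coming, so "else" may start its own line; only at the top level is the first branch treated as a finished expression.

    sgn <- function(x) {
        if (x > 0)
            1
        else
            -1
    }
    sgn(2)   # 1
    ## Typed directly at the command line, "if (x > 0) 1" is already complete,
    ## so a following "else" line is a syntax error -- hence the "} else {" habit.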
Minimizing and then restoring a window that contains identified points
leads to an error on my linux box.
> sessionInfo()
R version 3.0.0 (2013-04-03)
Platform: i686-pc-linux-gnu (32-bit)
locale:
 [1] LC_CTYPE=en_US.UTF-8       LC_NUMERIC=C
 [3] LC_TIME=en_US.UTF-8        LC_COLLATE=C
 [5] LC_MON
On 04/01/2013 12:44 PM, Simon Urbanek wrote:
On Apr 1, 2013, at 1:10 PM, Terry Therneau wrote:
Assume a C program invoked by .Call, that returns a list.
Near the top of the program we allocate space for all the list elements. (It is my habit to use
"xyz2" for the name of the R
Assume a C program invoked by .Call, that returns a list.
Near the top of the program we allocate space for all the list elements. (It is my habit
to use "xyz2" for the name of the R object and "xyz" for the pointer to its contents.)
PROTECT(means2 = allocVector(REALSXP, nvar));
means
The thread is strange to me as well, since completion is logically impossible for my
Sweave files.
- an emacs buffer is open working on an .Rnw file
- there is no copy of R running anywhere on the machine
- the code chunk in the .Rnw file refers to an R data object saved somewhere
els
On 03/21/2013 10:00 AM, Simon Urbanek wrote:
I would think that the ability to hit the Tab key to trigger name
>> completion in your R GUI makes partial matching almost useless. The
>> advantage of interactive completion in the GUI is that you immediately
>> see the result of the partial matc
Note: My apologies for the "Subject" in the original post
On 03/21/2013 08:59 AM, Milan Bouchet-Valat wrote:
Le jeudi 21 mars 2013 à 08:51 -0500, Terry Therneau a écrit :
I am not in favor of the change, which is a choice of rigor over usability.
When I am developing code or functio
As a follow-up to my previous, let me make a concrete suggestion:
Add this as one of the options
df-partial-match = allowed, warn, fail
Set the default to warn for the current R-dev, and migrate it to fail at a
later date of
your choosing.
I expect that this is very little more work fo
trates on providing you with power and on not getting in
your way."
Preface to "The C Book", M Banahan et al
Terry Therneau
On 03/21/2013 06:00 AM, r-devel-requ...@r-project.org wrote:
> Allowing partial matching on $-extraction has always been a source of
> accidents.
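Not the df-partial-match option proposed above, but related switches that already exist in base R (an illustrative sketch): options() can be told to warn whenever $ or argument partial matching is used.

    options(warnPartialMatchDollar = TRUE,
            warnPartialMatchArgs   = TRUE,
            warnPartialMatchAttr   = TRUE)
    x <- list(value = 1:3)
    x$val    # still returns x$value, but now with a partial-match warning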
Thanks Martin. I had already changed the second argument to getOption.
By the way, per an off list comment from Brian R the bug I was addressing won't affect
anyone using R as shipped; the default decimal separator is "." whatever the region. It
only bit those who set the OutDec option themse
nique time points. At the very
end the numeric component "time" of the result is created using
as.numeric(levels(ftime)). It's this last line that breaks.
I could set the OutDec option within survfit and reset it when I leave using on.exit(). Any
other simple solutions? Any other
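A minimal sketch of that on.exit() idea (the function f is illustrative, not survfit itself):

    f <- function(x) {
        old <- options(OutDec = ".")   # force "." while this function runs
        on.exit(options(old))          # restore the user's setting, even on error
        format(x)
    }
    options(OutDec = ",")
    f(pi)        # "3.141593" in spite of the "," setting
    format(pi)   # "3,141593" -- the user's option is untouched
    options(OutDec = ".")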
I think it would be a good idea. Several versions of the survival package had a duplicate
line in the S3methods, and were missing a line that should have been there, due to a
cut/paste error.
Terry T.
On 03/13/2013 06:00 AM, r-devel-requ...@r-project.org wrote:
Circa 80 CRAN and core-R pac
That was the correct direction: I changed the earlier line to "routines <- list(Ccoxfit5a,
..." and the later one to .C(routines[[1]]) and now it works as desired.
Terry T.
On 02/23/2013 03:09 AM, Duncan Murdoch wrote:
On 13-02-22 2:59 PM, Terry Therneau wrote:
I'm work
I'm working on registering all the routines in the survival package, per a request from
R-core. Two questions:
1. In the coxph routine I have this type of structure:
if (survival has 2 columns) routines <- c("coxfit5_a", "coxfit5_b", "coxfit5_c")
else routines
Brian,
I used termplot(..., plot=FALSE) recently in R-devel: works like a charm. Thanks much
for the update.
Our in-house "gamterms" function, which this obviates, would also return the "constant"
attribute from the underlying predict(..., type="terms") call. I have occasionally found
t
Peter,
I had an earlier response to Duncan that I should have copied to the list.
The subset issue can be fixed. When the model changes character to factor, it needs to
remember the levels, just as it does with factors. We are simply seeing a reprise
of problems that occurred when mode
don't think that's going to
happen. The reason for the difference is that the subsetting is done before the
conversion to a factor, but I think that is unavoidable without really big changes.
Duncan Murdoch
Bill Dunlap
Spotfire, TIBCO Software
wdunlap tibco.com
> -----Origin
rk. As long as there is
an option that can be overridden I'm okay. Yes, I'd prefer FALSE as the default, partly
because the current value is a tripwire in the hallway that eventually catches every new user.
Terry Therneau
On 02/11/2013 05:00 AM, r-devel-requ...@r-project.org wrote:
Bot
of table. We've been overriding both of these for 10+ years.
Terry Therneau
In a real example I was trying to remove the class from the result of table, just because
it was to be used as a building block for other things and a simple integer vector seemed
likely to be most efficient.
I'm puzzled as to why unclass doesn't work.
> zed <- table(1:5)
> class(zed)
[1] "table"
I often run R CMD check on my package source. I've noted of late that this process gives
warning messages about
files listed in my .Rbuildignore file, whereas it once ignored these.
Terry T.
1. A Surv object is a matrix with some extra attributes. The as.matrix.Surv function
removes the extras but otherwise leaves it as is.
2. The last several versions of the survival library were accidentally missing the
S3method('as.matrix', 'Surv') line from their NAMESPACE file. (Instead it's
> attr(survival:::as.matrix.Surv(ytest), 'type')
NULL
> attr(as.matrix.default(y2), 'type')
[1] "right"
Context: In testing the next survival release (2.37), it has lost this "special"
behavior. One package that depends on survival expects this behavi
stpkg not
found". Running ls() at that point shows a set of variables I didn't define, and no
evidence of any of "testpkg", "survdep", or anything else I'd defined. My prompt has been
changed to "R>" as well.
Any idea what is happening?
Terry
I have a suggested addition to termplot.
We have a local mod that is used whenever none of the termplot options is quite right. It
is used here almost daily for Cox models in order to put the y axis on a risk scale:
fit <- coxph(Surv(time, status) ~ ph.ecog + pspline(age), data=lung)
zz <
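A sketch of that kind of local modification (illustrative code, not the actual in-house version): take the plot=FALSE value of termplot and redraw the pspline(age) term on a relative-risk scale.

    library(survival)
    fit <- coxph(Surv(time, status) ~ ph.ecog + pspline(age), data = lung)
    zz  <- termplot(fit, se = TRUE, plot = FALSE)[[2]]   # the pspline(age) term
    zz  <- zz[order(zz$x), ]
    plot(zz$x, exp(zz$y), type = "l", log = "y",
         xlab = "Age", ylab = "Relative risk")
    lines(zz$x, exp(zz$y + 1.96 * zz$se), lty = 2)
    lines(zz$x, exp(zz$y - 1.96 * zz$se), lty = 2)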
Look at rowsum. It's pretty fast C code.
Terry T
On 10/03/2012 05:00 AM, r-devel-requ...@r-project.org wrote:
Hi,
I'm looking for a super-duper fast mean/sum binning implementation
available in R, and before implementing z = binnedMeans(x, y) in native
code myself, does anyone know of an exis
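A minimal sketch of the rowsum() approach suggested above (binnedMeans is an illustrative name, not an existing function):

    binnedMeans <- function(x, y, breaks) {
        bin    <- cut(x, breaks)                         # assign each x to a bin
        sums   <- rowsum(y, bin)                         # per-bin sums, computed in C
        counts <- as.vector(table(bin)[rownames(sums)])  # counts, aligned to the sums
        sums[, 1] / counts
    }
    set.seed(1)
    x <- runif(1000); y <- x^2 + rnorm(1000, sd = 0.1)
    binnedMeans(x, y, breaks = seq(0, 1, by = 0.1))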
ource. Ok
2. I'll check the tarball soon, but I'm guessing you are right about the
error going
away.
On 09/20/2012 12:57 PM, Duncan Murdoch wrote:
On 20/09/2012 1:43 PM, Terry Therneau wrote:
I'm touching up changes to rpart and have a question with .Rbuildignore. H
I'm touching up changes to rpart and have a question with .Rbuildignore. Here
is my file
tmt1014% more .Rbuildignore
test.local
\.hg
src/print_tree.c
The source code included a module "print_tree.c", used for debugging. Commented-out
calls to it can be found here and there. I want to leave i
On 09/19/2012 09:22 AM, Duncan Murdoch wrote:
I understand the issue with time constraints on checks, and I think there are
discussions
in progress about that. My suggestion has been to put in place a way for a
tester to say
that checks need to be run within a tight time limit, and CRAN as t
essentially what is currently being forced on us, I can do such mischief as
easily as under number 1.
Terry Therneau
Some questions motivated by this discussion.
From the CRAN policy page:
"Checking the package should take as little CPU time as possible, as the CRAN check farm
is a very limited resource and there are thousands of packages. Long-running tests and
vignette code can be made optional for checking
On 09/04/2012 01:57 PM, Duncan Murdoch wrote:
On 04/09/2012 2:36 PM, Warnes, Gregory wrote:
On 9/4/12 8:38 AM, "Duncan Murdoch" wrote:
>On 04/09/2012 8:20 AM, Terry Therneau wrote:
>>
>> On 09/04/2012 05:00 AM, r-devel-requ...@r-project.org wrote:
>> > The
On 09/04/2012 05:00 AM, r-devel-requ...@r-project.org wrote:
The issue is not just about "CRAN" vs "off CRAN".
It is good to think about a more general scheme of
"light testing" vs "normal testing" vs "extensive testing",
e.g., for the situation where the package implements
(simulation/bootstr
el-boun...@r-project.org [mailto:r-devel-boun...@r-project.org] On
Behalf
Of Terry Therneau
Sent: Thursday, August 02, 2012 6:10 AM
To: r-devel@r-project.org; Nathaniel Smith
Subject: Re: [Rd] Rd] Numerics behind splineDesign
On 08/02/2012 05:00 AM, r-devel-requ...@r-project.org wrote:
Now I just
us have an expectation).
Terry Therneau
Note that the survConcordance function, which is equivalent to Kendall's
tau, also is O(n log n) and it does compute a variance. The variance
is about 4/5 of the work.
Using R 2.15.0 on an older Linux box:
> require(survival)
> require(pcaPP)
> tfun <- function(n) {
+ x <- 1:n + runif(n
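A rough sketch of the kind of timing comparison the truncated transcript above was building, assuming the survival and pcaPP packages are installed (the exact test data are illustrative):

    library(survival)
    library(pcaPP)
    tfun <- function(n) {
        x <- 1:n + runif(n)
        y <- rev(1:n) + runif(n)
        c(concordance = system.time(survConcordance(Surv(x) ~ y))["elapsed"],
          kendall     = system.time(cor.fk(x, y))["elapsed"])
    }
    tfun(10000)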
On 06/05/2012 11:32 AM, Prof Brian Ripley wrote:
On 05/06/2012 16:17, Terry Therneau wrote:
I was looking at how the model.frame method for lm works and comparing
it to my own for coxph.
The big difference is that I try to retain xlevels and predvars
information for a new model frame, and lm
I was looking at how the model.frame method for lm works and comparing
it to my own for coxph.
The big difference is that I try to retain xlevels and predvars
information for a new model frame, and lm does not.
I use a call to model.frame in predict.coxph, which is why I went that
route, but nev
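A minimal sketch of the bookkeeping being described, using lm() so that it is self-contained; predict.coxph needs the same pieces (the particular model and newdata are illustrative):

    fit     <- lm(mpg ~ factor(cyl) + poly(disp, 2), data = mtcars)
    newdata <- mtcars[1:3, ]
    Terms <- delete.response(terms(fit))
    xlev  <- .getXlevels(Terms, model.frame(fit))   # factor levels seen at fit time
    mf    <- model.frame(Terms, data = newdata, xlev = xlev)
    ## attr(Terms, "predvars") records e.g. the poly() coefficients, so the new
    ## frame reproduces the original basis instead of recomputing it from newdata
    mm    <- model.matrix(Terms, mf)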
x and I'll look into it
further. My first foray failed because what I want (I think) is the
environment that they had when the first ">" came up. I tried baseenv
in a spot, but then my code couldn't find the model.frame function.
Terry T.
On 05/17/2012 05:00 AM
I've been tracking down a survival problem from R-help today. A short
version of the primary issue is reconstructed by the following simple
example:
library(survival)
attach(lung)
fit <- coxph(Surv(time, status) ~ log(age))
predict(fit, newdata=data.frame(abe=45))
Note the typo in the last li
ried again.
Now it works! And I have no idea what could have prompted the
change. It will remain a mystery, since I'm not going to try to bring
the error back. :-)
Terry T.
On 05/14/2012 01:19 PM, Duncan Murdoch wrote:
On 14/05/2012 1:28 PM, Terry Therneau wrote:
I'm having a
I'm having a problem rebuilding a package, new to me in R 2.15.0
(Linux). It hits all files that contain the line
\usepackage[pdftex]{graphics}
and leads to the following when running R CMD check on the directory.
(I do this often; a final run on the tar.gz file will happen before
submission.)
Sinc
In that particular example the value of "4" was pulled out of the air.
There is no particular justification.
There is a strong relationship between the "effective" degrees of
freedom and the variance of the random effect, and I often find the df
scale easier to interpret. See the Hodges and
On 04/12/2012 02:15 AM, Uwe Ligges wrote:
On 12.04.2012 01:16, Paul Gilbert wrote:
On 12-04-11 04:41 PM, Terry Therneau wrote:
Context: R2.15-0 on Ubuntu.
1. I get a WARNING from CMD check for "Package vignette(s) without
corresponding PDF:
In this case the vignettes directory had bot
hat takes just a little short of forever to run.
3. Do these unprocessed vignettes also contribute to the index via
\VignetteIndexEntry lines, or will I need to create a custom index?
Terry Therneau
I'd like to chime in on the subject of vignette checks.
I have one vignette in the coxme library that would be better described
as a white paper. It discusses the adequacy of the Laplace approximation
under various scenarios. It contains some substantial computations, so
I'd like to mark it as "ne
9/100 of the "no visible binding" messages I've
seen over the years were misspelled variable names, and the message is a
very welcome check.
Terry Therneau
Brian & Duncan:
Thanks. This was exactly what I needed to know.
Terry
On 03/27/2012 08:41 AM, Prof Brian Ripley wrote:
On 27/03/2012 14:22, Terry Therneau wrote:
I received the following note this AM. The problem is, I'm not quite
sure how to fix it.
Can one use PROTECT(cox
I received the following note this AM. The problem is, I'm not quite
sure how to fix it.
Can one use PROTECT(coxlist(eval(PROTECT , do I create an
intermediate variable, or otherwise?
I'm willing to update the code if someone will give me a pointer to the
right documentation. This partic
On 03/27/2012 02:05 AM, Prof Brian Ripley wrote:
On 19/03/2012 17:01, Terry Therneau wrote:
R version 2.14.0, started with --vanilla
> table(c(1,2,3,4,NA), exclude=2, useNA='ifany')
1 3 4
1 1 1 2
This came from a local user who wanted to remove one particular response
from som
On 03/22/2012 11:03 AM, peter dalgaard wrote:
Don't know how useful it is any more, but back in the days, I gave this talk in
Vienna
http://www.ci.tuwien.ac.at/Conferences/useR-2004/Keynotes/Dalgaard.pdf
Looking at it now, perhaps it moves a little too quickly into the hairy stuff.
On the oth
On 03/22/2012 09:38 AM, Simon Urbanek wrote:
>
> On Mar 22, 2012, at 9:45 AM, Terry Therneau <thern...@mayo.edu> wrote:
>
>>
>>>>> strongly disagree. I'm appalled to see that sentence here.
>>>> >
>>>> > Come o
>>> strongly disagree. I'm appalled to see that sentence here.
>> >
>> > Come on!
>> >
>>> >> The overhead is significant for any large vector and it is in
>>> >> particular unnecessary since in .C you have to allocate *and copy* space
>>> >> even for results (twice!). Also it is very er
R version 2.14.0, started with --vanilla
> table(c(1,2,3,4,NA), exclude=2, useNA='ifany')
1 3 4
1 1 1 2
This came from a local user who wanted to remove one particular response
from some tables, but also wants to have NA always reported for data
checking purposes.
I do
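One possible workaround for the table() behaviour described above (a sketch, not from the thread): drop the unwanted value from the data first, so that useNA still applies.

    x <- c(1, 2, 3, 4, NA)
    table(x[x != 2 | is.na(x)], useNA = "ifany")   # 2 removed, NA still reported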
Does .C duplicate unnecessary arguments? For instance
fit <- .C("xxx", as.integer(n), x, y, z=double(15))
The first and fourth arguments would have NAMED = 0. Is my guess that .C
won't make yet one more (unnecessary) copy correct?
(Just trying to understand).
Terry T
Three things -
My original question to R-help was "who do I talk to". That was
answered by Brian R, and the discussion of how to change Sweave moved
offline. FYI, I have a recode in hand that allows arbitrary reordering
of chunks; but changes to code used by hundreds need to be approached
ca
that implements them, an annotated graph of the call tree next to the
section parsing a formula, etc. This is stuff that doesn't fit in
comment lines. The text/code ratio is >1. On the other hand I've
thought very little about integration of manual pages and description
files with the co
Yes, it was glaring and obvious: I had the label "description" a second
time when I really meant "details".
Still, I had to delete sections of the file 1 by 1 until it slapped me
in the face. Sorry for any bother.
Terry T.
mething that will be glaringly obvious once it's
pointed out, but without a line number I can't seem to find it. I've
been counting braces but don't see a mismatch.
FYI, the file is below. (It is modeled on chol.Rd from the Matrix
package.)
Terry Therneau
---
Duncan's reply to my query
> Now R CMD check claims that I need Rd pages for backsolve and
> backsolve.default. I don't think I should rewrite those.
> How do I sidestep this and/or
> what other manuals should I read?
Even though your change is subtle, I'd say it's still a change
(back
Sorry to forget this
> sessionInfo()
R version 2.14.0 (2011-10-31)
Platform: x86_64-unknown-linux-gnu (64-bit)
locale:
 [1] LC_CTYPE=en_US.UTF-8       LC_NUMERIC=C
 [3] LC_TIME=en_US.UTF-8        LC_COLLATE=C
 [5] LC_MONETARY=en_US.UTF-8    LC_MESSAGES=en_US.UTF-8
[
port list.
Now R CMD check claims that I need Rd pages for backsolve and
backsolve.default. I don't think I should rewrite those.
How do I sidestep this and/or
what other manuals should I read?
Perhaps do setMethod("backsolve", signature(r="ANY"),
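A sketch along those lines (assumed code, not from the thread), keeping base::backsolve as the fallback so existing callers and the base documentation are untouched:

    setGeneric("backsolve")   # implicit generic built from base::backsolve
    setMethod("backsolve", signature(r = "ANY"),
              function(r, x, k = ncol(r), upper.tri = TRUE, transpose = FALSE)
                  base::backsolve(r, x, k, upper.tri, transpose))
    ## ordinary use still works:
    r <- chol(crossprod(matrix(rnorm(9), 3)))
    backsolve(r, 1:3)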
Thank you both for the nice explanation. I added "digits=4" to my
print statements to shorten the display.
Mixed effects Cox models can have difficult numerical issues, as it
turns out; I've added this to my collection of things to watch for.
Terry Therneau
On Sat, 2011
ne is running 64 bit Unix (CentOS) and the home one 32 bit
Ubuntu.
Could this be enough to cause the difference? Most of my tests are
based on all.equal, but I also print out 1 or 2 full solutions; perhaps
I'll have to modify that?
Terry Therneau
I like the idea of making the functions local, and will pursue it.
This issue has bothered me for a long time -- I had real misgivings when
I introduced "cluster" to the package, but did not at that time see any
way other than making it global.
I might make this change soon in the ridge functio
On Fri, 2011-11-25 at 10:42 -0500, Michael Friendly wrote:
> Duncan provided one suggestion: make ridge() an S3 generic, and
> rename ridge()
> to ridge.coxph(), but this won't work, since you use ridge() inside
> coxph() and survreg() to add a penalty term in the model formula.
> Another idea m
forth between how it seems that a formula should
work, and how it actually does work, sometimes leaves my head
spinning.
Terry T.
Terry Therneau
for ridge, cluster, pspline, and frailty, all of which depend deeply
on a coxph context. It would also solve a frailty() problem of long
standing, that when used in survreg only a subset of the frailty options
make sense; this is documented in the help file but catches users again
and again.
Terry
the Makefile; I've recouped that effort many times over.
Terry Therneau
Two small Sweave issues.
1. I had the following line in my code
<>
resulting in the message
Error in match.arg(options$results, c("verbatim", "tex", "hide")) :
'arg' should be one of “verbatim”, “tex”, “hide”
I puzzled on this a bit since my argument exactly matched the message,
until I
On Thu, 2011-10-06 at 10:00 -0400, Kasper Daniel Hansen wrote:
> if you're using two packages that both define a diag function/method
> you absolutely _have_ to resolve this using your NAMESPACE. [Update:
> I see both are methods. I actually don't know what happens when you
> have the same gener
The current coxme code has functions that depend on bdsmatrix and others
that depend on Matrix; both of those packages define S4 methods for diag.
When loaded, the message appears:
replacing previous import ‘diag’ when loading ‘Matrix’
Questions:
1. Do I need to worry about this? If so, what c
ay from
it.
Thanks again for the useful comments.
Terry Therneau
On Mon, 2011-10-03 at 12:31 -0400, Simon Urbanek wrote:
> > Thanks. I was hoping that x[,sorted] would act like "double(n)"
> does in a .C call, and have no extra copies made since it has no local
> assignment.
>
> Yes it does act the same way, you get an extra copy with double(n) as
> well - th
I'm looking at memory efficiency for some of the survival code. The
following fragment appears in coxph.fit
coxfit <- .C("coxfit2", iter=as.integer(maxiter),
as.integer(n),
as.integer(nvar), stime,
sstat,
x= x[sorted,
lem -- when I do that the error is quick and obvious.
Thanks in advance for any pointers.
Terry T.
On Sat, 2011-07-16 at 19:27 +0200, Uwe Ligges wrote:
>
> On 15.07.2011 23:23, Terry Therneau wrote:
> > I have library in development with a function that works when called
>
'x' must be an array of at least two dimensions
Adding a print statement above the rowSums call shows that the argument
is a 14 by 14 dsCMatrix.
I'm happy to send the library to anyone else to try and duplicate.
Terry Therneau
tmt% R --vanilla
> sessionInfo()
R version 2.13
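A guess at what is going on (an assumption, not the thread's resolution): the message is the one base::rowSums gives, which suggests the Matrix method is not being reached; a fully qualified call, or importing the generic from Matrix in the package NAMESPACE, avoids it.

    library(Matrix)
    m <- Matrix(c(2, 1, 1, 2), 2, 2, sparse = TRUE)   # a small symmetric sparse matrix
    class(m)               # "dsCMatrix"
    Matrix::rowSums(m)     # the Matrix method handles dsCMatrix
    ## base::rowSums(m)    # reaches the base function and gives the error above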
l R exit
3: exit R without saving workspace
4: exit R saving workspace
Selection: 3
tmt712% ls -s test.rda
2664 test.rda
-----
The data set is too large to attach, but I can send the test.rda file
off list. The data is not confidential.
Terry Therneau
ld use bilirubin=1.0 (upper limit of normal) and AST =
45 when the data set is one of my liver transplant studies?
Frank Harrell would argue that his "sometimes misguided" default in
cph is better than the "almost always wrong" one in coxph though, and
there is certainly some
The replies so far have helped me see the issues more clearly.
Further comments:
1. This issue started with a bug report from a user:
library(survival)
fform <- as.formula(Surv(time, status) ~ age)
myfun <- function(dform, ddata) {
    predict(coxph(dform, data=ddata), newdata=ddata)
}
Gabor
s is not a complaint). The inability to
look elsewhere, however, has stymied my efforts to fix the scoping problem
in predict.coxph, unless I drop the env(formula) argument altogether.
But I assume there must be good reasons for its inclusion and am
reluctant to do so.
Terry Therneau
>
An addition to my prior post: my option 3 is not as attractive as
I thought.
In several cases predict.coxph needs to reconstruct the original data
frame, for things that were not saved in the coxph object. The half
dozen lines to redo the original call with the same data, na.action, etc.
options
On Fri, 2011-04-15 at 09:10 +0200, peter dalgaard wrote:
> I couldn't reproduce it from Terry's description either, but there
> _is_ an issue which parallels the
I'll start with an apology: as a first guess to understand the problem
with predict.coxph I tried something parallel to Ivo's exam
On Wed, 2011-04-13 at 16:45 -0400, Simon Urbanek wrote:
> We have no details, but my wild guess would be that you did not
> re-build the package for 2.13.0 and you have static libR in 2.13.0 yet
> dynamic in 2.12.2.
>
> Cheers,
> Simon
>
Per my prior note, my guess at the root of the issue is u
On Wed, 2011-04-13 at 15:32 -0500, Dirk Eddelbuettel wrote:
> Terry,
>
> You replied to
>
> From: Terry Therneau
> To: Dirk Eddelbuettel
> Cc: c...@r-project.org
> Subject: Re: [Rd] Problem with dyn.load in R 2.13.0 -- the real problem
>
> but dropped
> dyn.load('survival.so')
Error in dyn.load("survival.so") :
unable to load shared object
'/people/biostat2/therneau/research/surv/Rtest/survival.so':
libR.so: cannot open shared object file: No such file or directory
> q()
--
Is the issue that
coxme, ...
Terry Therneau
nction (with my blessing), and
ceased to label it as part of "survival" in the manual.
This "metabug" can't be laid at R's feet.
Terry Therneau
Survfit had a bug in some prior releases due to the use of both
unique(times) and table(times); I fixed it by rounding to 15 digits per
the manual page for as.character. Yes, I should ferret out all the
usages instead, but this was fast and it cured the user's problem.
The bug is back! A data
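A small illustration of the unique()/table() mismatch behind that bug (a constructed example, not the user's data): table() groups values by their character form, which carries only 15 significant digits, while unique() compares the full doubles.

    x <- c(1/3, 1/3 + 1e-16)   # two doubles that differ only past 15 digits
    length(unique(x))          # 2: the numeric values are not identical
    length(table(x))           # 1: as.character() maps both to the same label
    unique(signif(x, 15))      # rounding to 15 digits makes the two views agree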
On Mon, 2011-03-14 at 12:52 -0400, John Fox wrote:
> Dear Terry,
>
> Possibly I'm missing something, but since the generic drop1() doesn't have a
> test argument, why is there a problem?
>
> > args(drop1)
> function (object, scope, ...)
>
> If you use match.arg() against test, then the error
F. All of them reference
a chi-square distribution. My thought is to use these arguments, and add an
error message "read the help file for drop1.coxph" when the defaults appear.
Any better suggestions?
Terry Therneau
Simon pointed out that the issue I observed was due to internal
behaviour of unique.matrix.
I had looked carefully at the manual pages before posting the question
and this was not mentioned. Perhaps an addition could be made?
Terry T.
I stumbled onto this working on an update to coxph. The last 6 lines
below are the question, the rest create a test data set.
tmt585% R
R version 2.12.2 (2011-02-25)
Copyright (C) 2011 The R Foundation for Statistical Computing
ISBN 3-900051-07-0
Platform: x86_64-unknown-linux-gnu (64-bit)
# Lin