gs, and
certainly aren't going away. But to quote the bard "Oh what tangled
webs we weave, when first we practice to deceive."
Terry Therneau
__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
A bug in the survival routines was reported to me today. The root cause
is a difference between table, unique, and sort.
> temp <- rep(c(1, sqrt(2)^2, 2), 1:3)
> unique(temp)
[1] 1 2 2
> table(temp)
temp
1 2
1 5
I'm using 2.10 on Linux, the user reported from 2.9 on Windows.
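A minimal sketch of the mechanism, using the same vector as above: sqrt(2)^2 differs from 2 in the last bit, so unique() sees three distinct doubles, while table() groups values by their printed form, which shows at most 15 significant digits.

```r
# Same pathological vector as in the post above
temp <- rep(c(1, sqrt(2)^2, 2), 1:3)

sqrt(2)^2 == 2           # FALSE: the two values differ by one ulp
length(unique(temp))     # 3 distinct doubles
length(table(temp))      # 2 categories: as.character() collapses the last two
```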
1. Minor issu
mber
these .Rnw files are the source of my .R files and of document
explaining the code, not the source of vignettes or manual pages.
Terry Therneau
tats
methods
[8] base
Editorial comment: I said "deficiency" not "bug" above, as I'm not so
sure how good a model with 75 variables might be in the first place.
(Though in this case the user is pretty savvy.)
Terry Therneau
I've changed to Mercurial for my working copies of survival, for a number of
reasons not relevant to this post. When I do R CMD check, I get some warnings
about certain files in the .hg directory with odd names. I've added the
following 2 lines to my .Rbuildignore file without effect
^\.hg$
^\.
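One workaround, assuming the usual layout where the package source sits in a directory named survival: .Rbuildignore is honored by R CMD build but not by R CMD check run on a bare source directory, so checking the built tarball sidesteps the .hg warnings.

```
R CMD build survival          # build honors .Rbuildignore; .hg is excluded
R CMD check survival_*.tar.gz # check the tarball rather than the directory
```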
l generate this with just exactly the right call to
pyears, but I'll be boarding up that final door in 2.36-2.)
Terry Therneau
not found
Enter a frame number, or 0 to exit
1: tfun(tdata)
2: survexp.test(zed ~ 1, data = mydata)
3: eval(m, parent.frame())
4: eval(expr, envir, enclos)
5: model.frame(formula = "zed ~ 1+age+sex+year", data = mydata)
6: model.frame.default(fo
Gabor wrote:
At the above statement you have lost the environment of your formula.
>m$formula <- tform
Replace this with:
m$formula <- as.formula(tform, environment(formula))
--
No, I have not "lost" an environment. I manufactured a formula which
lacked something ne
Kevin,
The answer came from Gabor -- the model.frame function has a
non-standard evaluation, in that it uses the environment attached to the
formula as the "enclosure" for looking up variable names.
This is clearly documented and I somehow missed it when reading the
page. So "reading deficit"
Ben raised an interesting point: "A better question might be how
packages get added to the *recommended*
package list (rather than how code gets added to "base R")."
As maintainer of survival one of the surprising things is the number
of packages that depend on mine. This has caused me to chang
tion{Changes in version 2.36-3}{
\itemize{
etc
and I get "cannot extract version info from the following section titles" for
all of them. I must be missing something simple.
Perhaps these two points could be clarified further in the man
I forgot to state the version of R in my last message.
R version 2.12.1 (2010-12-16)
Copyright (C) 2010 The R Foundation for Statistical Computing
ISBN 3-900051-07-0
Platform: i686-pc-linux-gnu (32-bit)
Survival version 2.36-3 (not yet on CRAN).
fix this, ere I complain. I discovered it running CMD check on a
package update.
Any pointers?
Terry Therneau
This arose when working on an addition to coxph, which has the features
that the X matrix never has an intercept column, and we remove strata()
terms before computing an X matrix. The surprise: when a terms object
is subset the intercept attribute is turned back on.
My lines 2 and 3 below were
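A minimal sketch of the surprise, with made-up variable names rather than the coxph internals: subscripting a terms object rebuilds it from a formula, and the rebuilt object has intercept = 1 again.

```r
tt <- terms(y ~ x1 + x2 + x3)
attr(tt, "intercept") <- 0    # ask for no intercept column
tt2 <- tt[-1]                 # drop the first term
attr(tt2, "intercept")        # subscripting turned the intercept back on
```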
... OK
etc.
The survival package has enough test scripts that it exceeds my
terminal's scroll bar; I have to either watch closely or run
R CMD check survival >& mylog
Terry Therneau
On Mon, 2011-02-28 at 16:57 +, Prof Brian Ripley wrote:
> Unfortunately it would need a major rewrite, and either piping output
> through a pager (surely the standard Unix way to handle this) or
> redirecting to a file is the simplest way to do this.
>
> R CMD check calls a process to run .r
>> This affects _many_ *.Rout.save checks in packages.
I assume this is in the R-devel branch.
I've got an addition to survival nearly ready to go (faster concordance
calculation). At what point should I switch over to the newer
version, fix up my .out files etc, to best mesh with th
coxph,
etc in this way, where the meaning is local to the enclosing function.
Terry Therneau
I stumbled onto this working on an update to coxph. The last 6 lines
below are the question, the rest create a test data set.
tmt585% R
R version 2.12.2 (2011-02-25)
Copyright (C) 2011 The R Foundation for Statistical Computing
ISBN 3-900051-07-0
Platform: x86_64-unknown-linux-gnu (64-bit)
# Lin
Simon pointed out that the issue I observed was due to internal
behaviour of unique.matrix.
I had looked carefully at the manual pages before posting the question
and this was not mentioned. Perhaps an addition could be made?
Terry T.
F. All of them reference
a chi-square distribution. My thought is use these arguments, and add an
error message "read the help file for drop1.coxph" when the defaults appear.
Any better suggestions?
Terry Therneau
On Mon, 2011-03-14 at 12:52 -0400, John Fox wrote:
> Dear Terry,
>
> Possibly I'm missing something, but since the generic drop1() doesn't have a
> test argument, why is there a problem?
>
> > args(drop1)
> function (object, scope, ...)
>
> If you use match.arg() against test, then the error
Survfit had a bug in some prior releases due to the use of both
unique(times) and table(times); I fixed it by rounding to 15 digits per
the manual page for as.character. Yes, I should ferret out all the
usages instead, but this was fast and it cured the user's problem.
The bug is back! A data
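A sketch of the rounding repair described above, reading "rounding to 15 digits" as significant digits via signif(): after forcing the values through 15 significant digits, unique() and table() agree.

```r
x <- rep(c(1, sqrt(2)^2, 2), 1:3)   # same pathological vector as before
x15 <- signif(x, 15)                # 15 significant digits, matching as.character()
length(unique(x15))                 # now 2
length(table(x15))                  # 2: the two functions agree again
```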
nction (with my blessing), and
ceased to label it as part of "survival" in the manual.
This "metabug" can't be laid at R's feet.
Terry Therneau
coxme, ...
Terry Therneau
')
Error in dyn.load("survival.so") :
unable to load shared object
'/people/biostat2/therneau/research/surv/Rtest/survival.so':
libR.so: cannot open shared object file: No such file or directory
> q()
--
Is the issue that
On Wed, 2011-04-13 at 15:32 -0500, Dirk Eddelbuettel wrote:
> Terry,
>
> You replied to
>
> From: Terry Therneau
> To: Dirk Eddelbuettel
> Cc: c...@r-project.org
> Subject: Re: [Rd] Problem with dyn.load in R 2.13.0 -- the real problem
>
> but dropped
On Wed, 2011-04-13 at 16:45 -0400, Simon Urbanek wrote:
> We have no details, but my wild guess would be that you did not
> re-build the package for 2.13.0 and you have static libR in 2.13.0 yet
> dynamic in 2.12.2.
>
> Cheers,
> Simon
>
Per my prior note, my guess at the root of the issue is u
On Fri, 2011-04-15 at 09:10 +0200, peter dalgaard wrote:
> I couldn't reproduce it from Terry's description either, but there
> _is_ an issue which parallels the
I'll start with an apology: as a first guess to understand the problem
with predict.coxph I tried something parallel to Ivo's exam
An addition to my prior post: my option 3 is not as attractive as
I thought.
In several cases predict.coxph needs to reconstruct the original data
frame, for things that were not saved in the coxph object. The half
dozen lines to redo the original call with the same data, na.action, etc
options
s is not a complaint). The inability to
look elsewhere however has stymied my efforts to fix the scoping problem
in predict.coxph, unless I drop the env(formula) argument altogether.
But I assume there must be good reasons for its inclusion and am
reluctant to do so.
Terry Therneau
>
The replies so far have helped me see the issues more clearly.
Further comments:
1. This issue started with a bug report from a user:
library(survival)
fform <- as.formula(Surv(time, status) ~ age)
myfun <- function(dform, ddata) {
    predict(coxph(dform, data = ddata), newdata = ddata)
}
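For completeness, one version of the fix this discussion leads to (a sketch, keeping the function and argument names from the report, and assuming the survival package's lung data): resetting the formula's environment to the function's own frame lets predict() find ddata when it re-evaluates the model call.

```r
library(survival)

myfun <- function(dform, ddata) {
    environment(dform) <- environment()  # make ddata visible to predict()
    fit <- coxph(dform, data = ddata)
    predict(fit, newdata = ddata)
}
p <- myfun(Surv(time, status) ~ age, lung)  # no "object not found" error
```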
Gabor
ld use bilirubin=1.0 (upper limit of normal) and AST =
45 when the data set is one of my liver transplant studies?
Frank Harrell would argue that his "sometimes misguided" default in
cph is better than the "almost always wrong" one in coxph though, and
there is certainly some
l R exit
3: exit R without saving workspace
4: exit R saving workspace
Selection: 3
tmt712% ls -s test.rda
2664 test.rda
-----
The data set is too large to attach, but I can send the test.rda file
off list. The data is not confidential.
Terry Therneau
'x' must be an array of at least two dimensions
Adding a print statement above the rowSums call shows that the argument
is a 14 by 14 dsCMatrix.
I'm happy to send the library to anyone else to try and duplicate.
Terry Therneau
tmt% R --vanilla
> sessionInfo()
R version 2.13
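A self-contained reconstruction of the error (my assumed cause, not the original library): base::rowSums() insists on an ordinary array, while the Matrix package supplies an S4 method for sparse classes such as dsCMatrix, so calling the base version directly reproduces the message.

```r
library(Matrix)

m <- Matrix(c(2, 1, 1, 2), 2, 2, sparse = TRUE)  # symmetric sparse matrix
rowSums(m)                # Matrix's S4 method handles it
try(base::rowSums(m))     # "'x' must be an array of at least two dimensions"
```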
lem -- when I do that the error is quick and obvious.
Thanks in advance for any pointers.
Terry T.
On Sat, 2011-07-16 at 19:27 +0200, Uwe Ligges wrote:
>
> On 15.07.2011 23:23, Terry Therneau wrote:
> > I have library in development with a function that works when called
>
I'm looking at memory efficiency for some of the survival code. The
following fragment appears in coxph.fit
coxfit <- .C("coxfit2", iter = as.integer(maxiter),
             as.integer(n),
             as.integer(nvar), stime,
             sstat,
             x = x[sorted,
On Mon, 2011-10-03 at 12:31 -0400, Simon Urbanek wrote:
> > Thanks. I was hoping that x[,sorted] would act like "double(n)"
> does in a .C call, and have no extra copies made since it has no local
> assignment.
>
> Yes it does act the same way, you get an extra copy with double(n) as
> well - th
ay from
it.
Thanks again for the useful comments.
Terry Therneau
Another solution is the one used for a long time in the rpart code.
The R code calls "rpart1", which does the work; it keeps a static pointer to the
object, does NOT release its memory, and returns the size of the object.
Then the R code allocates appropriate vectors and calls "rpart2", which f
An unwanted side effect of the new restrictions on abbreviated names.
The anova.coxph command, in a slavish copy of anova.lm etc, returns a data frame with
column labels of
loglik Chisq Df Pr(>|Chi|)
If one tries to extract the final column of the table errors result since it is not a
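The column itself is still reachable; a sketch with a hand-built stand-in for the anova table (the numbers are invented): a non-syntactic name only needs to be quoted, or the column can be taken by position.

```r
tab <- data.frame(loglik = -100.2, Chisq = 3.84, Df = 1,
                  `Pr(>|Chi|)` = 0.05, check.names = FALSE)

tab$`Pr(>|Chi|)`       # quoting the non-syntactic name works
tab[["Pr(>|Chi|)"]]    # as does [[ with a string
tab[, ncol(tab)]       # or extraction by position
```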
was deprecated in Splus, I think even before R rose to prominence. I vaguely
remember a time when its usage generated a warning.
The fact that I've never noticed this unused routine is somewhat embarrassing. Perhaps I
need a "not documented, never called" addition to R
On 10/28/2013 06:00 AM, r-devel-requ...@r-project.org wrote:
On 13-10-26 9:49 PM, Simon Urbanek wrote:
> On Oct 25, 2013, at 12:12 PM, Yihui Xie wrote:
>
>> This has been asked so many times that I think it may be a good
>> idea for R CMD check to just stop when the user passes a direct
The current coxme code has functions that depend on bdsmatrix and others
that depend on Matrix; both of those packages define S4 methods for diag.
When loaded, the message appears:
replacing previous import ‘diag’ when loading ‘Matrix’
Questions:
1. Do I need to worry about this? If so, what c
On Thu, 2011-10-06 at 10:00 -0400, Kasper Daniel Hansen wrote:
> if you're using two packages that both define a diag function/method
> you absolutely _have_ to resolve this using your NAMESPACE. [Update:
> I see both are methods. I actually don't know what happens when you
> have the same gener
Two small Sweave issues.
1. I had the following line in my code
<>
resulting in the message
Error in match.arg(options$results, c("verbatim", "tex", "hide")) :
'arg' should be one of “verbatim”, “tex”, “hide”
I puzzled on this a bit since my argument exactly matched the message,
until I
the Makefile; I've recouped that effort many times over.
Terry Therneau
for ridge, cluster, pspline, and frailty, all of which depend deeply
on a coxph context. It would also solve a frailty() problem of long
standing, that when used in survreg only a subset of the frailty options
make sense; this is documented in the help file but catches users again
and again.
Terry
forth between how it seems that a formula should
work, and how it actually does work, sometimes leaves my head
spinning.
Terry T.
Terry Therneau
On Fri, 2011-11-25 at 10:42 -0500, Michael Friendly wrote:
> Duncan provided one suggestion: make ridge() an S3 generic, and
> rename ridge()
> to ridge.coxph(), but this won't work, since you use ridge() inside
> coxph() and survreg() to add a penalty term in the model formula.
> Another idea m
I like the idea of making the functions local, and will pursue it.
This issue has bothered me for a long time -- I had real misgivings when
I introduced "cluster" to the package, but did not at that time see any
way other than making it global.
I might make this change soon in the ridge functio
ne is running 64 bit Unix (CentOS) and the home one 32 bit
Ubuntu.
Could this be enough to cause the difference? Most of my tests are
based on all.equal, but I also print out 1 or 2 full solutions; perhaps
I'll have to modify that?
Terry Therneau
Thank you both for the nice explanation. I added "digits=4" to my
print statements to shorten the display.
Mixed effects Cox models can have difficult numerical issues, as it
turns out; I've added this to my collection of things to watch for.
Terry Therneau
On Sat, 2011
port list.
Now R CMD check claims that I need Rd pages for backsolve and
backsolve.default. I don't think I should rewrite those.
How do I sidestep this and/or
what other manuals should I read?
Perhaps do setMethod("backsolve", signature(r="ANY"),
Sorry to forget this
> sessionInfo()
R version 2.14.0 (2011-10-31)
Platform: x86_64-unknown-linux-gnu (64-bit)
locale:
[1] LC_CTYPE=en_US.UTF-8       LC_NUMERIC=C
[3] LC_TIME=en_US.UTF-8        LC_COLLATE=C
[5] LC_MONETARY=en_US.UTF-8    LC_MESSAGES=en_US.UTF-8
[
Duncan's reply to my query
> Now R CMD check claims that I need Rd pages for backsolve and
> backsolve.default. I don't think I should rewrite those.
> How do I sidestep this and/or
> what other manuals should I read?
Even though your change is subtle, I'd say it's still a change
(back
mething that will be glaringly obvious once it's
pointed out, but without a line number I can't seem to find it. I've
been counting braces but don't see a mismatch.
FYI, the file is below. (It is modeled on chol.Rd from the Matrix
package.)
Terry Therneau
---
Yes, it was glaring and obvious: I had the label "description" a second
time when I really meant "details".
Still, I had to delete sections of the file 1 by 1 until it slapped me
in the face. Sorry for any bother.
Terry T.
that implements them, an annotated graph of the call tree next to the
section parsing a formula, etc. This is stuff that doesn't fit in
comment lines. The text/code ratio is >1. On the other hand I've
thought very little about integration of manual pages and description
files with the co
Three things -
My original questions to R-help was "who do I talk to". That was
answered by Brian R, and the discussion of how to change Sweave moved
offline. FYI, I have a recode in hand that allows arbitrary reordering
of chunks; but changes to code used by hundreds need to be approached
ca
Does .C duplicate unnecessary arguments? For instance
fit <- .C("xxx", as.integer(n), x, y, z=double(15))
The first and fourth arguments would have NAMED = 0. Is my guess that .C
won't make yet one more (unnecessary) copy correct?
(Just trying to understand).
Terry T
R version 2.14.0, started with --vanilla
> table(c(1,2,3,4,NA), exclude=2, useNA='ifany')
1 3 4
1 1 1
This came from a local user who wanted to remove one particular response
from some tables, but also wants to have NA always reported for data
checking purposes.
I do
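One workaround for that user (my sketch, not from the thread): remove the unwanted value from the data first, so that useNA = "ifany" still reports the NA count.

```r
x <- c(1, 2, 3, 4, NA)
tb <- table(x[!(x %in% 2)], useNA = "ifany")  # NA %in% 2 is FALSE, so NA stays
tb   # counts for 1, 3, 4 and <NA>
```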
>>> strongly disagree. I'm appalled to see that sentence here.
>>
>> Come on!
>>
>>> The overhead is significant for any large vector and it is in
>>> particular unnecessary since in .C you have to allocate *and copy* space
>>> even for results (twice!). Also it is very er
On 03/22/2012 09:38 AM, Simon Urbanek wrote:
>
> On Mar 22, 2012, at 9:45 AM, Terry Therneau <thern...@mayo.edu> wrote:
>
>>
>>>>> strongly disagree. I'm appalled to see that sentence here.
>>>>
>>>> Come o
On 03/22/2012 11:03 AM, peter dalgaard wrote:
Don't know how useful it is any more, but back in the days, I gave this talk in
Vienna
http://www.ci.tuwien.ac.at/Conferences/useR-2004/Keynotes/Dalgaard.pdf
Looking at it now, perhaps it moves a little too quickly into the hairy stuff.
On the oth
On 03/27/2012 02:05 AM, Prof Brian Ripley wrote:
On 19/03/2012 17:01, Terry Therneau wrote:
R version 2.14.0, started with --vanilla
> table(c(1,2,3,4,NA), exclude=2, useNA='ifany')
1 3 4
1 1 1
This came from a local user who wanted to remove one particular response
from som
I received the following note this AM. The problem is, I'm not quite
sure how to fix it.
Can one use PROTECT(coxlist(eval(PROTECT , do I create an
intermediate variable, or otherwise?
I'm willing to update the code if someone will give me a pointer to the
right documentation. This partic
Brian & Duncan:
Thanks. This was exactly what I needed to know.
Terry
On 03/27/2012 08:41 AM, Prof Brian Ripley wrote:
On 27/03/2012 14:22, Terry Therneau wrote:
I received the following note this AM. The problem is, I'm not quite
sure how to fix it.
Can one use PROTECT(cox
99/100 of the "no visible binding" messages I've
seen over the years were misspelled variable names, and the message is a
very welcome check.
Terry Therneau
I'd like to chime in on the subject of vignette checks.
I have one vignette in the coxme library that would be better described
as a white paper. It discusses the adequacy of the Laplace transform
under various scenarios. It contains some substantial computations, so
I'd like to mark it as "ne
hat takes just a little short of forever to run.
3. Do these unprocessed package also contribute to the index via
\VignetteIndexEntry lines, or will I need to create a custom index?
Terry Therneau
On 04/12/2012 02:15 AM, Uwe Ligges wrote:
On 12.04.2012 01:16, Paul Gilbert wrote:
On 12-04-11 04:41 PM, Terry Therneau wrote:
Context: R2.15-0 on Ubuntu.
1. I get a WARNING from CMD check for "Package vignette(s) without
corresponding PDF:
In this case the vignettes directory had bot
In that particular example the value of "4" was pulled out of the air.
There is no particular justification.
There is a strong relationship between the "effective" degrees of
freedom and the variance of the random effect, and I often find the df
scale easier to interpret. See the Hodges and
I'm having a problem rebuilding a package, new to me in R 2.15.0
(Linux) It hits all that contain the line
\usepackage[pdftex]{graphics}
and leads to the following when running R CMD check on the directory.
(I do this often; a final run on the tar.gz file will happen before
submission.)
Sinc
ried again.
Now it works! And I have no idea what could have prompted the
change. It will remain a mystery, since I'm not going to try to bring
the error back. :-)
Terry T.
On 05/14/2012 01:19 PM, Duncan Murdoch wrote:
On 14/05/2012 1:28 PM, Terry Therneau wrote:
I'm having a
I've been tracking down a survival problem from R-help today. A short
version of the primary issue is reconstructed by the following simple
example:
library(survival)
attach(lung)
fit <- coxph(Surv(time, status) ~ log(age))
predict(fit, newdata=data.frame(abe=45))
Note the typo in the last li
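A sketch of why the typo matters (my assumed mechanism): with lung attached, the lookup for 'age' falls through the useless newdata and quietly succeeds in the attached frame; fitting with data = lung and no attach() turns the typo into an error instead.

```r
library(survival)

fit <- coxph(Surv(time, status) ~ log(age), data = lung)
res <- try(predict(fit, newdata = data.frame(abe = 45)), silent = TRUE)
inherits(res, "try-error")   # TRUE: 'age' is found nowhere, as it should be
```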
x and I'll look into it
further. My first foray failed because what I want (I think) is the
environment that they had when the first ">" came up. I tried baseenv
in a spot, but then my code couldn't find the model.frame function.
Terry T.
On 05/17/2012 05:00 AM
I was looking at how the model.frame method for lm works and comparing
it to my own for coxph.
The big difference is that I try to retain xlevels and predvars
information for a new model frame, and lm does not.
I use a call to model.frame in predict.coxph, which is why I went that
route, but nev
On 06/05/2012 11:32 AM, Prof Brian Ripley wrote:
On 05/06/2012 16:17, Terry Therneau wrote:
I was looking at how the model.frame method for lm works and comparing
it to my own for coxph.
The big difference is that I try to retain xlevels and predvars
information for a new model frame, and lm
Note that the survConcordance function, which is equivalent to Kendall's
tau, also is O(n log n) and it does compute a variance. The variance
is about 4/5 of the work.
Using R 2.15.0 on an older Linux box:
> require(survival)
> require(pcaPP)
> tfun <- function(n) {
+ x <- 1:n + runif(n
us have an expectation).
Terry Therneau
From: r-devel-boun...@r-project.org On Behalf
Of Terry Therneau
Sent: Thursday, August 02, 2012 6:10 AM
To: r-devel@r-project.org; Nathaniel Smith
Subject: Re: [Rd] Numerics behind splineDesign
On 08/02/2012 05:00 AM, r-devel-requ...@r-project.org wrote:
Now I just
On 09/04/2012 05:00 AM, r-devel-requ...@r-project.org wrote:
The issue is not just about "CRAN" vs "off CRAN".
It is good to think about a more general scheme of
"light testing" vs "normal testing" vs "extensive testing",
e.g., for the situation where the package implements
(simulation/bootstr
On 09/04/2012 01:57 PM, Duncan Murdoch wrote:
On 04/09/2012 2:36 PM, Warnes, Gregory wrote:
On 9/4/12 8:38 AM, "Duncan Murdoch" wrote:
>On 04/09/2012 8:20 AM, Terry Therneau wrote:
>>
>> On 09/04/2012 05:00 AM, r-devel-requ...@r-project.org wrote:
>> > The
Some questions motivated by this discussion.
From the CRAN policy page:
"Checking the package should take as little CPU time as possible, as the CRAN check farm
is a very limited resource and there are thousands of packages. Long-running tests and
vignette code can be made optional for checking
essentially what is currently being forced on us, I can do such mischief as
easily as under number 1.
Terry Therneau
On 09/19/2012 09:22 AM, Duncan Murdoch wrote:
I understand the issue with time constraints on checks, and I think there are
discussions
in progress about that. My suggestion has been to put in place a way for a
tester to say
that checks need to be run within a tight time limit, and CRAN as t
I'm touching up changes to rpart and have a question with .Rbuildignore. Here
is my file
tmt1014% more .Rbuildignore
test.local
\.hg
src/print_tree.c
The source code included a module "print_tree.c", used for debugging.
Commented-out calls to it can be found here and there. I want to leave i
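For what it's worth, .Rbuildignore patterns are Perl-style regular expressions matched against file paths relative to the package top level, so an anchored, escaped entry states the intent unambiguously:

```
^src/print_tree\.c$
```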
ource. Ok
2. I'll check the tarball soon, but I'm guessing you are right about the
error going
away.
On 09/20/2012 12:57 PM, Duncan Murdoch wrote:
On 20/09/2012 1:43 PM, Terry Therneau wrote:
I'm touching up changes to rpart and have a question with .Rbuildignore. H
Look at rowsum. It's pretty fast C code.
Terry T
On 10/03/2012 05:00 AM, r-devel-requ...@r-project.org wrote:
Hi,
I'm looking for a super-duper fast mean/sum binning implementation
available in R, and before implementing z = binnedMeans(x y) in native
code myself, does any one know of an exis
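A sketch of the rowsum() approach suggested above (invented data; binnedMeans is only the hypothetical function from the question): sum y within bins of x, then divide by the per-bin counts.

```r
set.seed(1)
x <- runif(1000)
y <- rnorm(1000)

bins   <- cut(x, breaks = seq(0, 1, by = 0.1))   # 10 equal-width bins
sums   <- rowsum(y, bins)                        # fast C code: one row per bin
counts <- rowsum(rep(1, length(y)), bins)
binned_means <- sums / counts                    # same as tapply(y, bins, mean)
```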
I have a suggested addition to termplot.
We have a local mod that is used whenever none of the termplot options is quite right. It
is used here almost daily for Cox models in order to put the y axis on a risk scale:
fit <- coxph(Surv(time, status) ~ ph.ecog + pspline(age), data=lung)
zz <
stpkg not
found". Running ls() at that point shows a set of variables I didn't define, and no
evidence of any of "testpkg", "survdep", or anything else I'd defined. My prompt has been
changed to "R>" as well.
Any idea what is happening?
Terry
> attr(survival:::as.matrix.Surv(ytest), 'type')
NULL
> attr(as.matrix.default(y2), 'type')
[1] "right"
Context: In testing the next survival release (2.37), it has lost this "special"
behavior. One package that depends on survival expects this behavi
1. A Surv object is a matrix with some extra attributes. The as.matrix.Surv function
removes the extras but otherwise leaves it as is.
2. The last several versions of the survival library were accidentally missing the
S3method('as.matrix', 'Surv') line from their NAMESPACE file. (Instead it's
I often run R CMD check on my package source. I've noted of late that this process gives
warning messages about
files listed in my .Rbuildignore file, where it once ignored these.
Terry T.
In a real example I was trying to remove the class from the result of table, just because
it was to be used as a building block for other things and a simple integer vector seemed
likely to be most efficient.
I'm puzzled as to why unclass doesn't work.
> zed <- table(1:5)
> class(zed)
[1] "ta
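My reading of why unclass() disappoints here: a table is a 1-d integer array, so stripping the class still leaves the dim and dimnames attributes behind; as.vector() is what yields a plain integer vector.

```r
zed <- table(1:5)
u <- unclass(zed)
dim(u)                  # still 5: the dim attribute survives unclass()
v <- as.vector(zed)     # a plain integer vector (labels dropped)
v2 <- setNames(as.vector(zed), names(zed))   # keep the labels as names
```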
of table. We've been overriding both of these for 10+ years.
Terry Therneau
rk. As long as there is
an option that can be overridden I'm okay. Yes, I'd prefer FALSE as the default, partly
because the current value is a tripwire in the hallway that eventually catches every new user.
Terry Therneau
On 02/11/2013 05:00 AM, r-devel-requ...@r-project.org wrote:
Bot
don't think that's going to
happen. The reason for the difference is that the subsetting is done before the
conversion to a factor, but I think that is unavoidable without really big changes.
Duncan Murdoch
Bill Dunlap
Spotfire, TIBCO Software
wdunlap tibco.com
> -----Origin
Peter,
I had an earlier response to Duncan that I should have copied to the list.
The subset issue can be fixed. When the model changes character to factor, it needs to
remember the levels, just as it does with factors. We are simply seeing a reprise
of problems that occurred when mode