Another option that is similar to Enrico's is to use object oriented
programming with R6 or reference objects. I prefer the R6 package
(which will still use an environment like Enrico's approach, but with
somewhat different syntax, and it is a little easier if you want to do
this multiple times).
Here is some example code
Why compute the differences manually when `aov` can do paired
comparisons on this data as is:
summary(aov(extra ~ factor(group) + Error(ID), data=sleep ))
gives the same F and P values
On Tue, Dec 27, 2022 at 3:32 AM Gabor Grothendieck
wrote:
>
> Good idea.
>
> On Mon, Dec 26, 2022 at 12:59 PM
Since `+` is already a function, we could use regular piping to change this code:
mtcars %>%
ggplot(aes(x=wt, y=mpg)) +
geom_point()
to this:
mtcars %>%
ggplot(aes(x=wt, y=mpg)) %>%
`+`(geom_point())
Further we can write wrapper functions like:
p_geom_point <- function(x, ...) {
    x + geom_point(...)
}
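A rough sketch of how such a wrapper might then be used (this usage chunk is
mine, not from the original message, and assumes ggplot2 and a pipe such as
magrittr's %>% are loaded):
mtcars %>%
  ggplot(aes(x=wt, y=mpg)) %>%
  p_geom_point()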
If pi were stored and computed to infinite precision then yes, we would
expect tan(pi/2) to be NaN, but computers in general, and R
specifically, don't store numbers to infinite precision (some packages
allow arbitrary, but still finite, precision) and irrational numbers
cannot be stored exactly. So you take the tangent of a number that is
close to, but not exactly, pi/2 and get a very large finite number
rather than NaN.
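For example (the exact value printed can vary slightly by platform):
> tan(pi/2)
[1] 1.633124e+16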
> hist(rexp(1000), main='')
> abline( v=1, col='red')
>
> sp.par <- subplot(hist(rnorm(100), main=''), x='topright')
>
> subplotadd <- function(fun, pars) {
> par(new=TRUE)
> par(pars['plt'])
>     fun
> }
> # Start new plot
> plot.new()
> # Set up coordinates on new plot
> # (though plot.window() might be better here)
> par(usr=c(0,1,0,1))
> # Start drawing
> box()
> points(c(0,1), c(0,1), pch=16, col='red', cex=3)
>
> Is there a good argument against doing this?
Potential responses to this issue:
1. Consider that this is a rare enough issue that only Greg Snow will
ever care, and he has already learned the lesson, so do nothing (other
than laugh at Greg when he forgets and has to remember to properly
order his par arguments).
2. Since this came up as an issue
R, and the S language that it is based on, have evolved as much as they
have been designed, so there are often inconsistencies due to similar
functionality evolving along different paths. In some cases these
inconsistencies are resolved, but generally only once someone notices
and cares enough to do something
My understanding is that R does have a floating point type, it is just
called "double" instead of "float".
If you are referring to a single-precision floating point type, then R
does have the "as.single" function, but that does not really change
the way the number is stored; it just sets a flag so that the value is
passed as single precision when handed to compiled (C/Fortran) code.
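For example (illustrative only):
> x <- as.single(1.5)
> typeof(x)
[1] "double"
> attributes(x)
$Csingle
[1] TRUE
The storage is still a double; only the Csingle attribute is set.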
Just to anticipate future discussion from math purists (and hopefully
not to throw too much of a wrench in the works), what would be the
return of:
is.whole(-1)
or
is.whole(-1L)
?
I can see arguments for both TRUE and FALSE from both the math purity
group and the "what will happen when I try t
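For reference, base R has no is.whole() function at all; the usual sketch
people propose (the name and the tolerance here are assumptions of that
proposal, not anything in base R) looks something like:
is.whole <- function(x, tol = .Machine$double.eps^0.5) {
    abs(x - round(x)) < tol
}
is.whole(-1)   # TRUE under this definition
is.whole(-1L)  # TRUE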
Uwe,
Have all of these packages found new maintainers? If not, which ones
are still looking to be adopted?
Thanks,
On Fri, Aug 8, 2014 at 10:41 AM, Uwe Ligges wrote:
> Dear maintainers and R-devel,
>
> Several orphaned CRAN packages are about to be archived due to outstanding
> QC problems, but
There are editors that are R aware and have some functionality along
these lines (though I don't know of any command-line ones). Some to
look at are the Emacs/ESS combination or RStudio.
On Tue, May 27, 2014 at 11:10 AM, Stavros Macrakis (Σταῦρος Μακράκης)
wrote:
> Is there a pretty-printer for
Looking at the help file and code for dput does not show any simple
way to do what you want. But the help page makes reference to the
deparse function, and deparse does have a width.cutoff argument. So
you could use deparse instead of dput (then use cat or other functions
to display the results sim
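For example, a rough sketch (the object x here is made up):
> x <- list(a = 1:10, b = letters)
> cat(deparse(x, width.cutoff = 40), sep = "\n")
This wraps the deparsed output at roughly 40 characters per line.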
From the help page ?':' we can read:
"Value ‘to’ will be included if it differs from ‘from’ by an
integer up to a numeric fuzz of about ‘1e-7’."
So it looks like it is the intended behavior.
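For example (the fuzz is about 1e-7, so use something well inside it):
> 1:(3 - 1e-9)
[1] 1 2 3
> 1:(3 - 1e-3)
[1] 1 2
The endpoint just under 3 is still treated as 3 because its difference
from 'from' is within the documented fuzz of an integer.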
On Sun, Apr 27, 2014 at 5:38 AM, Hans W Borchers wrote:
> Is the following really intended behav
On Thu, Mar 20, 2014 at 7:32 AM, Dirk Eddelbuettel wrote:
[snip]
> (and some readers
>may recall the infamous Pentium bug of two decades ago).
It was a "Flaw" not a "Bug". At least I remember the Intel people
making a big deal about that distinction.
But I do remember the time well, I
If the package is on CRAN then the license should be a free one that would
let you copy whatever you want from it. However it would be most polite to
contact the original author first. I know that I have given permission for
a couple of my functions to be included in other packages where it would
On Fri, Aug 24, 2012 at 4:32 AM, S Ellison wrote:
[snip]
> Anyone out there still think statistics are easy?
There are plenty of people out there who still think statistics are
easy; after all, you can always stick a bunch of numbers into Excel and
get all kinds of statistics out, they are even e
The simple work-around is to use the range function. If you use
something like xlim=range(0,x) then 0 will be included in the range
of the x axis (and if there are values less than 0 then those values
will be included as well), and the max is computed from the data as
usual. The range function wi
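A minimal sketch with made-up data:
x <- rnorm(20, mean = 5)
y <- rnorm(20)
plot(x, y, xlim = range(0, x))   # the x axis always reaches down to 0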
If you want to hide some code from accidental or casual sight then I have had
some success with adding something like this to the .R file for a package (the
petals function in TeachingDemos in this case):
.onAttach <- function(...) {
petals <- petals
attr(petals,'source') <- "Don't Cheat
For number 1, one option is to use the setHook function with the hook in
plot.new. Using this you can create a function that will be called before
every new plot is created; your function could then call par with the options
that you want, and this will set the parameters on all devices. However i
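A minimal sketch (the particular par settings are just examples):
setHook("plot.new", function(...) par(las = 1, col.axis = "grey30"))
plot(1:10)   # axes are drawn with the hooked par() settings
Every subsequent plot on that device gets the same settings until the
hook is removed.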
Here is an attempt at the general concept without getting technical.
How many people in the world answer to the name/title "Dad"?
Yet based on context we can usually tell who someone is talking about when they
use "Dad".
It is the same in programming, I may write a function which includes a v
sday, May 04, 2011 11:43 AM
> To: Greg Snow
> Cc: Prof Brian Ripley; R Devel List
> Subject: Re: [Rd] tkrplot not working in R 2.13.0
>
>
> On May 4, 2011, at 19:26 , Greg Snow wrote:
>
> > It looks like the spaces in the path is the problem, when I run the
> line be
> -Original Message-
> From: Prof Brian Ripley [mailto:rip...@stats.ox.ac.uk]
> Sent: Wednesday, May 04, 2011 10:35 AM
> To: Greg Snow
> Cc: R Devel List
> Subject: Re: [Rd] tkrplot not working in R 2.13.0
>
> What example are you trying? The code in ?trkplot works for me
The tkrplot package is not working in version 2.13.0 for Windows. I contacted
the maintainer, who unfortunately does not have easy access to a Windows
computer and says that it is working on the other platforms.
I traced the problem down to this line in the .First.lib function:
.Tcl(paste("load"
Another way to see your plots is the TkPredict function in the TeachingDemos
package. It will default the variables to their medians for numeric predictors
and baseline level for factors, but then you can set all of those to something
more meaningful one time using the controls, then cycle thro
x to
make sure that everything is doing what it should there.
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
greg.s...@imail.org
801.408.8111
> -Original Message-
> From: Prof Brian Ripley [mailto:rip...@stats.ox.ac.uk]
> Sent: Sunday, April 10,
: Monday, April 04, 2011 3:41 PM
> To: Greg Snow
> Cc: R-devel@r-project.org
> Subject: Re: [Rd] Use keep.source for function in package with lazy
> loading
>
> On Mon, 4 Apr 2011, Greg Snow wrote:
>
> > I have a function in one of my packages that I would like to print
I have a function in one of my packages that I would like to print using the
original source rather than the deparse of the function. The package uses lazy
loading and the help page for library (under keep.source) says that keep.source
does not apply to packages that use lazy loading and that w
You might consider using odfWeave, then you can create a single document, save
it as a word doc, and send it to collaborators where they can then cut and
paste from the word doc to whatever they are using.
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
greg.s..
You could use winProgressBar (Windows only) or tkProgressBar (tcltk package
required) instead; then nothing is output to the console/standard out, but you
still have a visual indication of your progress.
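A small sketch of the tcltk version (the loop body is just a stand-in for
real work):
library(tcltk)
pb <- tkProgressBar(title = "progress", min = 0, max = 100)
for (i in 1:100) {
    Sys.sleep(0.01)
    setTkProgressBar(pb, i, label = sprintf("%d%% done", i))
}
close(pb)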
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
greg.s...@imail.org
80
Why do you want to test for normality and equal variances?
If those are really a concern, then you should use a method up front that is
robust to those violations. Those tests are usually testing a hypothesis that
is different from what you are actually interested in and generally have low
power to g
Part of the problem seems to be that R is set up to run in one of two modes (I
may be over-generalizing or over-simplifying here). The first is interactive
mode, where you type in a command, R processes it and gives results, you type
in another command, etc. The other is batch mode, where everything is p
You could use the my.symbols function in the TeachingDemos package to add the
structures to the plot (once you create the plot without the default labels and
find the positions to plot them at ).
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
greg.s...@imail.or
In the meantime you could have students run the following code each time (put
it into .Rprofile or something) until they learn good coding practices:
testfunc <- function(expr, value, ok, visible) {
    tmp <- deparse(expr)
    if (grepl('<- *[0-9.]+ *[])&|]', tmp)) {
        warning("possible unintended assignment in: ", tmp)  # this message is a guess; the original snippet is truncated here
    }
    TRUE  # returning TRUE keeps a task callback (addTaskCallback) registered
}
I can claim some responsibility for 3 sets of functions that are in "core R",
well they are in packages, but then so is the plot function, but packages that
are loaded automatically in a default installation of R. My piece of the
responsibility is probably more the blame than credit (the credit
I am working with the mcnemar.test function; the help does not show a
maintainer/author, but it is part of the stats package.
My issue is that I want to use the test on 2 variables with possible values of
0:3. In one of the tests one of the variables does not have any 3's, so to make
sure th
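Presumably the goal is something like this (entirely made-up data, just to
show forcing common levels so the resulting table is square):
x <- factor(sample(0:3, 50, replace = TRUE), levels = 0:3)
y <- factor(sample(0:2, 50, replace = TRUE), levels = 0:3)
tab <- table(x, y)   # 4 x 4 even though y never takes the value 3
mcnemar.test(tab)    # now sees a square table (the y == 3 cells are all zero)
With sparse data the statistic itself may still come out NaN, but the
table dimensions are no longer the obstacle.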
(Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
greg.s...@imail.org
801.408.8111
> -Original Message-
> From: Prof Brian Ripley [mailto:rip...@stats.ox.ac.uk]
> Sent: Thursday, February 11, 2010 11:05 AM
> To: Greg Snow
> Cc: r-devel
> Subject: Re: [
This problem can be seen by the following commands:
> pb <- winProgressBar(max=1000, label='0')
> b <- 1
> setWinProgressBar(pb, b, label=b)
This set of commands (on Windows of course, XP in this case) causes R to crash.
This is not strictly a bug, since the documentation states that the label
a
I don't understand your question. Are you trying to create a boxplot or
barplot (you mention both), what scaling is not happening automatically that
you would like?
Can you give a simple example of what you have tried, what results you are
seeing and what results you would like to see instead?
The apply function was meant to work on matrices and arrays; when you use it on
a data frame, the frame is first converted to a matrix. Since your data frame
has columns of different modes, the logical column is converted to character
and the matrix is of the single mode character. That is wha
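A made-up illustration of that conversion:
dat <- data.frame(name = c("a", "b"), flag = c(TRUE, FALSE))
m <- as.matrix(dat)   # this is what apply() does internally
mode(m)               # "character": the logical column is now text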
Instead of using smooth.spline, use lm with spline terms, e.g.:
> library(splines)
> sp.fit <- lm(y~bs(x,4))
Now both use predict.lm for the predictions and all will be consistent.
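A small sketch of getting those predictions (newx is hypothetical, built from
the original x values):
> newx <- data.frame(x = seq(min(x), max(x), length.out = 50))
> pred <- predict(sp.fit, newdata = newx, interval = "confidence")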
Hope this helps,
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
greg.s...@imail
What research into this problem did you do that failed to turn up FAQ 7.31?
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
greg.s...@imail.org
801.408.8111
> -Original Message-
> From: r-devel-boun...@r-project.org [mailto:r-devel-boun...@r-
> project.o
Some possibilities:
The Rcmdr package is a very good example of a GUI built using Tk (it does not
hide the R program, but lets you do analyses using menus and dialogs). Rcmdr
also has a plug-in mechanism to write extensions to it, depending on what you
want to do, writing a simple extension to
I don't know about the legal definitions of all, but a few years back the
British Medical Journal had a filler article that looked at some surveys of
what people thought different words meant (you can get at the filler by going
to http://www.bmj.com/cgi/content/full/333/7565/442 and downloading
com]
> Sent: Friday, March 06, 2009 2:08 PM
> To: Prof Brian Ripley
> Cc: Greg Snow; R-devel
> Subject: Re: [Rd] quantile(), IQR() and median() for factors
>
> Dear Greg,
>
> thank you for your comments,
> as Prof. Ripley pointed out, in the case of even sample size the
>
I like the idea of median and friends working on ordered factors. Just a
couple of thoughts on possible implementations.
Adding extra checks and functionality will slow down the function. For a
single evaluation on a given dataset this slowdown will not be noticeable, but
inside of a simulati
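For reference, base R does not define a median method for ordered factors; one
possible sketch of what such a method might look like (this is my illustration,
not a proposed patch, and the tie-breaking rule is arbitrary):
median.ordered <- function(x, na.rm = FALSE, ...) {
    if (na.rm) x <- x[!is.na(x)]
    levels(x)[ceiling(median(as.integer(x)))]
}
median(ordered(c("low", "mid", "high", "high"), levels = c("low", "mid", "high")))
# [1] "high"   (an even split rounds up to the higher level here)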
I see the same problem on Windows XP.
But if I run loess with surface='direct' then the results are correct. So it
looks like the problem comes from the smoothing/interpolating, not the main
loess algorithm.
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
greg
Just wrap the example in either \dontrun{} or
if(interactive()){
}
That way the example will be skipped when the automatic tests are done, but
will still be available for a reader to run by copy/paste or via the example()
function (in the 2nd case above).
This has worked for me; examples using these are
9 2:38 PM
> To: Greg Snow; marc_schwa...@comcast.net; ted.hard...@manchester.ac.uk
> Cc: R-Devel
> Subject: RE: [Rd] "open-ended" plot limits?
>
>
> > -Original Message-
> > From: r-devel-boun...@r-project.org
> > [mailto:r-devel-boun...@r-pr
I use range(0, y) rather than c(0, max(y)); that way if there are any y
values less than 0, the limits still include them (and it is slightly shorter
:-).
This also extends to cases where you may know that you will be adding
additional data using points or lines, so you can do ylim=range(0, y
: Prof Brian Ripley [mailto:rip...@stats.ox.ac.uk]
> Sent: Thursday, February 05, 2009 6:42 AM
> To: Greg Snow
> Cc: Jim Lemon; r-devel@r-project.org
> Subject: Re: [R] Is abline misbehaving?
>
> [Moved to R-devel, where it probably should have started and it is
> getti
Where do you get "should" and "expect" from? All the regular expression tools
that I am familiar with only match non-overlapping patterns unless you do extra
to specify otherwise. One of the standard references for regular expressions
if you really want to understand what is going on is "Maste
I don't know if this is the case here or not, but putting in scrollbars and
scrolling can be a bit tricky. It usually works best to create the canvas
without a scroll command, then create the scrollbar(s), then use tkconfig to go
back and add the scroll command to the canvas after the scrollbar
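A rough sketch of that order using the tcltk package (widget sizes and the
packing layout are just illustrative):
library(tcltk)
tt <- tktoplevel()
cv <- tkcanvas(tt, width = 300, height = 200)                    # no scroll command yet
sb <- tkscrollbar(tt, command = function(...) tkyview(cv, ...))  # scrollbar second
tkconfigure(cv, yscrollcommand = function(...) tkset(sb, ...))   # added afterwards
tkpack(sb, side = "right", fill = "y")
tkpack(cv, side = "left", fill = "both", expand = TRUE)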
that.
Thanks again,
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
[EMAIL PROTECTED]
801.408.8111
> -Original Message-
> From: Simon Urbanek [mailto:[EMAIL PROTECTED]
> Sent: Tuesday, October 28, 2008 3:20 PM
> To: Greg Snow
> Cc: R-dev
I have some functions that write an external text file for postprocessing by
another program. Some instructions to the other program need to be indicated
by null values (\000 or ^@). The function currently uses code like:
writeChar(rawToChar(as.raw(0)), con)
where con is a connection to the f
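One possible alternative (not from the original post) is to skip the character
conversion entirely and write the byte directly, provided con is opened in
binary mode:
writeBin(as.raw(0), con)   # writes a single null byte to the connection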
Prof. Ripley,
I am sure I speak for many others when I say Thank You for this and all the
other great work that you do. R is already capable of producing high quality
graphics, this will just make them better. Kerning is one of those things that
generally don't get noticed unless done wrong/p
This is a low priority bug that has been around for a while, but I came across
it again while alpha testing 2.8.
The resulting function for splinefun gives incorrect deriv values when x is
less than the smallest x-value used to create the function (at least in one
circumstance), but does the
I think it is a little more complex than just installing and checking.
tkProgressBar uses tcltk, which works on the major platforms (Unix/Linux, Mac,
Windows), but only if Tk is installed and available. I believe that on Mac, Tk
is only available if X11 is used and, if I remember correctly, if R
I would be interested to see how the following approach compares to the other
suggestions:
> x <- c(0,0,1,0,1,1,1,0,0,1,1,0,1,0,1,1,1,1,1,1)
> test <- c(0,0,1,0,1,2,3,0,0,1,2,0,1,0,1,2,3,4,5,6)
> out <- Reduce( function(x,y) x*y + y, x, accumulate=TRUE )
> all.equal(out,test)
[1] TRUE
For the se
ealthcare
[EMAIL PROTECTED]
(801) 408-8111
> -Original Message-
> From: Prof Brian Ripley [mailto:[EMAIL PROTECTED]
> Sent: Monday, February 25, 2008 2:06 PM
> To: Greg Snow
> Cc: r-devel@r-project.org
> Subject: Re: [Rd] Clipping using par(plt=..., xpd=FALSE)
> in
Look again at the whole function of mean.default. All of the missing
values have already been removed if na.rm==TRUE before the call to
stats::median, so why waste time looking for missing values that are not
there.
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
Here is a demonstration of behaviour that is probably an optimization by
someone far smarter than me who did not anticipate anyone wanting to do
this, but for my purposes it looks more like a bug than a feature.
I have tested this with R 2.6.2 on Windows, no additional packages loaded
(beyond the
Part of the problem is that you are not adjusting for the fact that you
are smarter than the computer.
Realize that the way you are doing the assignment requires a lot of
different things behind the scenes, and remember that data frames are
basically lists with some extra attributes and matrices a
You may want to look at the SQLiteDF package, this allows you to put
your data into an SQLite database and treat that like a normal vector or
data frame inside of R.
Hope this helps,
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
[EMAIL PROTECTED]
(801) 408-8111
I don't think that we need a full discussion in the Introduction, but
how about showing, early on, an example of 2 floating point numbers not
being equal (and one of the work-arounds like all.equal) along with a
note (bright, bold, etc.) that says that if the reader did not expect
the FALSE result t
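The classic illustration would be something like:
> 0.1 + 0.2 == 0.3
[1] FALSE
> all.equal(0.1 + 0.2, 0.3)
[1] TRUE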
This is a minor documentation bug. In the document: Writing R
Extensions (R-exts) section 1.6.5 (Summary -- converting an existing
package) the 3rd bullet is missing the end of the sentence.
Thanks,
--please do not edit the information below--
Version:
platform = i386-pc-mingw32
arch =
If you want all the matches (including overlaps) then you could try one
of these:
> gregexpr("(?=abab)","ababab",perl=TRUE)
[[1]]
[1] 1 3
attr(,"match.length")
[1] 0 0
> gregexpr("ab(?=ab)","ababab",perl=TRUE)
[[1]]
[1] 1 3
attr(,"match.length")
[1] 2 2
The book "Mastering Regular Expressions" b
Have you read section 4 of the FAQ? If not, that would be a good place
to start.
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
[EMAIL PROTECTED]
(801) 408-8111
> -Original Message-
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED] On Behalf Of
> -Original Message-
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED] On Behalf Of Barry Rowlingson
> Sent: Friday, August 03, 2007 10:56 AM
> To: [EMAIL PROTECTED]
> Cc: [EMAIL PROTECTED]; Douglas Bates; R-devel List
> Subject: Re: [Rd] Compiling R for the Sony Playstation 3?
[snip]
The loess.demo function in the TeachingDemos package may help you to
understand better what is happening (both running the demo and looking
at the code). One common reason why predictions from the loess function
and hand-computed predictions don't match is that the loess function
does an additi
The cor function does not know how to look inside of data frames (unless
you give it the entire data frame as the only argument). If Pollution
and Wet.days are columns of the data frame named Pollution (which I
infer from your problem statement below) then you can do things like:
> cor(Pollution$Pollution, Pollution$Wet.days)
The what argument only uses the type of each element you supply, not the
word it contains.
Try this:
> tmp <- scan("C:/temp.csv",
> what=list("",0),
> sep=",")
Hope this helps,
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
[EMAIL PROTECTED]
(801) 408-811
The vcov function in package stats is already a generic, could you write
your methods for vcov rather than for var?
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
[EMAIL PROTECTED]
(801) 408-8111
> -Original Message-
> From: [EMAIL PROTECTED]
> [mai
Another approach may be to use hooks (see ?setHook). The plot.new
function already has a hook, so you could do your option #1
automatically by setting that hook.
Better would be if all the graphics device functions had hooks (or a
common hook), then you could set that hook to set your graphics
pa
Ph.D.
Statistical Data Center
Intermountain Healthcare
[EMAIL PROTECTED]
(801) 408-8111
> -Original Message-
> From: Herve Pages [mailto:[EMAIL PROTECTED]
> Sent: Friday, March 02, 2007 7:04 PM
> To: Greg Snow
> Cc: r-devel@r-project.org
> Subject: Re: [Rd] extractin
Your 2 examples have 2 differences and they are therefore confounded in
their effects.
What are your results for:
system.time(for (i in 1:100) {row <- dat[i, ] })
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
[EMAIL PROTECTED]
(801) 408-8111
> -Ori
If all else fails (and hopefully someone who knows more about fonts and
such can give you a better suggestion so you don't even have to try
this), then look at the last example for the subplot function in the
TeachingDemos package. This shows how you can insert images into a
plot; you could create
Have you looked at the nws package (and the nws server from the same
group)? They include Python to do parallel computing and may give you
the examples you need.
--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
[EMAIL PROTECTED]
(801) 408-8111
-Original
look at ?col and ?row. One way to use them is:
col(A)[A==11]
row(A)[A==14]
hope this helps,
Greg Snow, Ph.D.
Statistical Data Center, LDS Hospital
Intermountain Health Care
[EMAIL PROTECTED]
(801) 408-8111
>>> shanmuha boopathy <[EMAIL PROTECTED]> 09/19/05 01:30PM >>