[Rd] plot.TukeyHSD (PR#8229)

2005-10-20 Thread ehlers
Full_Name: Peter Ehlers
Version: "R version 2.2.0, 2005-10-19"
OS: Windows XP
Submission from: (NULL) (136.159.71.162)


The newly added column of adjusted p-values in TukeyHSD output causes a problem
with plotting the confidence intervals; an extraneous vertical line segment is
plotted. 

plot.TukeyHSD expects a list of n-by-3 matrices. These are now n-by-4.
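
A minimal way to see it (a sketch using the warpbreaks example from ?TukeyHSD):

  fm1 <- aov(breaks ~ wool + tension, data = warpbreaks)
  thsd <- TukeyHSD(fm1, "tension")
  ncol(thsd$tension)   # 4 columns now: diff, lwr, upr, p adj
  plot(thsd)           # shows the extraneous vertical segment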

Suggested fix:
Replace:

plot.TukeyHSD <- function (x, ...)
{
    for (i in seq(along = x)) {
        xi <- x[[i]]

with:

plot.TukeyHSD <- function (x, ...)
{
    for (i in seq(along = x)) {
        xi <- x[[i]]
        xi <- xi[, -4]

I apologize if this is a known bug; I didn't see anything in my search.

Peter

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] PR#8229

2005-10-27 Thread ehlers

platform i386-pc-mingw32
arch i386
os   mingw32
system   i386, mingw32
status   Patched
major2
minor2.0
year 2005
month10
day  24
svn rev  36016
language R


Re: plot.TukeyHSD

Thanks for the recent quick fix. Another small fix is needed.
I should have tested more thoroughly.

Replace:

xi <- x[[i]][, -4]

with:

xi <- x[[i]][, -4, drop = FALSE]

in order to keep nrow() functioning properly in

yvals <- nrow(xi):1

when xi has only one row as in the helpfile example with

plot(TukeyHSD(fm1, "wool"))
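
The point, in a small sketch:

  m <- matrix(1:4, nrow = 1)       # one comparison row, as for the two-level "wool" factor
  nrow(m[, -4])                    # NULL -- dropped to a vector
  nrow(m[, -4, drop = FALSE])      # 1, so yvals <- nrow(xi):1 still works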


Peter Ehlers
U of Calgary

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Yates' correction for continuity in chisq.test (PR#8265)

2005-10-31 Thread ehlers


Prof Brian Ripley wrote:

> On Sun, 30 Oct 2005, P Ehlers wrote:
> 
>> [EMAIL PROTECTED] wrote:
>>
>>> Full_Name: foo ba baz
>>> Version: R2.2.0
>>> OS: Mac OS X (10.4)
>>> Submission from: (NULL) (219.66.32.183)
>>>
>>>
>>> chisq.test(matrix(c(9,10,9,11),2,2))
>>>
>>> Chi-square value must be 0, and, P value must be 0
>>> R does over correction
>>>
>>> when | a d - b c | < n / 2 ,chi-sq must be 0
>>
>>
>> (Presumably, you mean P-value = 1.)
>> If you don't want the correction, set correct=FALSE. (The
>> results won't differ much.)
>>
>> A better example is
>>
>>  chisq.test(matrix(c(9,10,9,10),2,2))
>>
>> for which R probably should return X-squared = 0.
> 
> 
> R is using the correction that almost all the sources I looked at 
> suggest. You can't go around adjusting X^2 for just some values of the 
> data: the claim is that the adjusted statistic has a more accurate chisq 
> distribution under the null.
> 
> I think at this remove it does not matter what Yates suggested 
> (although if I were writing a textbook I would find out), especially as 
> the R documentation does not mention Yates.
> 

You're quite right that, for consistency, the correction should be
applied even in the silly example I gave. And, of course, one
should not be doing a chi-square test on silly examples.

Peter Ehlers

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] pbinom with size argument 0 (PR#8560)

2006-02-04 Thread ehlers


(Ted Harding) wrote:
> On 03-Feb-06 Peter Dalgaard wrote:
> 
>>(Ted Harding) <[EMAIL PROTECTED]> writes:
>>
>>
>>>On 03-Feb-06 [EMAIL PROTECTED] wrote:
>>>
>>>>Full_Name: Uffe Høgsbro Thygesen
>>>>Version: 2.2.0
>>>>OS: linux
>>>>Submission from: (NULL) (130.226.135.250)
>>>>
>>>>
>>>>Hello all.
>>>>
>>>>  pbinom(q=0,size=0,prob=0.5)
>>>>
>>>>returns the value NaN. I had expected the result 1. In fact any
>>>>value for q seems to give an NaN.
>>>
>>>Well, "NaN" can make sense since "q=0" refers to a single sampled
>>>value, and there is no value which you can sample from "size=0";
>>>i.e. sampling from "size=0" is a non-event. I think the probability
>>>of a non-event should be NaN, not 1! (But maybe others might argue
>>>that if you try to sample from an empty urn you necessarily get
>>>zero "successes", so p should be 1; but I would counter that you
>>>also necessarily get zero "failures" so q should be 1. I suppose
>>>it may be a matter of whether you regard the "r" of the binomial
>>>distribution as referring to the "identities" of the outcomes
>>>rather than to how many you get of a particular type. Hmmm.)
>>>
>>>
>>>>Note that
>>>>
>>>>  dbinom(x=0,size=0,prob=0.5)
>>>>
>>>>returns the value 1.
>>>
>>>That is probably because the .Internal code for pbinom may do
>>>a preliminary test for "x >= size". This also makes sense, for
>>>the cumulative p for any  with a finite range,
>>>since the answer must then be 1 and a lot of computation would
>>>be saved (likewise returning 0 when x < 0). However, it would
>>>make even more sense to have a preceding test for "size<=0"
>>>and return NaN in that case since, for the same reasons as
>>>above, the result is the probability of a non-event.
>>
>>Once you get your coffee, you'll likely realize that you got
>>your p's and d's mixed up...
> 
> 
> You're right about the mix-up! (I must mend the pipeline.)
> 
> 
>>I think Uffe is perfectly right: The result of zero experiments will
>>be zero successes (and zero failures) with probability 1, so the
>>cumulative distribution function is a step function with one step at
>>zero ( == as.numeric(x>=0) ).
> 
> 
> I'm perfectly happy with this argument so long as it leads to
> dbinom(x=0,size=0,prob=p)=1 and also pbinom(q=0,size=0,prob=p)=1
> (which seems to be what you are arguing too). And I think there
> are no traps if p=0 or p=1.
> 
> 
>>>(But it depends on your point of view, as above ... However,
>>>surely the two  should be consistent with each other.)
> 
> 
> Ted.

I prefer a (consistent) NaN. What happens to our notion of a
Binomial RV as a sequence of Bernoulli RVs if we permit n=0?
I have never seen (nor contemplated, I confess) the definition
of a Bernoulli RV as anything other than some dichotomous-outcome
one-trial random experiment. Not n trials, where n might equal zero,
but _one_ trial. I can't see what would be gained by permitting a
zero-trial experiment. If we assign probability 1 to each outcome,
we have a problem with the sum of the probabilities.

Peter Ehlers
> 
> 
> E-Mail: (Ted Harding) <[EMAIL PROTECTED]>
> Fax-to-email: +44 (0)870 094 0861
> Date: 03-Feb-06   Time: 15:07:49
> -- XFMail --
> 
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] dump (PR#9282)

2006-10-09 Thread ehlers
Full_Name: Ricardo Ehlers
Version: 2.4.0
OS: Linux
Submission from: (NULL) (200.138.34.134)


The dump function is outputting something different from previous versions. For
example

> a <- c(1,2,3); dump('a',file='test')

now results in `a` <- c(1,2,3) instead of "a" <- c(1,2,3) in the file 'test'

This prevents using the bayesmix package and running JAGS from within R in an
automatic fashion.
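
A minimal sketch of the change (file name arbitrary):

  a <- c(1, 2, 3)
  dump("a", file = "test")
  readLines("test")   # the first line now begins with a backquoted name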

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] buglet in plot.lm (PR#9333)

2006-11-02 Thread ehlers
In version 2.2.0, plot.lm() was enhanced with two new plots
(thanks, Martin and John).

There's a small fix needed for 'which=6' when 'id.n=0'.

 > plot(lm(rnorm(10) ~ 1), which = 6, id.n = 0)
Error in plot.lm(lm(rnorm(10) ~ 1), which = 6, id.n = 0) :
 could not find function "text.id"

Fixed by inserting braces in plot.lm.R.
Replace the code starting 8 lines from the end:

if (id.n > 0)
    show.r <- order(-cook)[iid]
text.id(g[show.r], cook[show.r], show.r)

with

if (id.n > 0) {
    show.r <- order(-cook)[iid]
    text.id(g[show.r], cook[show.r], show.r)
}


 > version
_
platform   i386-pc-mingw32
arch   i386
os mingw32
system i386, mingw32
status Patched
major  2
minor  4.0
year   2006
month  10
day29
svn rev39744
language   R
version.string R version 2.4.0 Patched (2006-10-29 r39744)

Peter Ehlers

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Typo(s) in proc.time.Rd and comment about ?proc.time (PR#8092)

2005-08-24 Thread P Ehlers
David,

Re: "of the order of ..." vs "on the order of ..."

My Oxford dictionary has "of (or 'in' or 'on') the order of",
so that there is no correction needed in proc.time.Rd. And, yes,
I do hear 'of', at least on the northern side of the 49th
parallel. But please don't let these comments discourage
further documentation improvement suggestions.

Peter Ehlers
U. of Calgary (Canada)

[EMAIL PROTECTED] wrote:
> [EMAIL PROTECTED] wrote:
> 
>>On Wed, 24 Aug 2005 [EMAIL PROTECTED] wrote:
> 
> 
>>>I just downloaded the file
>>>
>>>ftp://ftp.stat.math.ethz.ch/Software/R/R-devel.tar.gz
>>>
>>>and within proc.time.Rd, the second paragraph of the \value
>>>section contains a typo:
> 
> 
>>I believe your understanding of the English language is different from the 
>>author here, who is English.  (You on the other hand seem to think there 
>>is no need to give your country in your address when writing an address in 
>>Denmark.)  The preferred language for R documentation is English (and not 
>>American).
> 
> 
>>>The resolution of the times will be system-specific; it is common for
>>>them to be recorded to of the order of 1/100 second, and elapsed [...]
>>>^
>>>
>>>I'd say replacing "to of" with just "of" would grammatically
>>>fix the sentence.
> 
> 
>>I'd say it was correct and your correction is incorrect.  In English we 
>>say `recorded to 1/100th of a second', not `recorded 1/100th second'.
> 
> 
> The correction was incorrect, but so was the original.  I've never
> heard the expression "of the order of"; common usage (in English or
> American, as far as I know) is "on the order of".  Your "recorded to
> 1/100th of a second" is also ok.
> 
> 
>>>Second, the \note{} section for Unix-like machines reads:
>>>
>>>It is possible to compile \R without support for \code{proc.time},
>>>when the function will throw an error.
>>>
>>>I believe this is ungrammatical and suggest replacing
>>>"when the function will throw an error" with "in which
>>>case the function will throw an error".
> 
> 
>>Again, the statement given is the intended meaning.
> 
> 
> I think more clear might be, "it is possible to compile R without
> support for proc.time, when the function *would* throw an error".
> 
> -- Dave
>

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] download problem: Windows binaries - patched/devel versions

2005-10-24 Thread P Ehlers
Lately, I get a "page cannot be found" error (IE 6.0, Windows XP)
when I try to download the following:

http://cran.us.r-project.org/R-2.2.0pat-win32.exe
http://cran.us.r-project.org/R-2.3.0dev-win32.exe

(Same for all other mirrors (about 6) that I've tried.)
Could someone please check if these files are accessible to them?
I'm guessing there's something haywire in my setup.

Note that I can ftp the files without problem.

-- 
Peter Ehlers
Department of Mathematics and Statistics
University of Calgary, 2500 University Dr. NW   ph: 403-220-3936

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Problem with Rchtml.dll in R-2.3.0dev-win32.exe?

2005-10-24 Thread P Ehlers
Not really a problem, but I can't figure this out:
(OS: Windows XP SP2)

I have both R-patched ("R version 2.2.0, 2005-10-21")
and R-devel ("R version 2.3.0, 2005-10-21") installed.
For both, I set options(chmhelp = TRUE).
R-patched has no problems, but in R-devel, attempts to access
help result in a Windows error message:

"The procedure entry point [EMAIL PROTECTED] could not be located
in the dynamic link library hhctrl.ocx."

with an R error message:

Error in dyn.load(x, as.logical(local), as.logical(now)) :
  unable to load shared library
'C:/StatisticsPrograms/R/Rdevel/bin/Rchtml.dll':
  LoadLibrary failure:  The specified procedure could not be found.

Replacing Rchtml.dll in \bin with that from R-patched fixes the
problem.

A hex file compare claims the two versions of Rchtml.dll are
identical, except for the creation dates.

Any ideas?


Peter

-- 
Peter Ehlers
University of Calgary

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] download problem: Windows binaries - patched/devel versions

2005-10-24 Thread P Ehlers
Thanks for the quick response.

-Peter

Prof Brian Ripley wrote:
> The addresses are wrong.  We know there are some incorrect links due to 
> mirroring errors, but
> 
> http://cran.r-project.org/bin/windows/base/R-2.2.0pat-win32.exe
> 
> etc are there.
> 
> On Mon, 24 Oct 2005, P Ehlers wrote:
> 
>> Lately, I get a "page cannot be found" error (IE 6.0, Windows XP)
>> when I try to download the following:
>>
>> http://cran.us.r-project.org/R-2.2.0pat-win32.exe
>> http://cran.us.r-project.org/R-2.3.0dev-win32.exe
>>
>> (Same for all other mirrors (about 6) that I've tried.)
>> Could someone please check if these files are accessible to them?
>> I'm guessing there's something haywire in my setup.
>>
>> Note that I can ftp the files without problem.
> 
> 

-- 
Peter Ehlers
University of Calgary

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Yates' correction for continuity in chisq.test (PR#8265)

2005-10-30 Thread P Ehlers

[EMAIL PROTECTED] wrote:
> Full_Name: foo ba baz
> Version: R2.2.0
> OS: Mac OS X (10.4)
> Submission from: (NULL) (219.66.32.183)
> 
> 
> chisq.test(matrix(c(9,10,9,11),2,2))
> 
> Chi-square value must be 0, and, P value must be 0
> R does over correction
> 
> when | a d - b c | < n / 2 ,chi-sq must be 0
> 

(Presumably, you mean P-value = 1.)
If you don't want the correction, set correct=FALSE. (The
results won't differ much.)

A better example is

  chisq.test(matrix(c(9,10,9,10),2,2))

for which R probably should return X-squared = 0.
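
A quick sketch comparing the two calls (in R 2.2.0, as discussed):

  m <- matrix(c(9, 10, 9, 10), 2, 2)
  chisq.test(m)                   # Yates-corrected: X-squared > 0 here
  chisq.test(m, correct = FALSE)  # X-squared = 0, since observed equals expected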

Peter Ehlers

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Problem with fitdistr for gamma in R 2.2.0

2005-11-17 Thread P Ehlers
Gregor,

  fitdistr(otm, "gamma", method="L-BFGS-B")

works for me (on WinXP). Or you could specify "lower = 0".
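
That is (a sketch, with 'otm' as defined in your message below):

  library(MASS)
  fitdistr(otm, "gamma", method = "L-BFGS-B")              # converges here
  fitdistr(otm, "gamma", method = "L-BFGS-B", lower = 0)   # or with an explicit lower bound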

I no longer have 2.1.0 running, so I don't know why this
wasn't needed in 2.1.0.

"R version 2.2.0, 2005-10-24"
MASS version: 7.2-20

-peter

Gorjanc Gregor wrote:
> Dear R developers,
> 
> I have encountered strange behaviour of fitdistr for gamma in recent R
> build i.e. 2.2.0. I have attached the code for data at the end of this mail
> so you can reproduce the problem. In short, I am able to run fitdistr under
> 2.1.0 without problems, while I get the following error under 2.2.0
> (Version 2.2.0 Patched (2005-11-15 r36348))
> 
> 
>>fitdistr(otm, "gamma")
> 
> Error in densfun(x, parm[1], parm[2], ...) : 
> 'shape' must be strictly positive
> 
> The results on 2.1.1 (Version 2.1.1 (2005-06-20)) are
> 
> 
>>fitdistr(otm, "gamma")
> 
> shape   rate  
>   1.030667   0.189177 
>  (0.090537) (0.021166)
> 
> Platform: Windows XP
> 
> Thank you in advance for your effort on this remarkable tool!
> 
> Here is the data for above problem/results:
> 
> "otm" <-
> c(0.059610966029577, 0.0591496321922168, 0.14, 0.18, 0.24, 0.25, 
> 0.270071982912719, 0.270758049933706, 0.269911804412492, 0.280138451903593, 
> 0.279787947586738, 0.279429937571753, 0.3, 0.320746235495899, 
> 0.319553311037365, 0.51, 0.54, 0.56, 0.6, 0.609812622915953, 
> 0.609198293855879, 0.64, 0.69, 0.74, 0.76, 0.770972826186568, 
> 0.769288654833566, 0.78, 0.789181584270671, 0.78991363293305, 
> 0.8, 0.89, 0.900691718998831, 0.8991656800583, 0.92, 0.93, 0.94, 
> 1.01, 1.02, 1.13, 1.18, 1.26, 1.29, 1.33, 1.42, 1.43, 1.47, 1.47940529614314, 
> 1.47920716832764, 1.6, 1.61, 1.63, 1.68938231960637, 1.6894849291523, 
> 1.82, 1.88088044053270, 1.8792804789003, 1.89, 1.92, 2, 2.04, 
> 2.07, 2.12, 2.17, 2.18, 2.22, 2.23, 2.27, 2.28, 2.3, 2.32092240267433, 
> 2.31912300181622, 2.38, 2.39, 2.43, 2.46, 2.51, 2.52, 2.55, 2.56, 
> 2.61, 2.66091404781397, 2.6595832825806, 2.67, 2.7, 2.77, 2.8, 
> 2.81, 2.86, 2.87, 2.93, 3.01, 3.05, 3.14, 3.15, 3.17, 3.18, 3.24, 
> 3.26, 3.33, 3.44, 3.45, 3.52, 3.55, 3.63, 3.73, 3.9, 4, 4.01, 
> 4.04, 4.13, 4.15934497380769, 4.16094719917513, 4.3, 4.33, 4.34, 
> 4.66, 4.76, 4.82, 4.83, 4.89, 4.92, 5.06, 5.14, 5.16, 5.26, 5.31, 
> 5.36, 5.48, 5.66, 5.79, 5.8, 5.85, 5.87, 5.92952534468565, 5.92962284128508, 
> 6.04, 6.11, 6.13, 6.16, 6.19, 6.42, 6.66, 6.69, 7.11, 7.16, 7.29, 
> 7.3, 7.31, 7.33, 7.72, 7.82, 7.87, 7.91, 8.01, 8.17, 8.45, 8.49, 
> 8.73, 8.86, 8.95, 9, 9.05, 9.13, 9.22, 9.52, 9.82, 9.88, 9.91, 
> 9.99, 10.03, 10.4, 10.59, 10.83, 11.06, 11.64, 11.85, 12.02, 
> 12.4, 12.64, 12.96, 13.44, 14.06, 14.07, 14.37, 15.4, 15.6, 15.92, 
> 16.23, 16.6, 16.97, 17.06, 17.8, 18.69, 18.73, 19.2, 19.51, 19.54, 
> 20.57, 21.05, 22.23, 27.02)
> 
> Lep pozdrav / With regards,
> Gregor Gorjanc
> 
> --
> University of Ljubljana
> Biotechnical FacultyURI: http://www.bfro.uni-lj.si/MR/ggorjan
> Zootechnical Department mail: gregor.gorjanc  bfro.uni-lj.si
> Groblje 3   tel: +386 (0)1 72 17 861
> SI-1230 Domzale fax: +386 (0)1 72 17 888
> Slovenia, Europe
> ------
> "One must learn by doing the thing; for though you think you know it,
>  you have no certainty until you try." Sophocles ~ 450 B.C.
> 
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel

Peter Ehlers
University of Calgary

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Problem with fitdistr for gamma in R 2.2.0

2005-11-17 Thread P Ehlers
I think the problem may lie with fitdistr().
Specifically, replacing the code in fitdistr.R (VR_7.2-20)
(line 137 to end) with the code in VR_7.2-8 (line 92 to end)
seems to handle

   fitdistr(otm, "gamma")

just fine. But I haven't done much testing.

Peter Ehlers

Peter Dalgaard wrote:
> P Ehlers <[EMAIL PROTECTED]> writes:
> 
> 
>>Gregor,
>>
>>  fitdistr(otm, "gamma", method="L-BFGS-B")
>>
>>works for me (on WinXP). Or you could specify "lower = 0".
> 
> 
> The really odd thing is that it even works with
> 
> 
>> fitdistr(otm, "gamma",lower=-Inf)
> 
>  shape rate
>   1.03081094   0.18924370
>  (0.09055117) (0.02117350)
> 
> or even
> 
> 
>> fitdistr(otm, "gamma",upper=Inf)
> 
>  shape rate
>   1.03081094   0.18924370
>  (0.09055117) (0.02117350)
> 
> 
> Also
> 
> 
>> fitdistr(otm, "gamma",control=list(parscale=c(.1,.1)))
> 
>  shape rate
>   1.03079500   0.18923897
>  (0.09055106) (0.02117363)
> 
> and quite amusingly:
> 
> 
> 
>> fitdistr(otm, "gamma",method="BFGS",lower=0)
> 
>  shape rate
>   1.03081096   0.18924371
>  (0.09055118) (0.02117350)
> Warning message:
> bounds can only be used with method L-BFGS-B in: optim(x = 
> c(0.059610966029577, 0.0591496321922168, 0.14, 0.18,
> 
>> fitdistr(otm, "gamma",method="CG",lower=0)
> 
>  shape rate
>   1.03081096   0.18924371
>  (0.09055118) (0.02117350)
> Warning message:
> bounds can only be used with method L-BFGS-B in: optim(x = 
> c(0.059610966029577, 0.0591496321922168, 0.14, 0.18,
> 
> whereas the same calls without the dysfunctional lower= give the
> warning about `shape` needing to be positive.
> 
> This probably all indicates that something inside optim() is broken.
> 
> 
>  
> 
>>I no longer have 2.1.0 running, so I don't know why this
>>wasn't needed in 2.1.0.
>>
>>"R version 2.2.0, 2005-10-24"
>>MASS version: 7.2-20
>>
>>-peter
>>
>>Gorjanc Gregor wrote:
>>
>>>Dear R developers,
>>>
>>>I have encountered strange behaviour of fitdistr for gamma in recent R
>>>build i.e. 2.2.0. I have attached the code for data at the end of this mail
>>>so you can reproduce the problem. In short, I am able to run fitdistr under
>>>2.1.0 without problems, while I get the following error under 2.2.0
>>>(Version 2.2.0 Patched (2005-11-15 r36348))
>>>
>>>
>>>
>>>>fitdistr(otm, "gamma")
>>>
>>>Error in densfun(x, parm[1], parm[2], ...) : 
>>>'shape' must be strictly positive
>>>
>>>The results on 2.1.1 (Version 2.1.1 (2005-06-20)) are
>>>
>>>
>>>
>>>>fitdistr(otm, "gamma")
>>>
>>>shape   rate  
>>>  1.030667   0.189177 
>>> (0.090537) (0.021166)
>>>
>>>Platform: Windows XP
>>>
>>>Thank you in advance for your effort on this remarkable tool!
>>>
>>>Here is the data for above problem/results:
>>>
>>>"otm" <-
>>>c(0.059610966029577, 0.0591496321922168, 0.14, 0.18, 0.24, 0.25, 
>>>0.270071982912719, 0.270758049933706, 0.269911804412492, 0.280138451903593, 
>>>0.279787947586738, 0.279429937571753, 0.3, 0.320746235495899, 
>>>0.319553311037365, 0.51, 0.54, 0.56, 0.6, 0.609812622915953, 
>>>0.609198293855879, 0.64, 0.69, 0.74, 0.76, 0.770972826186568, 
>>>0.769288654833566, 0.78, 0.789181584270671, 0.78991363293305, 
>>>0.8, 0.89, 0.900691718998831, 0.8991656800583, 0.92, 0.93, 0.94, 
>>>1.01, 1.02, 1.13, 1.18, 1.26, 1.29, 1.33, 1.42, 1.43, 1.47, 
>>>1.47940529614314, 
>>>1.47920716832764, 1.6, 1.61, 1.63, 1.68938231960637, 1.6894849291523, 
>>>1.82, 1.88088044053270, 1.8792804789003, 1.89, 1.92, 2, 2.04, 
>>>2.07, 2.12, 2.17, 2.18, 2.22, 2.23, 2.27, 2.28, 2.3, 2.32092240267433, 
>>>2.31912300181622, 2.38, 2.39, 2.43, 2.46, 2.51, 2.52, 2.55, 2.56, 
>>>2.61, 2.66091404781397, 2.6595832825806, 2.67, 2.7, 2.77, 2.8, 
>>>2.81, 2.86, 2.87, 2.93, 3.01, 3.05, 3.14, 3.15, 3.17, 3.18, 3.24, 
>>>3.26, 3.33, 3.44, 3.45, 3.52, 3.55, 3.63, 3.73, 3.9, 4, 4.01, 
>>>4.04, 4.13, 4.15934497380769, 4.16094719917513, 4.3, 4.33, 4.34, 
>>>4.66, 4.76, 4.82, 4.83, 4.89, 4.92, 5.06, 5.14, 5.16, 5.26, 5.31, 
>>>5.36, 5.48, 5

Re: [Rd] Matrix (PR#8321)

2005-11-17 Thread P Ehlers
Assuming you're on Windows (you didn't say), it looks
like the PACKAGES file in /.../contrib/2.2/ has two
entries for Matrix. Perhaps that's the problem.

Peter


[EMAIL PROTECTED] wrote:

> It appears to me that the new version of the package Matrix will not load to
> R-2.2.0.
> 
> Respectfully,
> 
> Frank Lawrence
> 
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel

-- 
Peter Ehlers
Department of Mathematics and Statistics
University of Calgary, 2500 University Dr. NW
Calgary, Alberta  T2N 1N4, CANADA

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Matrix (PR#8321)

2005-11-17 Thread P Ehlers
Actually, my guess about PACKAGES was wrong. I just removed Matrix
and re-installed Matrix_0.99-2 (Rgui: Packages menu) from CRAN and
had no problems.

You'll have to be more explicit about "will not load".

Peter


Cougar Lawrence wrote:

> Thanks for the reply.  I am using windows.  I tried both packages.  The
> directions under Matrix indicate that package 99.2 is current.  It is the
> one that will not load.
> 
> Respectfully,
>  
> Frank Lawrence
> 
> 
> -Original Message-
> From: P Ehlers [mailto:[EMAIL PROTECTED] 
> Sent: Thursday, November 17, 2005 16:53
> To: [EMAIL PROTECTED]
> Cc: r-devel@stat.math.ethz.ch; [EMAIL PROTECTED]
> Subject: Re: [Rd] Matrix (PR#8321)
> 
> 
> Assuming you're on Windows (you didn't say), it looks
> like the PACKAGES file in /.../contrib/2.2/ has two
> entries for Matrix. Perhaps that's the problem.
> 
> Peter
> 
> 
> [EMAIL PROTECTED] wrote:
> 
> 
>>It appears to me that the new version of the package Matrix will not 
>>load to R-2.2.0.
>>
>>Respectfully,
>>
>>Frank Lawrence
>>
>>__
>>R-devel@r-project.org mailing list 
>>https://stat.ethz.ch/mailman/listinfo/r-devel
> 
> 

-- 
Peter Ehlers
Department of Mathematics and Statistics
University of Calgary, 2500 University Dr. NW

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] "integrate" error using a constant function (PR#8348)

2005-11-25 Thread P Ehlers
Did you check the examples on the help page for integrate?

integrand <- function(x) rep(5, length(x))

should do it. Definitely not a bug.
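
A quick check of the vectorized version:

  integrand <- function(x) rep(5, length(x))
  integrate(integrand, lower = 1, upper = 5)   # 20, the 5 * (5 - 1) rectangle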

Peter

[EMAIL PROTECTED] wrote:

> Full_Name: Joel Franklin
> Version: 2.2.0
> OS: WinXP-Prof
> Submission from: (NULL) (63.226.223.22)
> 
> 
> The "integrate" function, when evaluating an integrand function that is 
> constant
> (therefore not a function of the integral) cannot be valuated, and instead
> throws an error. Instead, the interate function should evaluate the constant
> function as a rectangular area with length (upper-lower).
> 
> For example:
> 
> integrand<-function(x){5}
> integrate(f=integrand,lower=1,upper=5)
> 
> "Error in integrate(f = integrand, lower = 1, upper = 5) : 
> evaluation of function gave a result of wrong length"
> 
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel

-- 
Peter Ehlers
Department of Mathematics and Statistics
University of Calgary, 2500 University Dr. NW   ph: 403-220-3936
Calgary, Alberta  T2N 1N4, CANADA  fax: 403-282-5150

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] suggestion: link oneway.test on kruskal.test page

2005-12-20 Thread P Ehlers
DevelopeRs:

I think it might be useful to add a link to oneway.test() on
the kruskal.test() help page.

("R version 2.3.0, 2005-12-09")

Peter Ehlers
U of Calgary

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Windows crash in confint() with nls fit (PR#8428)

2005-12-22 Thread P Ehlers
You don't actually need Ben's example. The problem occurs
also for the first example in ?nls if algorithm = "port" is
used.
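
A sketch of such a reproduction (assuming the DNase fit that opens the ?nls
examples; the precise example may differ by version):

  DNase1 <- subset(DNase, Run == 1)
  fm1 <- nls(density ~ SSlogis(log(conc), Asym, xmid, scal),
             data = DNase1, algorithm = "port")
  confint(fm1)   # triggers the failure described above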

Peter Ehlers

[EMAIL PROTECTED] wrote:

> Duncan Murdoch wrote:
> 
>>I've found the problem, but someone who knows more about nls() will have 
>>to fix it.
>>
>>The problem is that in the demo code below, n1 ends up being an nls 
>>object, but n1$call$control is NULL.  profiler.nls() assumed that the 
>>nls object passed to it has a non-NULL element there, and doesn't check.
>>
>>I've fixed the code so now it doesn't crash, but it now dies with this 
>>error instead:
>>
>> > confint(n1)  ## boom
>>Waiting for profiling to be done...
>>Error in prof$getProfile() : 'control$maxiter' absent
>>
>>I'll commmit my change to R-devel and R-patched shortly.
>>
>>Duncan Murdoch
>>
> 
> 
>thank you for the quick response!
> 
> actually, I discovered I'm wrong -- bug affects Linux as well,
> gives a segmentation fault
> (I must have tried it without the algorithm="port" argument by
> accident.)  I've looked at the code but I regretfully concur
> that someone else will have to work on this -- I can hack nls
> so it reinserts a "control" element in n1$call, but $tol
> and $minFactor are explicitly deleted from the control list,
> and so we only get one step farther.  I don't know what assumptions
> nls_iter is really making and whether it's safe to make them
> when the port algorithm is being used or not ...
> 
>My best guess at this point, poking around, is that profiler.nls
> in src/library/stats/R/nls-profile.R has to be changed in parallel
> with nls to call port_nlsb instead of nls_iter when the port
> algorithm is being used, but this is all getting a bit deep for
> me ...
> 
> Ben
> 
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel

-- 
Peter Ehlers
Department of Mathematics and Statistics
University of Calgary

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] read.table / type.convert with NA values

2010-06-29 Thread Peter Ehlers

Is there a compelling reason to have strip.white default
to FALSE? It seems to me that it would be more common to
want the TRUE case.
Having said that, I must confess that I've never had the
problem Erik describes.
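
For concreteness, a small sketch of the case Erik describes below (a
hypothetical two-column CSV):

  txt <- "x,y\n NA,4.5\n1.2, NA"
  str(read.csv(textConnection(txt)))                      # ' NA' not treated as missing; columns not numeric
  str(read.csv(textConnection(txt), strip.white = TRUE))  # numeric columns with real NAs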

  -Peter Ehlers

On 2010-06-29 17:14, Matt Shotwell wrote:

The document RFC 4180 (which appears to be the CSV standard used by R,
see ?read.table) considers spaces to be part of the field value. Some
have taken this to mean that all white space characters should be
considered part of the field value, though the RFC is not explicit
here. Hence, this behavior is in compliance with the "standard" for CSV
files. It seems that R treats '\t' (and perhaps all?) separated value
files the same way by default.

The RFC is very short and easy to read if you're interested.
http://tools.ietf.org/html/rfc4180

-Matt

On Tue, 2010-06-29 at 16:41 -0400, Erik Iverson wrote:

Hello,

While assisting a fellow R-helper off list, I narrowed down an issue he
was having to the following behavior of type.convert, called through
read.table.  This is using R 2.10.1, if newer versions don't exhibit
this behavior, apologies.

# generates numeric vector
  >  type.convert(c("123.42", "NA"))
[1] 123.42 NA

# generates a numeric vector, notice the space before 123.42
  >  type.convert(c(" 123.42 ", "NA"))
[1] 123.42 NA

# generates factor, notice the space before NA
# note that the 2nd element is actually " NA", not a true NA value
  >  type.convert(c("123.42", " NA"))
[1] 123.42  NA
Levels: 123.42  NA


How can this affect read.table/read.csv use 'in the wild'?

This gentleman had a data file that was

1) delimited by something other than white space, CSV in his case
2) contained missing values, designated by NA in his case
3) contained white space between delimiters and data values, e.g.,

NA, NA,4.5,NA

as opposed to

NA,NA,4.5,NA


With these 3 conditions met, read.table gives type.convert a character
vector like my third example above, and ultimately he got a data.frame
consisting of only factors when we were expecting numeric columns.  This
was easily fixed either by modifying the read.csv function call to
specify colClasses directly, or in his case, strip.white = TRUE did the
job just fine.

I believe the confusion stems from the fact that with no NA values in
our data file, this would work as we would expect.  The introduction of
what we thought were NA values changed the behavior.  In reality, these
were not being treated as NA values by read.table/type.convert.  The
question is, should they be in this case?

This behavior of read.table/type.convert may very well be what is
expected/needed.  If so, this note could still be of use to someone in
the future if they stumble upon similar behavior.  The fact I wasn't
able to uncover anyone who asked about it on list before probably means
the situation is rare.

Best Regards,
Erik Iverson



__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Looks like a bug in subsetting of a complicated object

2010-09-01 Thread Peter Ehlers

On 2010-09-01 16:22, Kjetil Halvorsen wrote:

I don't understand what is happening! I have a (large) object sim1, a
matrix list
with dim c(101,101) where each element is a 3*3 matrix. I am
subsetting that with
a matrix coo, of dim c(100,2), of unique indices, but the resulting object
has length 99, not 100 as expected.

Code reproducing the problem follows:


library(RandomFields)

set.seed(123)
sim0 <- GaussRF(x=seq(0, 100, by=1),
                y=seq(0, 100, by=1),
                grid=TRUE, model="spherical",
                param=c(0, 1, 0, 10), trend=NULL, n=9, gridtriple=FALSE)

simmatrices <- function(arr)  # arr must be an array of rank 3
{
  d <- dim(arr)
  n1 <- d[1]; n2 <- d[2]  # we suppose d[3]==9
  res <- vector(length=n1*n2, mode="list")
  dim(res) <- c(n1, n2)
  for (i in 1:n1) for (j in 1:n2) {
    x1 <- arr[i, j, 1:3]; x2 <- arr[i, j, 4:6]
    x3 <- arr[i, j, 7:9]
    res[[i, j]] <- x1 %o% x1 + x2 %o% x2 + x3 %o% x3
  }
  res
}

sim1 <- simmatrices(sim0)

set.seed(234)
x <- sample(seq(0, 100, by=1), 101, replace=TRUE)
y <- sample(seq(0, 100, by=1), 101, replace=TRUE)
coo <- cbind(x=x, y=y)
coo <- unique(coo)

sim1.obs <- sim1[coo]

dim(coo)
length(sim1.obs)


One of the values in coo is zero.
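
A quick check, given the objects from your script:

  any(coo == 0)   # TRUE; rows of an index matrix that contain a zero are ignored (see ?Extract)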

  -Peter Ehlers





sessionInfo()

R version 2.11.1 (2010-05-31)
i686-pc-linux-gnu

locale:
  [1] LC_CTYPE=en_US.utf8   LC_NUMERIC=C
  [3] LC_TIME=en_US.utf8LC_COLLATE=en_US.utf8
  [5] LC_MONETARY=C LC_MESSAGES=en_US.utf8
  [7] LC_PAPER=en_US.utf8   LC_NAME=C
  [9] LC_ADDRESS=C  LC_TELEPHONE=C
[11] LC_MEASUREMENT=en_US.utf8 LC_IDENTIFICATION=C

attached base packages:
[1] stats graphics  grDevices utils datasets  methods   base

other attached packages:
[1] lattice_0.18-8  gstat_0.9-69sp_0.9-66
[4] RandomFields_1.3.41

loaded via a namespace (and not attached):
[1] fortunes_1.3-7 grid_2.11.1tools_2.11.1




Kjetil



__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Assignment of individual values to data frame columns: intentional or unintentional behavior?

2010-09-02 Thread Peter Ehlers

On 2010-08-05 12:14, Ulrike Grömping wrote:

Gabor Grothendieck schrieb:

On Thu, Aug 5, 2010 at 12:24 PM, Ulrike Grömping
  wrote:


Dear developeRs,

I have just discovered a strange feature when assigning some values to
columns of a data frame: The column is matched by partial matching (as
documented), but when assigning a value, a new column with the partial name
is added to the data frame that is identical to the original column except
for the changed value. Is that intentional ? An example:



Note that the lack of partial matching when performing assignment is
also documented.

See second last paragraph in Details section of ?Extract


Yes, I see, thanks. I looked at ?"[.data.frame", where this is not
documented.

However, given the documentation that partial matching is not used on
the left-hand side, I would have expected even more that the assignment

sw$Fert[1] <- 10

works differently, because I am using it on the left-hand side.
Probably, extraction ([1]) is done first here, so that the right-hand
side won. At least, this is very confusing.

Best, Ulrike


This is another example of why it's a good idea to avoid
the '$' notation when fiddling with data frames. Try this:

 sw <- swiss[1:5, 1:4]
 sw[["Fert"]]
 sw[["Fert"]] <- 10

and my preferred version:
 sw[, "Fert"]
 sw[, "Fert"] <- 10

I've never liked partial matching for data frames.

  -Peter Ehlers

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] postscript failure manifests in plot.TukeyHSD

2010-12-14 Thread Peter Ehlers

On 2010-12-14 09:27, Ben Bolker wrote:

Jari Oksanen  oulu.fi>  writes:



Hello R Developers,

Dear R-developers,

I ran some standard tests with currently (today morning) compiled R release
candidate in Linux R 2.12.1 RC (2010-12-13 r53843). Some of these tests used
plot.TukeyHSD function. This worked OK on the screen (X11 device), but
PostScript file could not be rendered. The following example had the problem
with me:

postscript(file="tukeyplot.ps")
example(plot.TukeyHSD)
dev.off()

I couldn't view the resulting file with evince in Linux nor in the standard
Preview in MacOS. When I compared the generated "tukeyplot.ps" to the same
file generated with an older R in my Mac, I found one difference:

$ diff -U2 oldtukeyplot.ps /Volumes/TIKKU/tukeyplot.ps
--- oldtukeyplot.ps2010-12-14 12:06:07.0 +0200
+++ /Volumes/TIKKU/tukeyplot.ps2010-12-14 12:13:32.0 +0200
@@ -172,5 +172,5 @@
  0 setgray
  0.00 setlinewidth
-[ 3.00 5.00] 0 setdash
+[ 0.00 0.00] 0 setdash
  np
  660.06 91.44 m

Editing the changed line to its old value "[ 3.00 5.00] 0 setdash" also
fixed the problem both in Linux and in Mac. Evidently something has changed,
and probably somewhere else than in plot.TukeyHSD (which hasn't changed
since r51093 in trunk and never in R-2-12-branch). I know nothing about
PostScript so that I cannot say anything more (and I know viewers can fail
with standard conforming PostScript but it is a bit disconcerting that two
viewers fail when they worked earlier).


   I must really be avoiding work today ...

   I can diagnose this (I think) but don't know the best way to
solve it.

   At this point, line widths on PDF devices were allowed to be < 1.

==
r52180 | murrell | 2010-06-02 23:20:33 -0400 (Wed, 02 Jun 2010) | 1 line
Changed paths:
M /trunk/NEWS
M /trunk/src/library/grDevices/src/devPS.c

allow lwd less than 1 on PDF device
==

   The behavior of PDF devices (by experiment) is to draw a 0-width
line as 1 pixel wide, at whatever resolution is currently being
rendered.  On the other hand, 0-width lines appear to break PostScript.
(with the Linux viewer 'evince' I get warnings about "rangecheck -15"
when trying to view such a file).

   plot.TukeyHSD  contains the lines

abline(h = yvals, lty = 1, lwd = 0, col = "lightgray")
abline(v = 0, lty = 2, lwd = 0, ...)

   which are presumably meant to render minimum-width lines.

   I don't know whether it makes more sense to (1) change plot.TukeyHSD
to use positive widths (although that may not help: I tried setting
lwd=1e-5 and got the line widths rounded to 0 in the PostScript file);
(2) change the postscript driver to *not* allow line widths < 1 (i.e.,
distinguish between PS and PDF and revert to the pre-r52180 behaviour
for PS only).

   On reflection #2 seems to make more sense, but digging through devPS.c
it's not immediately obvious to me where/how in SetLineStyle or
PostScriptSetLineTexture one can tell whether the current driver
is PS or PDF ...


That may not do it. I find the same problem (fixed by
Jari's replacement of [ 0.00 0.00] with [ 3.00 5.00];
haven't tried anything else yet) when I use pdf()
instead of postscript().
This is on Vista.
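
For reference, the pdf() variant of Jari's test (a sketch):

  pdf("tukeyplot.pdf")
  example(plot.TukeyHSD)
  dev.off()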

Peter Ehlers

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] plotmath indices: suggested addition to help file

2011-01-21 Thread Peter Ehlers

On 2011-01-21 02:27, Martin Maechler wrote:

Thank you, Claudia,


"CB" == Claudia Beleites
 on Thu, 20 Jan 2011 14:05:41 +0100 writes:


 CB>  Dear all, I just stumbled over the fact that subsetting
 CB>  by square bracket will only output the first given
 CB>  index. I guess the rest is thrown away by the CADDR in
 CB>  RenderSub (plotmath.c l. 1399).  Maybe changing this
 CB>  could be considered as "low-priority desired" (would be
 CB>  nice if the output works for ?

I agree this is a  ``missing feature'' and well worth wish list item.

 CB>  However, I suggest to announce the fact that only the
 CB>  first parameter is printed in plotmath.Rd.

 CB>  E.g. in the table l. 72
 CB>   \code{x[i]} \tab x subscript i;  escape further indices (\code{x ["i, 
j"]})\cr

How would one get the equivalent of LaTeX x_{i_1, j_2}?
Not by making it a string (that's not  escape, I'd say),
but by something like

 plot(0, axes=FALSE, main= expression(paste(x[i[1]],{}[j[2]])))

which works +-
but of course is unnecessarily ugly, compared to the desired

 plot(0, axes=FALSE, main= expression(  x[i[1], j[2]]))



I don't know if I've ever disagreed with Martin's advice but,
unless I'm missing something, Claudia wants something done about
the second index in x[i, j] while Martin is talking about the
case of cascading subscripts in 'x_sub_i_sub_1' (as shown in his
LaTeX expression).

Both situations are nicely handled with the 'list()' and '[]'
constructs in plotmath:

  plot(0, axes=FALSE, main= expression( x[ list( i[1], j[2] ) ] ) )

To handle Claudia's wish, it might be desirable to have plotmath.c
automatically recognize such cases but I would consider that to
be (as Claudia says) in the 'nice if' category. Claudia's suggestion
for the help page could be handled by adding another example. Then
again, plotmath (not surprisingly) is like LaTeX in that, the more
you use it, the more you become familiar with the special constructs
needed to get the output you want. I still find myself scurrying to
?plotmath and scanning the Syntax/Meaning table quite frequently.

Peter Ehlers


Martin

 CB>  Claudia

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] possible minor doc clarification?

2011-04-25 Thread Peter Ehlers

On 2011-04-25 08:47, Sean O'Riordain wrote:

Good afternoon,

As a clarification does it make sense to remove the second 'not' in the 'See
Also' documentation for file_test ?


Both versions make sense to me; it's just a question of
whether we think of testing for x 'being a directory'
or for x 'not being a directory'.

The code (for the '-f' op) actually tests !isdir and
so the current wording reflects the code.
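
A small illustration of the distinction:

  file.exists(tempdir())       # TRUE: it exists
  file_test("-f", tempdir())   # FALSE: exists, but is a directory
  file_test("-d", tempdir())   # TRUE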

Peter Ehlers



Kind regards,
Sean O'Riordain

-
Index: src/library/utils/man/filetest.Rd
===
--- src/library/utils/man/filetest.Rd   (revision 55639)
+++ src/library/utils/man/filetest.Rd   (working copy)
@@ -35,7 +35,7 @@
  }
  \seealso{
\code{\link{file.exists}} which only tests for existence
-  (\code{test -e} on some systems) but not for not being a directory.
+  (\code{test -e} on some systems) but not for being a directory.

\code{\link{file.path}}, \code{\link{file.info}}
  }


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] New code in R-devel: Rao score test for glm.

2011-05-11 Thread Peter Ehlers

On 2011-05-11 07:30, peter dalgaard wrote:


On May 11, 2011, at 15:10 , Brett Presnell wrote:



Thanks for doing this Peter.  I'll have to install the development
version to try this out.

One suggestion though.  I'm pretty confident that plain old "score test"
is a more common terminology than anything involving Rao's name
(econometricians even call it the Lagrange multiplier test).  In light
of this, I think that it would be much better to use test = "score"
rather than test = "Rao".


It's not like that didn't cross my mind, in fact I started out that way, but...

- A column labeled "score" just looks odd, whereas there is some precedence for 
labeling tests according to authors (e.g. Pillai).
- CR Rao is still around, now 90 years of age, and having been taught from "Linear 
Statistical Inference with Applications", I thought paying a little homage would be 
appropriate
- At least the curator calls it "Rao score test": 
http://www.scholarpedia.org/article/Rao_score_test



Yes, thanks, Peter, for coding this test.
As to the name, my vote is for "Rao".
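
A sketch of the interface under discussion, using the Dobson (1990) example
from ?glm (and assuming an R-devel build with the new option):

  counts <- c(18, 17, 15, 20, 10, 20, 25, 13, 12)
  outcome <- gl(3, 1, 9)
  treatment <- gl(3, 3)
  fit <- glm(counts ~ outcome + treatment, family = poisson())
  anova(fit, test = "Rao")   # adds a column of Rao score statistics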

Peter Ehlers





--
Brett Presnell
Department of Statistics
University of Florida
http://www.stat.ufl.edu/~presnell/

"We don't think that the popularity of an error makes it the truth."
   -- Richard Stallman

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel




__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Tiny typo in help.Rd

2011-07-13 Thread Peter Ehlers

A recent quote by Bert Gunter from the Details section of help('help')
over on R-help has this (line 82 in help.Rd):

  character string.  There include those which cannot syntactically

where the word 'There' should be 'These'.

(still there in r56374)

Peter Ehlers

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] lattice's mai/mar analog

2009-10-02 Thread Peter Ehlers



Michael Ramati wrote:

hello,
is there a way to control figure margins using package lattice, similarly to
parameters mai/mar (which presumably work only for figures of package graphics)?
thanks!



Try it this way:

 library(lattice)
 trellis.device()
 trellis.par.set(list(
layout.widths  = list(left.padding = 8, right.padding = 9),
layout.heights = list(top.padding = 10, bottom.padding = 11)))

 xyplot(Sepal.Length ~ Petal.Length | Species, data = iris)

Use

 trellis.par.get()

and check the layout.heights/widths sections. You can adjust
the spacing between the labels and the axis, etc. The paddings
are ordered from extreme left to extreme right. Ditto for top
to bottom.

See

 ?trellis.par.set for how to set up the options.

 -Peter Ehlers








__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] names drop for columns from data.frame (PR#14002)

2009-10-24 Thread Peter Ehlers

Have you tried names(a[,1,drop=FALSE])?
Then have a look at help("[").

-Peter

ve...@clemson.edu wrote:

Full_Name: Francisco Vera
Version: 2.9.2
OS: Windows
Submission from: (NULL) (74.248.242.164)


Run the following commands:

a<-data.frame(x=1:2,y=3:4,row.names=c("i","j"))
names(a$x)
names(a[,1])

For names(a$x) I get NULL instead of c("i","j"). Same thing happens with
names(a[,1]). It works fine for rows, i.e., names(a[1,]) gives what it is
supposed to.

It also works fine for matrices. If you issue the commands
b<-matrix(1:4,ncol=2,dimnames=list(c("x","y"),c("i","j")))
names(b[,1])
names(b[1,])

Thanks

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel




--
Peter Ehlers
University of Calgary

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Request: bring back windows chm help support (PR#14034)

2009-11-01 Thread Peter Ehlers

Duncan Murdoch wrote:

On 31/10/2009 6:05 PM, alex...@4dscape.com wrote:

Full_Name: alex galanos
Version: 2.10.0
OS: windows vista
Submission from: (NULL) (86.11.78.110)


I respectfully request that the chm help support for windows, which 
was very
convenient, be reinstated...couldn't an online poll have been 
conducted to gauge

the support of this format by window's users?


First, I don't think that complaints are bugs.
Secondly, why not give the new format a chance. Personally, I
like it. Thanks, Duncan.

 -Peter Ehlers



I don't think it's going to come back, because nobody who knows how to 
bring it back wants to take on the work of maintaining it.  However,
what you might want to do is to contact one of the commercial providers 
of R, and ask them to reinstate it.  They're much more interested in 
market research than R Core is, because their customers pay them for 
their product.  They'd probably be happy to sell you an enhanced R 
supporting CHM help if they think there's a market for it.


Duncan Murdoch

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel




__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Request: bring back windows chm help support (PR#14034)

2009-11-01 Thread Peter Ehlers

alexios wrote:

Peter Ehlers wrote:

Duncan Murdoch wrote:

On 31/10/2009 6:05 PM, alex...@4dscape.com wrote:

Full_Name: alex galanos
Version: 2.10.0
OS: windows vista
Submission from: (NULL) (86.11.78.110)


I respectfully request that the chm help support for windows, which 
was very
convenient, be reinstated...couldn't an online poll have been 
conducted to gauge

the support of this format by window's users?


First, I don't think that complaints are bugs.
Secondly, why not give the new format a chance. Personally, I
like it. Thanks, Duncan.

 -Peter Ehlers

It was not a complaint but a simple request, which given the presence of 
a wishlist subdirectory I thought was appropriate to post. Apologies if 
it came across as such.


-Alexios Ghalanos


You're quite right, Alexios, and I do apologize. I think I
jumped too hastily. It was the 'couldn't an online poll..'
part that got to me. The devel version of 2.10 was out for
quite awhile and it seems that there was mostly silence on the
issue.

 -Peter Ehlers



I don't think it's going to come back, because nobody who knows how 
to bring it back wants to take on the work of maintaining it.  However,
what you might want to do is to contact one of the commercial 
providers of R, and ask them to reinstate it.  They're much more 
interested in market research than R Core is, because their customers 
pay them for their product.  They'd probably be happy to sell you an 
enhanced R supporting CHM help if they think there's a market for it.


Duncan Murdoch

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel










__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Request: bring back windows chm help support (PR#14034)

2009-11-01 Thread Peter Ehlers

Gabor Grothendieck wrote:

On Sun, Nov 1, 2009 at 5:55 PM, Duncan Murdoch  wrote:

What is it that you particularly liked about the CHM help?  One thing it did
well was the table of contents at the side, and the built-in search.  I
would like to get those back, in the HTML help.  Is there anything else?



The main thing I miss is the speed. On my Vista system with IE8 it
seems to come up slower than the chm based help.

Has anyone tried it with a text-based browser such as lynx to see if
it runs faster that way?  Is there an option to specify which browser
to use for help?


Gabor, am I right in assuming that the speed problem is for the
first time you use help? I start Firefox whenever I start R. I don't
see much of a speed hit after that.

 -Peter Ehlers


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel




__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] The "lib" argument in install.packages().

2009-11-24 Thread Peter Ehlers

Rolf,

If you want to avoid the warning, why not use lib=.libPaths()[1]?
This is not to say that your suggestions aren't useful.
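
That is, keeping your placeholder 'foo', something like:

  install.packages(foo, lib = .libPaths()[1])   # no warning, no hard-coded path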

Cheers,
Peter Ehlers

Rolf Turner wrote:


I was flummoxed for a long time by errors generated when I did
something like

install.packages(foo,lib="Rlib")

where ``Rlib'' is my personalized directory of R packages, which
lives in my home directory (from which I started R before issuing
the foregoing install.packages() call.

Recently someone (I forget who, but thanks very much to whomever
it was) pointed out that I needed to specify the complete pathname,
i.e. "/Users/rturner/Rlib" rather than the relative pathname "Rlib"
or "./Rlib" (which I'd also tried).  When the complete pathname is
given the install.packages() call works seamlessly.

Remark:  I have "/Users/rturner/Rlib" as the first entry of my .libPaths(),
so just doing install.packages(foo) works --- but this gives a warning
about lib not being specified, which I find irksome.

Questions:

(1) Why is it that the complete pathname of ``lib'' has to
be specified?  Cannot the code of install.packages() be
adjusted to work with relative pathnames?

(2) If indeed this is not possible, wouldn't it be kind and
helpful, to us young ( :-) ) and naive persons, to put an
indication in the help file for install.packages that the
complete pathname is required?

cheers,

Rolf Turner

P. S. > sessionInfo()
R version 2.10.0 (2009-10-26)
i386-apple-darwin8.11.1

locale:
[1] C

attached base packages:
[1] datasets  utils stats graphics  grDevices methods   base

other attached packages:
[1] misc_0.0-11fortunes_1.3-6 MASS_7.3-3



__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel




__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] `mgp[1:3]' are of differing sign (PR#14130)

2009-12-12 Thread Peter Ehlers


cornell.p.gonsch...@iem.fh-friedberg.de wrote:

Full_Name: Cornell Gonschior
Version: 2.10.0
OS: Linux
Submission from: (NULL) (212.201.28.40)


Hi,

in the introduction to R, you can find the following sentence in the par()
chapter:
"Use tck=0.01 and mgp=c(1,-1.5,0) for internal tick marks."
I thought that's nice, because I wanted to have tick marks and tick labels
inside and the axis title outside.

But:

plot(z, las=1, tck=0.01, mgp=c(1,-1.5,0))

Warning messages:
1: In plot.window(...) : `mgp[1:3]' are of differing sign
2: In plot.xy(xy, type, ...) : `mgp[1:3]' are of differing sign
3: In axis(side = side, at = at, labels = labels, ...) :
  `mgp[1:3]' are of differing sign
4: In axis(side = side, at = at, labels = labels, ...) :
  `mgp[1:3]' are of differing sign
5: In box(...) : `mgp[1:3]' are of differing sign
6: In title(...) : `mgp[1:3]' are of differing sign


par(las=1, tck=0.01, mgp=c(1,-1,0))

Warning message:
In par(las = 1, tck = 0.01, mgp = c(1, -1, 0)) :
  `mgp[1:3]' are of differing sign

Was there a recent change? I couldn't find anything useful searching the web.

Regards,
Cornell


Well, it's only a warning, making you aware of a possibly
unintended par setting. Warnings are good things but if you
don't want to see them, they can be suppressed.
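
For example (assuming the internal ticks and labels really are what you want):

  suppressWarnings(plot(z, las = 1, tck = 0.01, mgp = c(1, -1.5, 0)))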

Certainly not a bug.

 -Peter Ehlers


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel




--
Peter Ehlers
University of Calgary
403.202.3921

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] `mgp[1:3]' are of differing sign (PR#14130)

2009-12-12 Thread Peter Ehlers

Peter Dalgaard wrote:

Peter Ehlers wrote:


cornell.p.gonsch...@iem.fh-friedberg.de wrote:

Full_Name: Cornell Gonschior
Version: 2.10.0
OS: Linux
Submission from: (NULL) (212.201.28.40)


Hi,

in the introduction to R, you can find the following sentence in the par()
chapter:
"Use tck=0.01 and mgp=c(1,-1.5,0) for internal tick marks."
I thought that's nice, because I wanted to have tick marks and tick labels
inside and the axis title outside.

But:

plot(z, las=1, tck=0.01, mgp=c(1,-1.5,0))

Warning messages:
1: In plot.window(...) : `mgp[1:3]' are of differing sign
2: In plot.xy(xy, type, ...) : `mgp[1:3]' are of differing sign
3: In axis(side = side, at = at, labels = labels, ...) :
  `mgp[1:3]' are of differing sign
4: In axis(side = side, at = at, labels = labels, ...) :
  `mgp[1:3]' are of differing sign
5: In box(...) : `mgp[1:3]' are of differing sign
6: In title(...) : `mgp[1:3]' are of differing sign


par(las=1, tck=0.01, mgp=c(1,-1,0))

Warning message:
In par(las = 1, tck = 0.01, mgp = c(1, -1, 0)) :
  `mgp[1:3]' are of differing sign

Was there a recent change? I couldn't find anything useful searching the web.

Regards,
Cornell


Well, it's only a warning, making you aware of a possibly
unintended par setting. Warnings are good things but if you
don't want to see them, they can be suppressed.

Certainly not a bug.


Hmm, then again, I tend to agree with Cornell that there are a bit too 
many cases where mgp[1:3]' would sensibly have differing sign, compared 
to cases where it is a mistake. In addition to internal tick marks and 
labels, there are also cases where the whole axis is shifted into the 
plot area. I'd more likely use axis(pos=...) for that, but still.



And, I suppose, if R-Intro mentions it, it would be less confusing
if there were mention of the warning. But use of R conditions one to
seeing warnings as a good thing. Can't count the number of times
I've seen "longer ... is not a multiple of shorter ..." and having
it catch a silly user error.
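
The classic case being something like

  1:3 + 1:2
  ## [1] 2 4 4
  ## Warning message:
  ## In 1:3 + 1:2 : longer object length is not a multiple of shorter
  ## object length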

--
Peter Ehlers
University of Calgary
403.202.3921

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] bug in princomp example (PR#14167)

2009-12-24 Thread Peter Ehlers


twoutop...@gmail.com wrote:

When I run

example(princomp)

I get the following error message:

prncmp> ## The variances of the variables in the
prncmp> ## USArrests data vary by orders of magnitude, so scaling is
appropriate
prncmp> (pc.cr <- princomp(USArrests))  # inappropriate
Error in cov.wt(z) : 'x' must contain finite values only


Try rm(x) before you run the example; I think you might
have an 'x' hanging around in your workspace.
I had no trouble with the example run in a clean workspace.

> sessionInfo()
R version 2.10.1 Patched (2009-12-21 r50796)
i386-pc-mingw32

locale:
[1] LC_COLLATE=English_Canada.1252  LC_CTYPE=English_Canada.1252
[3] LC_MONETARY=English_Canada.1252 LC_NUMERIC=C
[5] LC_TIME=English_Canada.1252

attached base packages:
[1] stats graphics  grDevices utils datasets  methods   base

other attached packages:
[1] fortunes_1.3-6

loaded via a namespace (and not attached):
[1] tools_2.10.1
>
 -Peter Ehlers



Seth Roberts





--
Peter Ehlers
University of Calgary
403.202.3921

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] loess() crashes R on my system

2010-01-02 Thread Peter Ehlers

Dennis,

I have no problems with this on my DELL laptop (still running
Vista) with R version 2.10.1 Patched (2009-12-21 r50796).

 -Peter Ehlers

Dennis Murphy wrote:

Greetings and happy new year!

I am in the process of converting some of the old S-PLUS scripts from
Visualizing Data (Cleveland, 1993) into lattice. In fact, I did most of it
several years ago, and at the time, all of the scripts that contained
loess() worked fine. Tonight, I ran most of the scripts again, but every one
that I tried with a loess() call crashed R. I tried it in two sessions, one
with an existing .Rdata and in another directory without one. The call
below, isolated from its script, is one of the culprits:

with(ethanol, loess(NOx ~ C * E, span = 1/3, degree = 2,
parametric = "C", drop.square = "C", family="s"))

In response to the Posting Guide, when I say 'crash' I mean that the system
abnormally terminated R when this line or any other one that invokes loess()
is run. I'm running the binary version of 2.10.1 on a Dell Studio 15 with
Windows 7 Home Premium.

This is the first time I've ever experienced something like this in the
decade that I've used R. What's more surprising is that it occurred with
such a common function. Since I haven't seen any postings of this type, it's
probably something on my end, but I don't know what. Since it shut down the
program, I felt compelled to report it.

I am subscribed to r-help but not r-devel, so please cc any response to me.
Thanks in advance for any assistance.

Dennis Murphy


sessionInfo()

R version 2.10.1 (2009-12-14)
i386-pc-mingw32

locale:
[1] LC_COLLATE=English_United States.1252
[2] LC_CTYPE=English_United States.1252
[3] LC_MONETARY=English_United States.1252
[4] LC_NUMERIC=C
[5] LC_TIME=English_United States.1252

attached base packages:
[1] stats graphics  grDevices utils datasets  methods   base

loaded via a namespace (and not attached):
[1] tools_2.10.1


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel




--
Peter Ehlers
University of Calgary
403.202.3921

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] log normal overlay

2010-01-04 Thread Peter Ehlers
.80,-65.80,-52.70,-50.20,-64.10,-62.50,-60.40,-55.80,-59.70,-55.80,-59.70,-38.73,-32.09,-37.45,-36.82,-15.27,-27.18,-22.45,-29.09,-51.55,-21.82,-29.55,-31.18,-30.09,-42.27,-39.91,-39.36,-33.73,-46.55,-51.73,-48.55,-46.36,-42.41,-50.31,-40.64,-28.73,17.81,-18.83,-37.83,-33.64,-19.19,-12.83,-48.92,-20.10,-27.01,-33.55,-17.19,-27.31,-29.01,-30.01,-31.41,-39.00,-45.92,-44.76,-47.35,-72.68,-63.45,-56.63,18.33,-25.20,-46.26,-30.33,-30.71,-19.79,-22.54,-5.13,9.02,-19.68,-7.18,-8.22,-17.13,-15.61,-20.18,-21.70,-14.35,-16.31,-49.09,-38.80,-43.93,-35.34,-43.57,-30.58,-7.39,3.21,-1.68,-22.23,-16.06,-18.12,12.40,-37.51,-9.94,-26.96,-14.88,-24.52,-26.91,-38.30,-66.25,-86.42,-82.63,-81.08,-49.81,-1.04,-50.57,-48.79,-8.15,-39.65,-50.39,-51.08,-55.14,-47.35,-44.30,-49.48,-51.00,-67.39,-67.60,-21.89,-30.35,-41.01,-56.76,-26.45,-26.83,-8.39,-16.92,-16.69,-25.99,-34.07,-28.28,-22.49,-13.95,-34.95,-3.85,-0.60,-17.20,1.20,15.20,-5.80,3.30,4.80,-6.10,-6.50,-6.40,0.20,-3.70,0.20,-3.70,13.27,15.91,13.55,11.18,28.73,18.82,21.55,15.91,0.45,19.18,18.45,18.82,14.91,5.73,5.09,-1.36,15.27,7.45,4.27,8.45,10.64,10.59,17.69,17.36,26.27,-28.19,
33.17,13.17,21.36,25.81,33.17,9.08,25.90,19.99,19.45,22.81,21.69,20.99,16.59,14.34,8.95,12.55,11.07,-17.82,-6.76,-1.63,3.09,20.52,5.25,16.91,13.79,29.58,28.37,29.92,-6.22,32.44,40.07,37.19,32.55,7.25,33.16,33.16,40.51,38.56,5.78,16.98,12.45,21.05,10.38,25.81,35.28,33.69,47.09,26.54,31.18,29.12,27.64,17.35,42.79,23.33,38.46,22.72,21.86,10.47,-5.29,-31.55,-27.76,-26.22,-5.74,-11.83,-30.88,-1.67,-6.75,4.42,5.87,3.66,-0.40,1.30,4.34,-3.89,0.69,-15.71,-8.96,-27.49,-29.95,-41.61,-49.36,-40.29,-38.23,-25.28,-23.15,-25.36,-33.74,-33.28,-32.67,-29.62,-25.97,-28.73,-33.09,-39.45,-39.82,-35.27,-14.18,-7.45,-26.09,-24.55,-13.82,-2.55,-9.18,-17.09,-19.27,-29.91,-15.36,-14.73,-10.55,-55.73,-78.55,-99.36,-87.64,-50.73,-50.19,-77.83,-43.83,-20.64,-12.19,-7.83,-5.92,7.90,18.99,14.45,-2.19,-2.31,-6.01,-7.01,-10.41,-32.47,-47.01,-58.35,-89.59,-92.63,-52.20,-73.87,-68.61,-65.64,-62.95,-54.11,-36.70,14.03,-42.41,-40.88,-44.06,-53.54,-61.60,-49.09,-52.21,-75.94,-69.96,-83.98,-64.87,-90.85,-83.54,-90.07,-79.64,-70.42,-61.85,-70.61,-42.12,-68.59)

breaks<-c(-150,-140,-130,-120,-110,-100,-90,-80,-70,-60,-50,-40,-30,-20,-10,0,10,20,30,40,50,60,70,80,90,100,110,120,130,140,150)
hist(d.resid,breaks,freq=FALSE,main="Field 7: Ground Water Table Depth
Residuals (Avg. - Measured Depth)",xlab="Residuals (cm)", col="grey")
plot(function(d.resid)
dnorm(d.resid,m=-4.007e-15,s=35.571),-150,150,n=400,add=TRUE,col="black",lwd=3)

http://n4.nabble.com/file/n998495/Image2.png 


I would like to do the same thing with the following lines of code, only
this time fit a log-normal distribution.  


app_depths<-c(2.133069498, 1.633840467, 4.946905858, 3.74316825, 2.29580986,
3.156489013, 3.616272192, 20.65750905, 2.878995473, 12.59126936, 14.2215439,
7.892111284, 4.656569671, 3.350734491, 5.457109794, 3.580297328,
12.66226362, 4.491926672, 7.1, 13.5, 11.1, 14.1, 7.3, 8.6, 3.5, 5, 7.1, 7,
15.1, 9.1, 6.5, 13.4, 15.2, 10.5, 11.8, 8.8, 8, 9, 20.4, 15.6, 14.6, 7.1,
3.6, 4.8, 6.4, 10.1, 9.5, 24.5, 16.1, 6.5, 3.3, 3.7, 3.3, 5.3, 4, 2.4, 5.2,
3.7, 3.5, 7, 5.6, 5.9, 8.2, 5.2, 12.7, 6.3, 6.6, 1.9, 2.2, 4.1, 4.7, 
7.9, 14.4, 16.2, 8.8, 14.1, 12.7, 15.1, 20.2, 4.9, 7.9, 5, 7.9, 4.6, 5.7,
32.3, 7.7, 16, 13.9, 16.8, 7.7, 13.6, 5.1, 11.4, 4.59, 7.35, 4.14, 3.82,
1.95, 3.06, 7.4, 8.8, 11, 2, 6.6, 7, 5.7, 5.8, 6.2, 2.9, 7.8, 5.7, 12.2,
5.8, 5.8, 26.7, 22, 13.5, 7.9, 5.64, 2.03, 1.7, 2.78, 5.78, 2.76, 24.6, 15,
12.32, 23.79, 14.3, 11.7, 9.1, 9.4, 8.24, 7.1, 5.9, 4.6, 4.3, 2.96, 7.9,
11.13, 5.3, 5.8, 1.7, 7.12, 8.3, 7.83, 6.62, 42.67, 34.76, 41.68, 6.07,
4.43, 1.2, 1.8, 1.1, 1.1, 1.3, 1.46, 1.46, 10.31, 2.96, 
22.17, 2.75, 3.6, 9.8, 9.68, 5.8, 7.23, 10.06, 4.117439286, 11.63456013,
9.92, 7.40, 9.78, 4.37, 7.12, 7.57, 27.03, 3.43, 6.71, 7.74, 10.01, 7.78,
14.95, 9.98, 8.07, 10.73, 8.82, 15.08, 12.72, 4.27, 10.07, 6.45, 16.67,
16.28, 7.21, 21.46, 21.52, 14.93, 9.03, 26.70, 14.90, 6.12, 17.88, 13.72)

breaks<-c(0,2,4,6,8,10,12,14,16,18,20,22,24,26,28,30,32,34,36,38,40,42,44)
hist(app_depths,breaks,freq=FALSE,main="LARV Irrigation Application Depths
(in) 2004-2007",xlab="Application Depth (in)",col="grey")
plot(function(app_depths)
dlnorm(app_depths,meanlog=mean(app_depths),sdlog=sd(app_depths)),0,44,add=TRUE,col="black",lwd=3)


Don't you need meanlog=mean(log()), etc?
I usually use curve():

hist(app_depths,breaks,freq=FALSE)
curve(dlnorm(x, meanlog=mean(log(app_depths)),
   sdlog=sd(log(app_depths))), add=TRUE, col=1, lwd=3)
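
A quick check with simulated data (made-up numbers, just to see the curve
track the histogram):

  set.seed(1)
  z <- rlnorm(200, meanlog = 2, sdlog = 0.5)
  hist(z, freq = FALSE, breaks = 20)
  curve(dlnorm(x, meanlog = mean(log(z)), sdlog = sd(log(z))),
        add = TRUE, col = 1, lwd = 3)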

 -Peter Ehlers


Problem is, I get a plot that doesn't look right,

http://n4.nabble.com/file/n998495/Image3.png 
 
If there is an obvious error as to why the pdf line doesn't fit the data,
I'm unable to find it.  Any help would be greatly appreciated.

Respectfully,
Eric


--
Peter Ehlers
University of Calgary

Re: [Rd] Confusing error message for [[.factor (PR#14209)

2010-02-09 Thread Peter Ehlers

g.russ...@eos-solutions.com wrote:

Full_Name: George Russell
Version: 2.10.0 and 2.11.0 Under development (unstable) (2010-02-08 r51108)
OS: Windows
Submission from: (NULL) (217.111.3.131)



c("a","b")[[c(TRUE,FALSE)]]
Error in `[[.default`(factor(c("a", "b")), c(TRUE, FALSE)) : 
  recursive indexing failed at level 1


I find this error message confusing, though after reading the HELP carefully I
think I know what is going on. Would not something like "[[ does not work with
logical index vectors" be more appropriate?


It didn't take particularly careful reading to find this:

 "The most important distinction between [, [[ and $ is that
  the [ can select more than one element whereas the other two
  select a single element."

Try this:

 c("a","b")[[c(1,2)]]
 c("a","b")[[TRUE]]

 -Peter Ehlers



sessionInfo is (for 2.11) :
R version 2.11.0 Under development (unstable) (2010-02-08 r51108) 
i386-pc-mingw32 


locale:
[1] LC_COLLATE=German_Germany.1252  LC_CTYPE=German_Germany.1252   
LC_MONETARY=German_Germany.1252
[4] LC_NUMERIC=CLC_TIME=German_Germany.1252


attached base packages:
[1] stats graphics  grDevices datasets  utils methods   base

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel




--
Peter Ehlers
University of Calgary

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Unexpected behaviour of x[i] when i is a matrix, on Windows

2010-02-12 Thread Peter Ehlers

You're comparing 2.10.0 on Windows with 2.11.0 on Linux.
Have you tried 2.11.0 on Windows? => same result as on Linux.

 -Peter Ehlers

Wolfgang Huber wrote:

Hi,

when running the following on different instances of R (Linux and 
Windows), I get different results. The one for Linux seems to be the 
intended / documented one. When using numeric indices rather than 
characters, Windows seemed to behave as expected.


---On Windows--

x = matrix(FALSE, nrow=3, ncol=3)
colnames(x) = LETTERS[1:3]
rownames(x) = letters[1:3]
x

#   A B C
# a FALSE FALSE FALSE
# b FALSE FALSE FALSE
# c FALSE FALSE FALSE

x [ cbind("b", "B") ] = TRUE
x
  b B
# FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE

sessionInfo()

R version 2.10.0 (2009-10-26)
i386-pc-mingw32

locale:
[1] LC_COLLATE=English_United States.1252
[2] LC_CTYPE=English_United States.1252
[3] LC_MONETARY=English_United States.1252
[4] LC_NUMERIC=C
[5] LC_TIME=English_United States.1252

attached base packages:
[1] stats graphics  grDevices utils datasets  methods   base


---On Linux--
x = matrix(FALSE, nrow=3, ncol=3)
colnames(x) = LETTERS[1:3]
rownames(x) = letters[1:3]
x
#   A B C
# a FALSE FALSE FALSE
# b FALSE FALSE FALSE
# c FALSE FALSE FALSE
x [ cbind("b", "B") ] = TRUE
x
#   A B C
# a FALSE FALSE FALSE
# b FALSE  TRUE FALSE
# c FALSE FALSE FALSE

 > sessionInfo()
R version 2.11.0 Under development (unstable) (2010-02-12 r51125)
x86_64-unknown-linux-gnu

locale:
 [1] LC_CTYPE=en_US.UTF-8   LC_NUMERIC=C
 [3] LC_TIME=en_US.UTF-8LC_COLLATE=en_US.UTF-8
 [5] LC_MONETARY=C  LC_MESSAGES=en_US.UTF-8
 [7] LC_PAPER=en_US.UTF-8   LC_NAME=C
 [9] LC_ADDRESS=C   LC_TELEPHONE=C
[11] LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C

attached base packages:
[1] stats graphics  grDevices datasets  utils methods   base

other attached packages:
[1] fortunes_1.3-7






--
Peter Ehlers
University of Calgary

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] aggregate: with 2 by variables in the result the 2nd by-variable is wrong (PR#14213)

2010-02-12 Thread Peter Ehlers

franz.quehenber...@medunigraz.at wrote:

Full_Name: Franz Quehenberger
Version: 2.10.1
OS: Windows XP
Submission from: (NULL) (145.244.10.3)


aggregate is supposed to produce a data.frame that contains a line for each
combination of levels of the variables in the by list. The first columns of the
result contain these combinations of levels. With two by-variables the second
by-variable always takes only one value. However, it works fine with one or
three by-variables.

The problems seems to be caused by this line of code in aggregate():

w <- as.data.frame(w, stringsAsFactors = FALSE)[which(!unlist(lapply(z,
is.null))), , drop = FALSE]

or more specifically by: 


[which(!unlist(lapply(z, is.null))), , drop = FALSE]

Kind regards
FQ



# demonstration of the aggregate bug in R 2.10.1
factor.a=rep(letters[1:3],4)
factor.b=rep(letters[4:5],each=3,times=2)
factor.c=rep(letters[4:5+2],each=6)
data=data.frame(factor.a,factor.b,factor.c,x)
x=1:12
#one by-variable works:
aggregate(x,list(a=factor.a),FUN=mean)
#three by-variables work fine:
aggregate(x,list(a=factor.a,b=factor.b,c=factor.b),FUN=mean)
#two by-variables do not produce the levels of the second by-variable correctly:
aggregate(x,list(a=factor.a,b=factor.b),FUN=mean)
# data
print(data)

Result of the R code:



# demonstration of the aggregate bug in R 2.10.1
factor.a=rep(letters[1:3],4)
factor.b=rep(letters[4:5],each=3,times=2)
factor.c=rep(letters[4:5+2],each=6)
data=data.frame(factor.a,factor.b,factor.c,x)
x=1:12
#one by-variable works:
aggregate(x,list(a=factor.a),FUN=mean)

  a   x
1 a 5.5
2 b 6.5
3 c 7.5

#three by-variables work fine:
aggregate(x,list(a=factor.a,b=factor.b,c=factor.b),FUN=mean)

  a b c x
1 a d d 4
2 b d d 5
3 c d d 6
4 a e e 7
5 b e e 8
6 c e e 9

#two by-variables do not produce the levels of the second by-variable correctly:

aggregate(x,list(a=factor.a,b=factor.b),FUN=mean)

  a b x
1 a d 4
2 b d 5
3 c d 6
4 a d 7
5 b d 8
6 c d 9
Warning message:
In data.frame(w, lapply(y, unlist, use.names = FALSE), stringsAsFactors = FALSE) :
  row names were found from a short variable and have been discarded

# data
print(data)

   factor.a factor.b factor.c  x
1         a        d        f  1
2         b        d        f  2
3         c        d        f  3
4         a        e        f  4
5         b        e        f  5
6         c        e        f  6
7         a        d        g  7
8         b        d        g  8
9         c        d        g  9
10        a        e        g 10
11        b        e        g 11
12        c        e        g 12



I don't see this in 2.10.1 nor in 2.11.0 (Windows Vista).
I can't think of how you might have got your result.
Is there something you haven't mentioned?
What's your sessionInfo()?

--
Peter Ehlers
University of Calgary

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Minor typo in ?.Primitive

2010-04-01 Thread Peter Ehlers

The help page for .Primitive has this line:

## start quote
This function is almost never used: get(name, envir=basenv()) works 
equally well and 

## end quote

basenv() should be baseenv().

Checked for r51392 and r51520.


 -Peter Ehlers

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] packages with DLLs under 2.12.0

2010-04-02 Thread Peter Ehlers

I realize that R-core must be busy with the imminent release of
2.11.0, so please consider this not urgent.

The NEWS file for 2.12.0 (Windows-specific) says, in part:

  For now, 32-bit packages with compiled code built under
  2.{10,11}.x can be used, but this will be disabled before
  release.

For me, this doesn't work without a tweak. For example,

> library(mvtnorm)
#Error: package 'mvtnorm' is not installed for 'arch=i386'

I tested a few (6 or 7) other randomly selected packages
(bitops, igraph, ...) before wising up and looking at the
recommended and automatically installed packages like
lattice, etc. I see that the 'libs' folder now has an
'i386' subfolder where the DLLs reside.

Making that change by hand fixes the 'problem'.
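
The hand fix amounted to something like this (a sketch only; 'mvtnorm'
stands for whichever package is affected, and .libPaths()[1] for wherever
it happens to be installed):

  libdir <- file.path(.libPaths()[1], "mvtnorm", "libs")
  dir.create(file.path(libdir, "i386"))           # the new arch subfolder
  file.rename(file.path(libdir, "mvtnorm.dll"),   # move the DLL into it
              file.path(libdir, "i386", "mvtnorm.dll"))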

> sessionInfo()
R version 2.12.0 Under development (unstable) (2010-03-31 r51520)
i386-pc-mingw32

locale:
[1] LC_COLLATE=English_Canada.1252  LC_CTYPE=English_Canada.1252
[3] LC_MONETARY=English_Canada.1252 LC_NUMERIC=C
[5] LC_TIME=English_Canada.1252

attached base packages:
[1] stats graphics  grDevices utils datasets  methods   base

other attached packages:
[1] mvtnorm_0.9-9

loaded via a namespace (and not attached):
[1] tools_2.12.0

--
Peter Ehlers
University of Calgary

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Deferred Default Marker

2010-04-23 Thread Peter Ehlers

Terry,

I don't see the problem in R 2.11.0 or R 2.10.1 Patched
(session info for R 2.10.1 below) with Windows (Vista).
I do get warnings about kinship having been built under
R 2.11.0 when I use R 2.10.1.

But I notice that your version of kinship looks somewhat
dated. Is that intentional?

  -Peter Ehlers

> sessionInfo()
R version 2.10.1 Patched (2010-01-05 r50896)
i386-pc-mingw32

locale:
[1] LC_COLLATE=English_Canada.1252  LC_CTYPE=English_Canada.1252
[3] LC_MONETARY=English_Canada.1252 LC_NUMERIC=C
[5] LC_TIME=English_Canada.1252

attached base packages:
[1] splines   stats graphics  grDevices utils datasets  methods
[8] base

other attached packages:
[1] kinship_1.1.0-23 lattice_0.18-3   nlme_3.1-96  survival_2.35-8

loaded via a namespace (and not attached):
[1] grid_2.10.1


On 2010-04-23 9:20, Terry Therneau wrote:

I've finally narrowed down a puzzling problem: here is the short test
case.

tmt34% R --vanilla

R version 2.10.0 (2009-10-26)
Copyright (C) 2009 The R Foundation for Statistical Computing
ISBN 3-900051-07-0


temp<- matrix(runif(50), ncol=2)
t(temp) %*% temp

  [,1] [,2]
[1,] 7.916016 6.049698
[2,] 6.049698 7.650694


library(kinship)

Loading required package: survival
Loading required package: splines
Loading required package: nlme
Loading required package: lattice

t(temp) %*% temp

`__Deferred_Default_Marker__`

-

Within the library is a definition of %*% for bdsmatrix objects, which
is perhaps the issue.  But I'm only guessing since I don't have a clear
idea what the error message means.  Any hints are appreciated.

The new coxme/bdsmatrix packages need only a couple more functions to be
a complete replacement for kinship, at which point we will deprecate
it. (Pedigrees and plotting were finished last weekend!)  But I can't
quite do that yet. The error does not arise with these newer libraries.
There have been no changes to kinship for some time.

   Terry Therneau



sessionInfo()

R version 2.10.0 (2009-10-26)
x86_64-unknown-linux-gnu

locale:
  [1] LC_CTYPE=en_US.UTF-8   LC_NUMERIC=C
  [3] LC_TIME=en_US.UTF-8LC_COLLATE=C
  [5] LC_MONETARY=C  LC_MESSAGES=en_US.UTF-8
  [7] LC_PAPER=en_US.UTF-8   LC_NAME=C
  [9] LC_ADDRESS=C   LC_TELEPHONE=C
[11] LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C

attached base packages:
[1] splines   stats graphics  grDevices utils datasets
methods
[8] base
other attached packages:
[1] kinship_1.1.0-11 lattice_0.17-26  nlme_3.1-96  survival_2.35-9

loaded via a namespace (and not attached):
[1] grid_2.10.0




__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel




--
Peter Ehlers
University of Calgary

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Location of source code for readline()

2010-05-30 Thread Peter Ehlers

On 2010-05-30 15:33, Prof. John C Nash wrote:

A few days ago on R-help I asked about a cross-platform timeout version of
readline(). Some suggestions, but only partial joy so far. I can get the
Gnu bash 'read -t ...' to work in Windows by using the 'bash -c ' construct,
but then R's system() function does not seem to allow this to pass through.
Similarly a Perl and Free Pascal routine that I tried, the latter being a
single executable that did the prompt and the timeout. (I can send code
offline if anyone interested -- not fully protected against bad inputs,
however.)

Now I'm wondering where the code for readline is located in the R source.
I've tracked as far as the 'do_readln' in names.c, but now want to find the
actual code to see if I can patch it, though I am a real novice in C.
Suggestions welcome.

My application, FYI, is to have a script that will display something, and
wait for a keypress (for readline it seems to need the Enter key) but
timeout after a preset number of seconds. The setTimeLimit "almost" works
-- but not for readline. I'm thinking of a modified readline like
readline(prompt='Do you want to continue?', timeout=6).

Note that the issue seems to be Windows. I haven't a Mac to try, but Linux
can be made to function by various methods at the top.  Sigh.

JN



I can't help with your project, but check scan.c for do_readln.

 -Peter Ehlers

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] html help fails for named vector objects (PR#9927)

2007-09-23 Thread P Ehlers
There seems also to be a difference between the way 'help()' and '?'
handle 'topic' in some cases.

Consider:
lm <- "aov"

The following all bring up help for 'lm':
?lm
?"lm"
help("lm")

This opens help for 'aov':
help(lm)

It seems that "?" doesn't care about quoting, but "help" does.
Did I miss something in the docs?
(If it matters, I'm using options(chmhelp=TRUE).)

 > sessionInfo()
R version 2.6.0 beta (2007-09-18 r42895)
i386-pc-mingw32

locale:
LC_COLLATE=English_Canada.1252; [snip]

attached base packages:
[1] stats graphics  grDevices utils datasets  methods   base

  - Peter Ehlers

Prof Brian Ripley wrote:
> On a normal R help system any version of help(letters) tries to give you 
> help on a, b, etc.  This is intentional (although the documentation is 
> unclearly worded), and has been the case since at least R 2.0.0.
> Some versions of help will display only the first topic.
> 
> If help(letters, htmlhelp=FALSE) does something else on your system, that 
> is a MacOS-specific bug since the topic selected should not depend on the 
> optional arguments.
> 
> What is unclear from the documentation is what should happen with
> 
> help(c("help", "help.search"))
> 
> It seems that for text help you get the first, for htmlhelp both.
> 
> There are two things we could do to help (apart from clarifying the 
> documentation):
> 
> 1) If 'topic' is neither a name nor a character vector (e.g. an expression 
> like the last example) give an explicit error.
> 
> 2) if topic is a character vector of length > 1, use the name.
> 
> 
> On Sun, 23 Sep 2007, [EMAIL PROTECTED] wrote:
> 
>>   help(letters, htmlhelp=TRUE) fails.
>>
>> Under the Mac OSX gui, the message is 'Help for the topic "a" was not
>> found.' Under the version documented below, and under Windows, the
>> message is
>>
>>   "No documentation for 'a' in specified packages and libraries:"
>> repeated for all the elements of letters, then followed by
>>   "you could try 'help.search("a")'",
>> again repeated for all elements of letters.
>>
>> The outcome seems similar for any character vector (including matrix)
>> object, e.g. the matrix 'primateDNA' in the DAAGbio package.
>>
>> The following have the expected result
>>   help("letters", htmlhelp=TRUE)
>>   help(letters, htmlhelp=FALSE)
> 
> But the documented and actual results are the same, and different in the 
> two cases.
> 
>> The same result is obtained with R-2.5.1.
>>
>>
>> --please do not edit the information below--
>>
>> Version:
>> platform = i386-apple-darwin8.10.1
>> arch = i386
>> os = darwin8.10.1
>> system = i386, darwin8.10.1
>> status = beta
>> major = 2
>> minor = 6.0
>> year = 2007
>> month = 09
>> day = 22
>> svn rev = 42941
>> language = R
>> version.string = R version 2.6.0 beta (2007-09-22 r42941)
>>
>> Locale:
>> C
>>
>> Search Path:
>> .GlobalEnv, package:testpkg, package:stats, package:graphics,
>> package:grDevices, package:utils, package:datasets, package:methods,
>> Autoloads, package:base
>>
>> John Maindonald email: [EMAIL PROTECTED]
>> phone : +61 2 (6125)3473fax  : +61 2(6125)5549
>> Centre for Mathematics & Its Applications, Room 1194,
>> John Dedman Mathematical Sciences Building (Building 27)
>> Australian National University, Canberra ACT 0200.
>>
>> __
>> R-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-devel
>>
>

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Choose.files to give shortnames

2006-03-22 Thread P Ehlers
Would

basename(choose.files())

work for your usage?
Still, a full.names argument might be useful.
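
Something along these lines, say (an untested sketch of the idea):

  choose.files2 <- function(..., full.names = TRUE) {
      fn <- choose.files(...)
      if (full.names) fn else basename(fn)
  }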

Peter Ehlers

Brooks, Anthony B wrote:

> Hi
> I am writing a script to generate a QC report for some data based on a number 
> of files. I am currently using the choose.files function to select the files 
> as not all the files need to be QC'd. One minor issue is that choose.files 
> returns the complete path of the filename (eg "C:/QCdata/Test1/File01", 
> "C:/QCdata/Test1/File02"...). Is it possible to use choose.files to return 
> just the file name eg("File01", "File02"...), in a way similar to the 
> full.names=FALSE argument in the dir function.
> 
> Thanks
> Tony
> 
> 
> 
> 
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] wishlist: 'formula' method for stripchart()

2006-04-05 Thread Peter Ehlers
Folks,

I would find it useful to have a formula method for
stripchart() with 'data' and 'subset' arguments, similar
to boxplot.formula() whose code can probably be adapted
fairly easily.

Comments?
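
For concreteness, a rough and untested sketch along the lines of
boxplot.formula() -- stripchart() itself would also need to become
generic, with the present code as stripchart.default():

  stripchart.formula <- function(formula, data = NULL, ...,
                                 subset, na.action = NULL)
  {
      if (missing(formula) || (length(formula) != 3))
          stop("'formula' should be of the form 'response ~ group'")
      m <- match.call(expand.dots = FALSE)
      if (is.matrix(eval(m$data, parent.frame())))
          m$data <- as.data.frame(data)
      m$... <- NULL
      m$na.action <- na.action
      m[[1]] <- as.name("model.frame")
      mf <- eval(m, parent.frame())
      response <- attr(attr(mf, "terms"), "response")
      ## split the response by the grouping variable(s) and strip-chart it
      stripchart(split(mf[[response]], mf[-response]), ...)
  }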

Peter Ehlers

(Win XP)

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] wishlist: 'formula' method for stripchart()

2006-04-05 Thread Peter Ehlers
Sorry, I should have been clearer. This is definitely not
a high-priority item and I hadn't intended it to be
included for 2.3.0.

Sure, I'll provide the code and help page adjustment.

Peter Ehlers


Prof Brian Ripley wrote:
> On Wed, 5 Apr 2006, Peter Ehlers wrote:
> 
>> Folks,
>>
>> I would find it useful to have a formula method for
>> stripchart() with 'data' and 'subset' arguments, similar
>> to boxplot.formula() whose code can probably be adapted
>> fairly easily.
>>
>> Comments?
> 
> 
> Are you offering to provide one?
> 
> (Actually now is not a good time unless you have one in your back pocket 
> as we are 5 days off feature freeze for 2.3.0.)
>

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] R-2.3.0alpha-win32: Rgui and MDI/SDI

2006-04-07 Thread Peter Ehlers
I find that Rgui defaults to SDI if I specify
   MDI = yes
   toolbar = no
in Rconsole. Hope I didn't miss something in NEWS/CHANGES.

R.version.string
[1] "Version 2.3.0 alpha (2006-04-05 r37653)"
Win XP

Peter Ehlers

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] (PR#8777) strsplit does [not] return correct value when splitting ""

2006-04-17 Thread Peter Ehlers
Charles,

Can't you achieve your goal by unlist()ing 'substrings'?

  max(nchar(unlist(substrings)))
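
With the example vector from your message that comes to 6; the "" element
simply drops out of the unlist:

  x <- c("hello", "bob is\ngreat", "foo", "", "bar")
  substrings <- strsplit(x, "\n")
  max(nchar(unlist(substrings), type = "width"))
  ## [1] 6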

Peter Ehlers


Charles Dupont wrote:
> Now using R 2.3.0.
> 
> I have a string that can be "".  I want to find the max screen width of
> all the lines in the string. So I run the command
> 
>   > x <- c("hello", "bob is\ngreat", "foo", "", "bar")
>   > substrings <- strsplit(x, "\n")
>   > sapply(substrings, FUN=function(x) max(nchar(x, type="width")))
> which returns
> [1]    5    6    3 -Inf    3
> 
> This happens because of the behavior of strsplit for a string that is not ""
>   > strsplit("Hello\nBob", "\n")
> 
> it returns
> [[1]]
> [1] "Hello" "Bob"
> 
> 
> for a string that is ""
>   > strsplit("", "\n")
> 
> it returns
> [[1]]
> character(0)
> 
> 
> I would expect
> [[1]]
> [1] ""
> 
> because "" is character vector of length 1 containing a string of length 
> 0, not a character vector of length 0.
> 
> For any other string if the split string is not matched in argument x 
> then it returns the original string x.
> 
> The man page states in the value section that strsplit returns:
>   A list of length 'length(x)' the 'i'-th element of which contains
>   the vector of splits of 'x[i]'.
> 
> It mentions no change in behavior if the value of x[i] = "".
> 
> Prof Brian Ripley wrote:
> 
>>Please use a current version of R: we are at 2.3.0RC (and we do ask you 
>>not to report on obsolete versions).
>>
>>What rule are you using, and where did you find it in the R documentation?
>>
>>In fact
>>
>>
>>>strsplit("", " ")
>>
>>[[1]]
>>character(0)
>>
>>which is not as you stated.   This is a feature, as it is distinct from
>>
>>
>>>strsplit(" ", " ")
>>
>>[[1]]
>>[1] ""
>>
>>Consider also
>>
>>
>>>strsplit("", "")
>>
>>[[1]]
>>character(0)
>>
>>
>>>strsplit("a", "")
>>
>>[[1]]
>>[1] "a"
>>
>>
>>>strsplit("ab", "")
>>
>>[[1]]
>>[1] "a" "b"
>>
>>
>>On Mon, 17 Apr 2006, [EMAIL PROTECTED] wrote:
>>
>>
>>>Full_Name: Charles Dupont
>>>Version: 2.2.0
>>>OS: linux
>>>Submission from: (NULL) (160.129.129.136)
>>>
>>>
>>>when
>>>
>>>strsplit("", " ")
>>>
>>>returns character(0)
>>>
>>>where as
>>>
>>>strsplit("a", " ")
>>>
>>>returns "a".
>>>
>>>these return values are not consistent with each other.
>>>
>>>Charles Dupont
>>>
>>>__
>>>R-devel@r-project.org mailing list
>>>https://stat.ethz.ch/mailman/listinfo/r-devel
>>>
>>>
>>
> 
>

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] bug in rbind.data.frame with factors (PR#8868)

2006-05-16 Thread Peter Ehlers
How is this a bug? From the help page for cbind/rbind:

Description
Take a sequence of vector, matrix or data frames arguments and
combine by _columns_ or _rows_, respectively.
(emphasis added)

Note that it does _not_ say "combine by variable names".
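
If the columns are merely out of order, one workaround (untested here) is
to reorder by name before binding:

  rbind(d1, d2[names(d1)])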

Peter Ehlers

[EMAIL PROTECTED] wrote:

> Full_Name: Rafal Kustra
> Version: 2.1.1
> OS: Linux, MacOS 10.3
> Submission from: (NULL) (69.195.47.62)
> 
> 
> When Rbinding two data frames with factors, strange results occur (but no
> error) when the order of the data frame variables is different in the two
> data frames:
> 
> 
>>d1=as.data.frame(list(x=1:10,y=letters[1:10]))
>>d2=as.data.frame(list(y=LETTERS[1:5],x=7:11))
>>d2
> 
>   y  x
> 1 A  7
> 2 B  8
> 3 C  9
> 4 D 10
> 5 E 11
> 
>>rbind(d1,d2)
> 
>     x    y
> 1   1    a
> 2   2    b
> 3   3    c
> 4   4    d
> 5   5    e
> 6   6    f
> 7   7    g
> 8   8    h
> 9   9    i
> 10 10    j
> 11  7 <NA>
> 21  8 <NA>
> 31  9 <NA>
> 41 10 <NA>
> 51 11 <NA>
> Warning message:
> invalid factor level, NAs generated in: "[<-.factor"(`*tmp*`, ri, value = 
> c("A",
> "B", "C", "D", "E")) 
> 
> 
> Things work correctly when the order of variables is the same:
> 
> 
>>d3=as.data.frame(list(x=7:11,y=LETTERS[1:5]))
>>rbind(d1,d3)
> 
>     x y
> 1   1 a
> 2   2 b
> 3   3 c
> 4   4 d
> 5   5 e
> 6   6 f
> 7   7 g
> 8   8 h
> 9   9 i
> 10 10 j
> 11  7 A
> 21  8 B
> 31  9 C
> 41 10 D
> 51 11 E
> 
> 
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Numerical error in R (win32) (PR#8909)

2006-05-29 Thread Peter Ehlers
Did you check the Details section of the help page for round()?
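
The short of it (an illustration, not a quote from the help page): 3.15 has
no exact binary representation, whereas 3.75 does.

  sprintf("%.20f", 3.15)   # "3.14999999999999991118" -- just below 3.15
  sprintf("%.20f", 3.75)   # "3.75000000000000000000" -- exact, so it rounds up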

Peter Ehlers

[EMAIL PROTECTED] wrote:

> Hi
> I had observed the following problem in R (also C, Matlab, and Python).
> sprintf('%1.2g\n', 3.15)
> give 3.1 instead of 3.2 whereas an input of 3.75 gives 3.8.
> Java's System.out.printf is ok though.  
>  
> 
>>round(3.75,1)
> 
> [1] 3.8
> 
>>round(3.15,1)
> 
> [1] 3.1
>  
> Similar outcome with sprintf in R.
> 
> 
> However, the right answer should be 3.2
>  
> Regards
> Teckpor
>  
> 
> 
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] plot.new, trellis and windows plot history (PR#8935)

2006-06-03 Thread P Ehlers
Andrew,

Why are you calling plot.new()?

Peter Ehlers


[EMAIL PROTECTED] wrote:
> Full_Name: Andrew Hooker
> Version: 2.3.1
> OS: windows xp sp2
> Submission from: (NULL) (83.253.8.162)
> 
> 
> Hi,
> 
> I think there is a bug in the windows graph history procedure, here is what
> happens:
> 
> 1. open R
> 2. 'library(lattice)'
> 3. 'xyplot(0~0)'
> 4. Turn on recording in the graphical window
> 5. Add plot to history
> 6. 'plot.new()'
> 7. 'xyplot(1~1)'
> 
> Try to 'pageup' to the first plot.  I can't get the first plot to appear, and 
> if
> I turn off the graphics device using 'dev.off()' I get a warning message:
> 
> 1: Display list redraw incomplete 
> 
> Any ideas how to fix this?
> 
> Andrew Hooker
> Uppsala University
> Sweden
> 
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel