Re: [Rd] [R] How to convert "c:\a\b" to "c:/a/b"

2005-07-04 Thread Spencer Graves
ANOTHER EXAMPLE FOR "gsub"?

  I would like to suggest that some version of Prof. Ripley's answer to 
my recent post to r-help (subject as above) be added to the "Examples" 
for "gsub":

## CAUTION:  XEmacs may hang on "readLines"
## unless submitted by itself
(x <- readLines(stdin(), n=1))
D:\spencerg\dataPOWER\stats\Tukey\Boxplot_missing_Tukey2.txt
(x <- gsub("\\\\", "/", x))
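As a self-contained sketch of the escaping involved (the hard-coded path below just mirrors the interactive input above; fixed = TRUE is an alternative that avoids regex escaping):

```r
## In an R source file the path must be typed with doubled backslashes;
## the string itself then contains single backslashes.
x <- "D:\\spencerg\\dataPOWER\\stats\\Tukey\\Boxplot_missing_Tukey2.txt"
gsub("\\\\", "/", x)              # pattern "\\\\" is a regex for one backslash
gsub("\\", "/", x, fixed = TRUE)  # same result, no regex escaping needed
# both give "D:/spencerg/dataPOWER/stats/Tukey/Boxplot_missing_Tukey2.txt"
```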

ANOTHER OPTION FOR "readLines"?

  I don't know whether this issue is sufficiently important to justify 
adding an argument to have "readLines" change "\" to "/", but if so, 
it does not seem difficult:

readLines. <-
function (con = stdin(), n = -1, ok = TRUE,
          reverseBackSlash = FALSE)
{
    if (is.character(con)) {
        con <- file(con, "r")
        on.exit(close(con))
    }
    if (reverseBackSlash) {
        dat <- .Internal(readLines(con, n, ok))
        return(gsub("\\\\", "/", dat))
    }
    .Internal(readLines(con, n, ok))
}

(x <- readLines.(stdin(), n=1, reverseBackSlash=TRUE))
D:\spencerg\dataPOWER\stats\Tukey\Boxplot_missing_Tukey2.txt

  This has been tested (in this one example) under XEmacs / ESS and 
Rgui for R 2.1.1 patched.

  Thanks for your great support of the R project, which makes it much 
easier for people to learn and use improved statistical methods and to 
advance the science even further.

  Best Wishes,  
  spencer graves

Spencer Graves wrote:

> Thank You, Prof. Ripley!
> 
>   Both "test1.R" and "test2.R" worked for me just now, as did the 
> following minor modification:
> 
> (x <- readLines(stdin(), n=1))
> D:\spencerg\dataPOWER\stats\Tukey\Boxplot_missing_Tukey2.txt
> 
>   Thanks again.
> 
>   spencer graves
> 
> Prof Brian Ripley wrote:
> 
>> On Wed, 29 Jun 2005, David Duffy wrote:
>>
>>
>>> I couldn't resist adding a more literal answer
>>
>>
>>
>> This can only work for escapes which are preserved.  The parser maps
>> \n to a character (LF) and the deparser maps it back to \n.
>> This happens to be true of \a \b \f \n \r \t \v \\ but no others.
>>
>> For example, \s is mapped to s, and there is no difference between \s and
>> s in the parsed input.
>>
>>
>>> unback <- function(x) {
>>> chars <- unlist(strsplit(deparse(x),""))
>>> chars <- chars[-c(1,length(chars))]
>>> paste(gsub("\\\\","/",chars),collapse="")
>>> }
>>>
>>> unback("\n")
>> [1] "/n"
>>
>>
>>> unback("\s")
>>
>>
>> [1] "s"
>>
>> Spencer Graves keeps on insisting there is a better way, but all the
>> solutions are to avoid sending the string to the parser, and hence
>> avoiding having the string directly in an R script.  This is common in 
>> shell scripts, which use 'here' documents to avoid 'quoting hell'.
>>
>> We can do that in R too. Here are two variants I have not seen in the 
>> thread
>>
>> test1.R:
>> scan("", "", allowEscapes=FALSE, n=1, quiet=TRUE)
>> D:\spencerg\dataPOWER\stats\Tukey\Boxplot_missing_Tukey2.txt
>> cat(x, "\n", sep="")
>>
>> R --slave --vanilla < test1.R
>> D:\spencerg\dataPOWER\stats\Tukey\Boxplot_missing_Tukey2.txt
>>
>> (This one does not allow quoted strings.)
>>
>> test2.R:
>> x <- readLines(stdin(), n=1)
>> "D:\spencerg\dataPOWER\stats\Tukey\Boxplot_missing_Tukey2.txt"
>> x <- gsub('^"(.*)"$', "\\1", x)
>> cat(x, "\n")
>>
>> R --slave --vanilla < test2.R
>> D:\spencerg\dataPOWER\stats\Tukey\Boxplot_missing_Tukey2.txt
>>
>> (This one allows surrounding quotes or not.)
>>
> 

-- 
Spencer Graves, PhD
Senior Development Engineer
PDF Solutions, Inc.
333 West San Carlos Street Suite 700
San Jose, CA 95110, USA

[EMAIL PROTECTED]
www.pdf.com <http://www.pdf.com>
Tel:  408-938-4420
Fax: 408-280-7915

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] evaluating variance functions in nlme

2005-07-27 Thread Spencer Graves
  Yes, this probably should go to R-help, and before you do that, I 
suggest you PLEASE do read the posting guide! 
http://www.R-project.org/posting-guide.html.  It can increase the 
likelihood you will get a useful reply relatively quickly.

  I tried your two calls to "gnls", which of course produced errors 
for me as I don't have your data.  The help page for "gnls" included an 
example that looked quite similar to me:

 >  fm1 <- gnls(weight ~ SSlogis(Time, Asym, xmid, scal), Soybean,
 +              weights = varPower())
 >  summary(fm1)

  Those results plus "str(fm1)" looked like they might help answer your 
question.  However, I don't understand enough of what you are asking to 
say.

  If an answer might still be worth pursuing to you, I suggest you read 
the posting guide and submit a question following that model to r-help.

  spencer graves

Nicholas Lewin-Koh wrote:

> Hi,
> I guess this is a final plea, and maybe this should go to R-help but
> here goes.
> I am writing a set of functions for calibration and prediction, and to
> calculate standard 
> errors and intervals I need the variance function to be evaluated at new
> prediction points.
> So for instance
> 
> fit<-gnls(Y~SSlogis(foo,Asym,xmid,scal),weights=varPower())
> fit2<-gnls(Y~SSlogis(foo,Asym,xmid,scal),weights=varPower(form=~foo))
> 
> Now using fit or fit2 I would like to get the variance function
> evaluated at new points. 
> I have played with getCovariateFormula, and looked at Initialize.gnls,
> summary etc.
> but it is not clear to me how to evaluate the form component, especially
> in the case of fit
> above where form=~fitted(.), in any safe way.
> 
> I can grep for "fitted" in the formula eg.
> grep("fitted",deparse(getCovariateFormula(fit$modelStruct$varFunc)))
> and try to calculate predicted values from the model for the new points
> but how to substitute back in the new terms?
> 
> I don't need this problem solved on a platter, I just need to understand
> an approach,
> because my stabs are failing.
> 
> Thanks
> 
> Nicholas
> 
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel



[Rd] qbinom returns NaN

2005-11-23 Thread Spencer Graves
Hi, All:

  For most but not all cases, qbinom is the inverse of pbinom. 
Consider the following example, which generates an exception:

 > (pb01 <- pbinom(0:1, 1, .5, log=T, lower.tail=FALSE))
[1] -0.6931472   -Inf

  Since "lower.tail=FALSE", Pr{X>1} = 0 in this context, and log(0) = 
-Inf, consistent with the documentation.

  However, the inverse of this does NOT recover 0:1:

 > qbinom(pb01,1, .5, log=T, lower.tail=F)
[1]   0 NaN

  Shouldn't the NaN here be 1?  If yes, this is relatively easy to fix. 
  Consider for example the following:

qbinom. <-
function (p, size, prob, lower.tail = TRUE, log.p = FALSE){
   q. <- .Internal(qbinom(p, size, prob, lower.tail, log.p))
   q.[p==(-Inf)] <- 1
   q.
}
 > qbinom.(pb01,1, .5, log=T, lower.tail=F)
[1] 0 1
Warning message:
NaNs produced in: qbinom(p, size, prob, lower.tail, log.p)

  It's also easy to eliminate the Warning.  Consider for example the 
following:

qbinom. <-
function (p, size, prob, lower.tail = TRUE, log.p = FALSE){
    if (any(p.inf <- p == (-Inf)) && (!lower.tail) && log.p) {
        n <- max(length(p), length(size), length(prob))
        p <- rep(p, length = n)
        p.inf <- rep(p.inf, length = n)  # keep the index in step with the recycled p
        size <- rep(size, length = n)
        prob <- rep(prob, length = n)
        q. <- size
        q.[!p.inf] <- .Internal(qbinom(p[!p.inf], size[!p.inf],
                                       prob[!p.inf], lower.tail, log.p))
        return(q.)
    }
    .Internal(qbinom(p, size, prob, lower.tail, log.p))
}
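The round trip can also be checked with the exported qbinom directly; whether the second element comes back as NaN (as reported here) or as 1 depends on the R version, so the sketch below avoids asserting one or the other:

```r
## Round-trip check for the size = 1, prob = 0.5 example above.
pb01 <- pbinom(0:1, 1, 0.5, log.p = TRUE, lower.tail = FALSE)
pb01        # c(log(0.5), -Inf), i.e. -0.6931472 and -Inf
q01 <- qbinom(pb01, 1, 0.5, log.p = TRUE, lower.tail = FALSE)
q01[1]      # 0, recovered correctly
q01[2]      # NaN in the version reported here; arguably should be 1
```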

  I suspect that for the right person, it would likely be easy to fix 
this in the .Internal qbinom code.  However, that's beyond my current R 
skill level.

      Thanks for all your efforts to make R what it is today.
  Best Wishes,
  spencer graves



Re: [Rd] typo in `eurodist'

2005-12-08 Thread Spencer Graves
  I'm with Martin:  When I get the same number of hits for two 
spellings, I believe that both are acceptable.  When I get substantially 
different numbers of hits, I generally go with the one with the most 
hits -- unless the different spellings carry different meanings, of 
course.

  Example:  "gage" vs. "gauge" vs. "guage":  24e6 vs. 32e6 vs. 3e6.  The 
last is a typo.  The first has a special meaning, though "gauge" is 
sometimes used in that context.  However, when discussing repeatability 
and reproducibility, I prefer "gage", because it's more restrictive and 
therefore seems clearer to me.

  spencer graves

Martin Maechler wrote:

>>>>>>"Torsten" == Torsten Hothorn <[EMAIL PROTECTED]>
>>>>>>on Thu, 8 Dec 2005 08:51:57 +0100 (CET) writes:
> 
> 
> Torsten> On Wed, 7 Dec 2005, Prof Brian Ripley wrote:
> 
> >> I've often wondered about that.
> 
> Torsten> and the copy editor did too :-)
> 
> >> I've presumed that the names were
> >> deliberate, so have you checked the stated source?  It's not readily
> >> available to me (as one would expect in Oxford)?
> 
> Torsten> our library doesn't seem to have a copy of `The Cambridge
> Torsten> Encyclopaedia', so I can't check either. Google has 74.900 hits 
> for
> Torsten> `Gibralta' (more than one would expect for a typo, I think)
> Torsten> and 57.700.000 for `Gibraltar'.
> 
> Torsten> So maybe both spellings are in use.
> 
> Well,  do you expect web authors to have a much lower rate of
> typos than 1:770 ?
> My limited experience on "google voting for spelling correction"
> has rather lowered my expectation on webauthors' education in
> orthography...
> 
> Martin
> 
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel



Re: [Rd] nls profiling with algorithm="port" may violate bounds (PR#8508)

2006-01-20 Thread Spencer Graves
Hi, Ben, et al.:

  The issue Ben identified with confint(nls(... )) generates a hard 
failure for me.  Specifically the command "confint(m1)" in his script 
below under Rgui 2.2.1 first says, "Waiting for profiling to be done..." 
then forces a screen to pop up with heading "R for Windows GUI 
front-end" reading, "R for Windows GUI front-end has encountered a 
problem and needs to close.  We are sorry for the inconvenience.  If you 
were in the middle of something, the information you were working on 
might be lost... ."  When I try the same thing running R under XEmacs 
with ESS, I get essentially the same response, except "R for Windows 
GUI" is replaced by "R for Windows terminal".  In both cases, it kills 
R.  In both cases, sessionInfo() before this command is as follows:

 > sessionInfo()
R version 2.2.1, 2005-12-20, i386-pc-mingw32

attached base packages:
[1] "stats4"    "methods"   "stats"     "graphics"  "grDevices" "utils"
[7] "datasets"  "base"

  I'm running Windows XP professional version 5.1 on an IBM T30 
notebook computer.

  Thanks to all of the R Core team for all your hard work to make R 
what it is today, with these kinds of unpleasant surprises so rare.

  Best Wishes,
  spencer graves

[EMAIL PROTECTED] wrote:

>   [posted to R-devel, no discussion:
> resubmitting it as a bug, just so it gets
> logged appropriately]
> 
>Sorry to report further difficulties with
> nls and profiling and constraints ... the problem
> this time (which I didn't check for in my last
> round of testing) is that the nls profiler doesn't
> seem to respect constraints that have been
> set when using the port algorithm.
> See test code below ...
> If I can I will try to hack the code, but I will
> probably start by redefining my function with
> some workarounds to make the fit quadratically "bad" (but well-defined)
> when the parameters are negative ...
>  As always, please don't hesitate to correct me
> if I'm being an idiot ...
> 
> cheers
>   Ben Bolker
> 
> ---
> rm(list=ls())
> 
> npts=10
> set.seed(1001)
> 
> a =2
> b =0.5
> x= runif(npts)
> y = a*x/(1+a*b*x)+rnorm(npts,sd=0.2)
> 
> gfun <- function(a,b,x) {
> if (a<0 || b<0) stop("bounds violated")
> a*x/(1+a*b*x)
> }
> 
> m1 = nls(y~gfun(a,b,x),algorithm="port",
> lower=c(0,0),start=c(a=1,b=1))
> 
> try(confint(m1))
> 
> 
> for what it's worth, the logic appears to be OK in mle in the stats4
> library:
> --
> library(stats4)
> 
> mfun <- function(a,b,s) {
> if (a<0 || b<0 || s<0) stop("bounds violated")
> -sum(dnorm(y,a*x/(1+a*b*x),sd=s,log=TRUE))
> }
> 
> m2 = mle(minuslogl=mfun,
> start=list(a=1,b=1,s=0.1),
> method="L-BFGS-B",lower=c(0.002,0.002,0.002))
> 
> confint(m2)
> 
> m2b = mle(minuslogl=mfun,
> fixed=list(b=0),start=list(a=1,s=0.1),
> method="L-BFGS-B",lower=c(0.002,0.002,0.002))
> ## set boundary slightly above zero to avoid
> ## boundary cases
> 
> dev <- 2*(-logLik(m2b)+logLik(m2))
> as.numeric(pchisq(dev,lower.tail=FALSE,df=1))
> 
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel



Re: [Rd] nls profiling with algorithm="port" may violate bounds (PR#8508)

2006-01-21 Thread Spencer Graves
Dear Professors Bolker & Ripley:

   Thank you both very much for all your creativity and hard work 
both in your general contributions to human knowledge and specifically 
for helping make R the great thing it is today.  I had not seen a reply 
to that email in several days, so I made time to check it out.  When I 
replicated the error, I thought it my duty to report same.  I know 
that's more appropriate for "r-help" than for "r-devel", but I thought 
such a comment might help someone.  I certainly did NOT want to add to 
someone's workload a requirement to reply to my comment.

   Thanks again,
   spencer graves

Ben Bolker wrote:

> Spencer Graves  pdf.com> writes:
> 
> 
>>Hi, Ben, et al.:
>>
>>The issue Ben identified with confint(nls(... )) generates a hard 
>>failure for me.  
> 
> 
>   "We" (being Brian Ripley and I) know about this already.
>   I'm sorry I failed to specify enough info in my bug report,
> but I was using R-devel/2.3.0 of (I think?) 11 January, under Linux.
> Your problem is actually PR #8428, which is fixed enough to prevent
> a crash in 2.2.1 patched and really fixed in 2.3.0, all thanks to
> Brian Ripley.
> 
>   cheers
> Ben
> 
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel



[Rd] Searching R Packages

2018-01-27 Thread Spencer Graves

Hello, All:


	  Might you have time to review the article I recently posted to 
Wikiversity on "Searching R Packages" 
(https://en.wikiversity.org/wiki/Searching_R_Packages)?



	  Please edit this yourself or propose changes in the associated 
"Discuss" page or in an email to this list or to me.



	  My goal in this is to invite readers to turn that article into a 
proposal for improving the search capabilities in R that would 
ultimately be funded by, e.g., The R Foundation.



  What do you think?


  Please forward this to anyone you think might be interested.


	  Thanks for your contributions to improving the lot of humanity 
through better statistical software.



  Best Wishes,
  Spencer Graves, PhD
  Founder
  EffectiveDefense.org
  7300 W. 107th St. # 506
  Overland Park, KS 66212


[Rd] help(pac=xxx) get data from CRAN?

2018-01-29 Thread Spencer Graves

Hello, All:


  A feature request:


  What about modifying the code for "help(pac=xxx)" so it looks for 
"xxx" on CRAN as well as locally?  If it finds it locally, it uses that 
version but adds an annotation reporting what it found on CRAN:


* If it cannot reach CRAN, it reports nothing on this.
* If it finds it on CRAN and the version numbers match, it reports that.
* If it finds it on CRAN and the local version is behind CRAN, it says, 
"newer version on CRAN".
* If it finds it on CRAN and the local version is ahead of CRAN, it says 
something like, "Local newer than CRAN."
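A rough sketch of how such a check might look ("cran_note" and its message wording are purely illustrative, not a proposed patch to "help"):

```r
## Compare the installed version of a package against CRAN, quietly
## doing nothing if CRAN cannot be reached.
cran_note <- function(pkg, repos = "https://cloud.r-project.org") {
  local_v <- tryCatch(packageVersion(pkg), error = function(e) NULL)
  ap <- tryCatch(available.packages(repos = repos),
                 warning = function(w) NULL, error = function(e) NULL)
  if (is.null(ap) || !(pkg %in% rownames(ap))) return(invisible(NULL))
  cran_v <- package_version(ap[pkg, "Version"])
  if (is.null(local_v))      message(pkg, " is not installed; CRAN has ", cran_v)
  else if (local_v < cran_v) message("newer version on CRAN")
  else if (local_v > cran_v) message("Local newer than CRAN.")
  else                       message("local and CRAN versions match (", cran_v, ")")
  invisible(NULL)
}
```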



  This occurred to me, because someone suggested I update the "sos" 
package to use CRAN to get package information for packages not already 
installed.  It's a great idea, but I'm not ready to do that just yet.



  Best Wishes,
  Spencer Graves


[Rd] Draft proposal for Searching R Packages

2018-02-17 Thread Spencer Graves

Hello, All:


  I just posted a "Draft Proposal for improving the ability of R 
users to search R packages" to Wikiversity 
(https://en.wikiversity.org/wiki/Draft_Proposal_for_improving_the_ability_of_R_users_to_search_R_packages). 




  You are all invited to rewrite it in any way you think is more 
likely to produce the most useful result.  Wikimedia invites 
contributors to "be bold but not reckless", writing from a neutral point 
of view, citing credible sources.  I do NOT want to do this project:  I 
think the world will be better if it is done, and I think others are 
better equipped to actually do it -- or manage others doing it -- than I am.



  If you read this, you will see that it contains critical gaps.  I 
hope one or more of you will fill these critical gaps or help find 
others who will.



  As indicated there, the next major deadline is April 1.  This 
sounds like lots of time, except that the key thing that is missing in 
this draft proposal is principal investigator(s).  Without PI(s), it 
won't fly.



      Thanks,
  Spencer Graves, PhD
  Founder
  EffectiveDefense.org
  7300 W. 107th St. # 506
  Overland Park, KS 66212
ph:  408-655-4567



Re: [Rd] R Bug: write.table for matrix of more than 2,147,483,648 elements

2018-04-18 Thread Spencer Graves



On 2018-04-18 17:38, Steven McKinney wrote:

Hi Colton,

You could divide your write task into chunks that do not violate the 2^31-1 
limit.

write.table has an append argument (default FALSE).

Figure out a row chunk size nri < nr such that nri * nc is under 2^31-1 and use
write.table() to write that out.

Then use
write.table(  append = TRUE, )
for the next chunk of rows, looping over chunks until done.  Two chunks will 
get your 2.8 billion entries done.
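A minimal sketch of that loop ("write_big" and its arguments are illustrative names, not an existing function):

```r
## Write a large matrix in row chunks so no single write.table() call
## sees more than chunk_rows * ncol(m) cells.
write_big <- function(m, file, chunk_rows = 1000L) {
  starts <- seq(1L, nrow(m), by = chunk_rows)
  for (i in seq_along(starts)) {
    rows <- starts[i]:min(starts[i] + chunk_rows - 1L, nrow(m))
    write.table(m[rows, , drop = FALSE], file, sep = ",",
                append    = (i > 1L),   # first chunk creates the file
                col.names = (i == 1L),  # header only once
                row.names = FALSE)
  }
}
```

With, say, two chunks, each write.table() call then sees well under 2^31 - 1 cells.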



  Magnificent:  Is that something that could be implemented inside 
write.table?



  Spencer


Best

Steve



Steven McKinney, Ph.D.

Statistician
Molecular Oncology and Breast Cancer Program
British Columbia Cancer Research Centre





-Original Message-
From: R-devel [mailto:r-devel-boun...@r-project.org] On Behalf Of Tousey,
Colton
Sent: April-18-18 2:08 PM
To: r-c...@r-project.org; simon.urba...@r-project.org; R-devel@r-
project.org
Subject: [Rd] R Bug: write.table for matrix of more than 2, 147, 483, 648
elements

Hello,

I want to report a bug in R that is limiting my capabilities to export a
matrix with write.csv or write.table with over 2,147,483,648 elements (C's
int limit). I found this bug already reported before: 
https://bugs.r-project.org/bugzilla/show_bug.cgi?id=17182. However, there appears to be no
solution or fixes in upcoming R version releases.

The error message is coming from the writetable part of the utils package 
in the io.c source code 
(https://svn.r-project.org/R/trunk/src/library/utils/src/io.c):
/* quick integrity check */
if(XLENGTH(x) != (R_len_t)nr * nc)
    error(_("corrupt matrix -- dims not not match length"));

The issue is that nr*nc is an integer and the size of my matrix, 2.8
billion elements, exceeds C's limit, so the check forces the code to fail.

My version:

R.Version()

$platform
[1] "x86_64-w64-mingw32"

$arch
[1] "x86_64"

$os
[1] "mingw32"

$system
[1] "x86_64, mingw32"

$status
[1] ""

$major
[1] "3"

$minor
[1] "4.3"

$year
[1] "2017"

$month
[1] "11"

$day
[1] "30"

$`svn rev`
[1] "73796"

$language
[1] "R"

$version.string
[1] "R version 3.4.3 (2017-11-30)"

$nickname
[1] "Kite-Eating Tree"

Thank you,
Colton


Colton Tousey
Research Associate II
P: 816.585.0300   E: colton.tou...@kc.frb.org
FEDERAL RESERVE BANK OF KANSAS CITY
1 Memorial Drive   *   Kansas City, Missouri 64198   *
www.kansascityfed.org



Re: [Rd] Spam to R-* list posters

2018-04-19 Thread Spencer Graves



On 2018-04-19 09:40, Martin Maechler wrote:

Serguei Sokol 
 on Thu, 19 Apr 2018 13:29:54 +0200 writes:

  [...]

 > Thanks Tomas for this detailed explanation.

 > I would like also to signal a problem with the list. It must be
 > corrupted in some way because beside the Tomas'  response I've got five
 > or six (so far) dating spam. All of them coming from two emails:
 > Kristina Oliynik  and
 > Samantha Smith .


Well, that's the current ones for you.  They change over time,
and in my experience you get about 10--20 (about once per hour;
on purpose not exactly every 60 minutes) and then it stops.

I've replied to the thread  "Hacked" on R-help yesterday:
   https://stat.ethz.ch/pipermail/r-help/2018-April/452423.html

This has started ca 2 weeks ago on R-help already, and today
we've learned that even  R-SIG-Mixed-Models  is affected.

I think I don't see them anymore at all because my spam filters have adapted.

Note that

1. This is *NOT* from regular mailing list subscribers, and none
of these spam come via the R mailing list servers.

2. It's still a huge pain and disreputable to the R lists of course.

3. I had hoped we could wait and see it go away, but I may be wrong.

4. We have re-started discussing what could be done.

One drastic measure would make mailing list usage
*less* attractive by "munging" all poster's e-mail addresses.

-

For now use your mail providers spam filters to quickly get rid
of this. .. or more interestingly and clearly less legally: use R to
write "mail bombs".  Write an R function sending ca 10 e-mails per
hour randomly to that address   ... ;-)  I did something like
that (with a shell script, not R) at the end of last millennium
when I was younger and the internet was a much much smaller
space than now...



  What about implementing "Mailhide", described in the Wikipedia 
article on "reCAPTCHA"?



  '[F]or example, "mai...@example.com" would be converted to 
"mai...@example.com". The visitor would then click on the "..." and 
solve the CAPTCHA in order to obtain the full email address. One can 
also edit the pop-up code so that none of the address is visible.' 
(https://en.wikipedia.org/wiki/ReCAPTCHA)



  Of course, this is easier for me to suggest, because I'm not in a 
position to actually implement it ;-)



  Spencer Graves


p.s.  I wish again to express my deep appreciation to Martin and the 
other members of the R Core team who have invested so much time and 
creativity into making The R Project for Statistical Computing the 
incredible service it is today.  A good portion of humanity lives better 
today because of problems that would not have been addressed as well 
without important analyses done in R.



Martin



[Rd] svg ignores cex.axis in R3.5.1 on macOS

2018-08-31 Thread Spencer Graves
  Plots produced using svg in R 3.5.1 under macOS 10.13.6 ignore 
cex.axis=2.  Consider the following:



> plot(1:2, cex.axis=2)
> svg('svg_ignores_cex.axis.svg')
> plot(1:2, cex.axis=2)
> dev.off()
> sessionInfo()
R version 3.5.1 (2018-07-02)
Platform: x86_64-apple-darwin15.6.0 (64-bit)
Running under: macOS High Sierra 10.13.6

Matrix products: default
BLAS: 
/Library/Frameworks/R.framework/Versions/3.5/Resources/lib/libRblas.0.dylib
LAPACK: 
/Library/Frameworks/R.framework/Versions/3.5/Resources/lib/libRlapack.dylib


locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8

attached base packages:
[1] stats graphics  grDevices utils datasets  methods base

loaded via a namespace (and not attached):
[1] compiler_3.5.1


  ** The axis labels are appropriately expanded with the first 
"plot(1:2, cex.axis=2)".  However, when I wrote that to an svg file and 
opened it in other applications (GIMP and Safari), the cex.axis request 
was ignored.  This also occurred inside RStudio on my Mac. It worked 
properly using R 3.2.1 under Windows 7.



  Thanks,
  Spencer Graves



Re: [Rd] svg ignores cex.axis in R3.5.1 on macOS

2018-08-31 Thread Spencer Graves




On 2018-08-31 14:21, Spencer Graves wrote:
Plots produced using svg in R 3.5.1 under macOS 10.13.6 ignore 
cex.axis=2.  Consider the following:



> plot(1:2, cex.axis=2)
> svg('svg_ignores_cex.axis.svg')
> plot(1:2, cex.axis=2)
> dev.off()
> sessionInfo()
R version 3.5.1 (2018-07-02)
Platform: x86_64-apple-darwin15.6.0 (64-bit)
Running under: macOS High Sierra 10.13.6

Matrix products: default
BLAS: 
/Library/Frameworks/R.framework/Versions/3.5/Resources/lib/libRblas.0.dylib
LAPACK: 
/Library/Frameworks/R.framework/Versions/3.5/Resources/lib/libRlapack.dylib


locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8

attached base packages:
[1] stats graphics  grDevices utils datasets  methods base

loaded via a namespace (and not attached):
[1] compiler_3.5.1


  ** The axis labels are appropriately expanded with the first 
"plot(1:2, cex.axis=2)".  However, when I wrote that to an svg file 
and opened it in other applications (GIMP and Safari), the cex.axis 
request was ignored.  This also occurred inside RStudio on my Mac. It 
worked properly using R 3.2.1 under Windows 7.



I just confirmed that when I created a file like this under Windows 7 
and brought it back to my Mac, it displayed fine.  I have not tried this 
with the current version of R under Windows 7 nor an old version of R on 
my Mac.  Thanks.  Spencer



  Thanks,
  Spencer Graves



Re: [Rd] svg ignores cex.axis in R3.5.1 on macOS

2018-09-06 Thread Spencer Graves



On 2018-09-06 05:17, Prof Brian Ripley wrote:
> On 06/09/2018 10:47, peter dalgaard wrote:
>> I think this needs to be taken off the bug repository and continued 
>> here. By now it seems pretty clear that this is not an R bug, but a 
>> local problem on Spencer's machine, likely connected to font 
>> configurations.
>
> Or even on R-sig-Mac.
>
>> I poked around a bit on the three Macs that I can access, and found 
>> that fc-match does different things, including throwing warnings, 
>> hanging and even crashing my old MB Air...
>>
>> One possible reason is that it can apparently be installed in 
>> multiple locations, for reasons lost in the mists of time:
>>
>> Peters-iMac:BUILD-dist pd$ ls -l /opt/local/bin/fc-*
>> -rwxr-xr-x  1 root  wheel  44072 Apr  5  2014 /opt/local/bin/fc-cache
>> -rwxr-xr-x  1 root  wheel  43444 Apr  5  2014 /opt/local/bin/fc-cat
>> -rwxr-xr-x  1 root  wheel  34480 Apr  5  2014 /opt/local/bin/fc-list
>> -rwxr-xr-x  1 root  wheel  34928 Apr  5  2014 /opt/local/bin/fc-match
>> -rwxr-xr-x  1 root  wheel  34480 Apr  5  2014 /opt/local/bin/fc-pattern
>> -rwxr-xr-x  1 root  wheel  34008 Apr  5  2014 /opt/local/bin/fc-query
>> -rwxr-xr-x  1 root  wheel  34448 Apr  5  2014 /opt/local/bin/fc-scan
>> -rwxr-xr-x  1 root  wheel  38780 Apr  5  2014 /opt/local/bin/fc-validate
>> Peters-iMac:BUILD-dist pd$ ls -l /opt/X11/bin/fc-*
>> -rwxr-xr-x  1 root  wheel  58128 Oct 26  2016 /opt/X11/bin/fc-cache
>> -rwxr-xr-x  1 root  wheel  57600 Oct 26  2016 /opt/X11/bin/fc-cat
>> -rwxr-xr-x  1 root  wheel  48384 Oct 26  2016 /opt/X11/bin/fc-list
>> -rwxr-xr-x  1 root  wheel  48992 Oct 26  2016 /opt/X11/bin/fc-match
>> -rwxr-xr-x  1 root  wheel  44256 Oct 26  2016 /opt/X11/bin/fc-pattern
>> -rwxr-xr-x  1 root  wheel  44000 Oct 26  2016 /opt/X11/bin/fc-query
>> -rwxr-xr-x  1 root  wheel  44288 Oct 26  2016 /opt/X11/bin/fc-scan
>> -rwxr-xr-x  1 root  wheel  48608 Oct 26  2016 /opt/X11/bin/fc-validate
>> Peters-iMac:BUILD-dist pd$ ls -l /usr/local/bin/fc-*
>> -rwxr-xr-x@ 1 root  wheel  1463900 Oct 21  2008 /usr/local/bin/fc-cache
>> -rwxr-xr-x@ 1 root  wheel  1459780 Oct 21  2008 /usr/local/bin/fc-cat
>> -rwxr-xr-x@ 1 root  wheel  1455628 Oct 21  2008 /usr/local/bin/fc-list
>> -rwxr-xr-x@ 1 root  wheel  1476560 Oct 21  2008 /usr/local/bin/fc-match
>>
>> Notice that these are all different, no links. I guess that the ones 
>> you want are in /opt/X11, presumably installed by XQuartz.
>
> Yes, for the device compiled into the CRAN binary R package. (Other 
> builds may differ.)  On that, the cairo-based devices such as svg() 
> are linked to (current versions on my machine)
>
> /usr/lib/libz.1.dylib (compatibility version 1.0.0, current 
> version 1.2.5)
> /opt/X11/lib/libcairo.2.dylib (compatibility version 11403.0.0, 
> current version 11403.6.0)
> /opt/X11/lib/libpixman-1.0.dylib (compatibility version 35.0.0, 
> current version 35.0.0)
> /opt/X11/lib/libfontconfig.1.dylib (compatibility version 11.0.0, 
> current version 11.2.0)
> ...
>
>
>> So, going out on a limb, I have two ideas:
>>
>> (A) Rebuild the font cache with
>>
>> /opt/X11/bin/fc-cache -vf
>>
>> (B) Check that XQuartz is up to date (possibly reinstall it, even if 
>> it is)
>
> (B) is expected to do (A).  My advice was going to be to reinstall 
> xquartz: macOS updates can partially break it.


   I was going to try that, but I rebooted (again), and now it's 
working.


   I rebooted before I first reported the problem, and I've rebooted 
a couple of times since without success.  This time was different, I 
don't know why.  Before I rebooted this time, I saw "XQuartz" on my 
taskbar / "Dock", switched to it, then clicked on the XQuartz icon in 
upper left and selected "About X11".  This said "XQuartz 2.7.11 
(xorg-server 1.18.4)."  Then I rebooted and restarted RStudio then tried 
svg again with cex.axis=2, and it worked.  Moreover, a web search took 
me to "https://xquartz.en.softonic.com/mac";, which says that the current 
XQuartz for Mac is 2.6.1.  Since I now have 2.7.11 and it's working, I 
think I should leave it alone.


   If anyone wants me to try something further to add to this 
record, I will.  Otherwise, I'll wait:  If the problem recurs, I'll try 
reinstalling XQuartz again, as Professors Dalgaard and Ripley 
suggested.  And if I have another problem with svg and need further 
help, I will consider R-sig-Mac.


   Thanks also to Paul Murrell, who provided several responses to my 
(non)-bug report.


   Spencer Graves
>
>>
>> -pd
>>

Re: [Rd] Stability and reliability of R (comment on Re: as.vector() broken on a matrix or array of type "list")

2018-09-26 Thread Spencer Graves




On 2018-09-26 10:32, MacQueen, Don via R-devel wrote:

With regard to Martin's  comment about the strength of (base) R:

I have R code I wrote 15+ years ago that has been used regularly ever since 
with only a few minor changes needed due to changes in R.  Within that 
code, I find particularly impressive a simple custom GUI, built with the 
tcltk package, that has needed no updates whatsoever in all that time.

Such stability and reliability have been extremely valuable to me.



  How much of R's stability is due to the unit tests encouraged by 
the examples in the help pages, the vast majority of which are run 
repeatedly with each new change?



  More generally, what are the lessons the computer science 
discipline can take from R's experience in this regard?



  I discussed this eight years ago in an article on "Package 
development process" that I posted to Wikipedia; it has attracted 9 
views per day since.  I also added a table discussing this to the 
Wikipedia article on "Software repository".  That article has attracted 
over 300 views per day for at least the past 3 years.  Both articles 
could doubtless be improved by someone more knowledgeable than I am.



  Many thanks and kudos to Ross Ihaka, Bob Gentleman, Martin 
Maechler and the rest of the R Core team, who have managed this project 
so successfully for more than two decades now.



  Spencer Graves


-Don

--
Don MacQueen
Lawrence Livermore National Laboratory
7000 East Ave., L-627
Livermore, CA 94550
925-423-1062
Lab cell 925-724-7509
  
  


On 9/26/18, 12:41 AM, "R-devel on behalf of Martin Maechler" 
 wrote:

[-- most of original message omitted, so as to comment on the following --]
 
 -

 *) {Possibly such an R we would create today would be much closer to
 julia, where every function is generic / a multi-dispatch method
 "a la S4"  and still be blazingly fast, thanks to JIT
 compilation, method caching and more smart things.}
 But as you know one of the strengths of (base) R is its stability
 and reliability.  You can only use something as "the language
 of applied statistics and data science" and rely that published
 code still works 10 years later if the language is not
 changed/redesigned from scratch every few years ((as some ... are)).
 
 __

 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel
 




[Rd] point size in svg

2019-06-19 Thread Spencer Graves

Hello, All:


  I'm unable to control the font size in plots to svg.  Consider 
the following:



svg('cex-svg.svg')
cex. <- 5
plot(1:2, cex.axis=cex.)
text(1:2, 1:2, c('as', 'DF'),
  cex=cex.)
dev.off()


  When I open this in Gimp 2.10.4, the font size is tiny.  I also 
tried:



svg('cex-svg.svg', width=15, height=15, pointsize=24)
cex. <- 5
plot(1:2, cex.axis=cex.)
text(1:2, 1:2, c('as', 'DF'),
  cex=cex.)
dev.off()


      What do I do to control the font size in svg?


  Thanks,
  Spencer Graves


sessionInfo()
R version 3.6.0 (2019-04-26)
Platform: x86_64-apple-darwin15.6.0 (64-bit)
Running under: macOS Mojave 10.14.5

Matrix products: default
BLAS: 
/System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib
LAPACK: 
/Library/Frameworks/R.framework/Versions/3.6/Resources/lib/libRlapack.dylib


locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8

attached base packages:
[1] stats graphics  grDevices utils datasets
[6] methods   base

other attached packages:
[1] Ecdat_0.3-2 Ecfun_0.2-1



Re: [Rd] point size in svg

2019-06-23 Thread Spencer Graves

  Thanks to Peter Langfelder and David Winsemius for their replies.


  It must be Apple-specific.  I transferred the R script with the 
svg files between my Mac and a Windows 7 machine.  The svg files created 
on the Windows machine displayed properly on both machines. The svg 
files created on the Mac showed the same tiny fonts on both machines.



  Following David's suggestion, I tried 
svg('cex-svg-Helvetica.svg', family="Helvetica") and 
svg('cex-svg-serif.svg', family="serif") with otherwise the same script 
as before, with the same results.



  I will repost this to R-sig-Mac.


  Thanks again to Peter and David.
  Spencer Graves


#  From the Windows system:
sessionInfo()
R version 3.5.2 (2018-12-20)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 7 x64 (build 7601) Service Pack 1

Matrix products: default

locale:
  [1] LC_COLLATE=English_United States.1252
[2] LC_CTYPE=English_United States.1252
[3] LC_MONETARY=English_United States.1252
[4] LC_NUMERIC=C
[5] LC_TIME=English_United States.1252

attached base packages:
  [1] stats graphics  grDevices utils datasets
[6] methods   base

loaded via a namespace (and not attached):
  [1] compiler_3.5.2 tools_3.5.2    yaml_2.2.0


On 2019-06-19 11:32, David Winsemius wrote:


On 6/19/19 8:19 AM, Spencer Graves wrote:

[-- original message quoted in full; omitted here, since it appears above --]



I'm unable to reproduce.  (I get very large fonts in all three viewing 
methods: RStudio plot panel, ImageViewer and GIMP.)


Ubuntu 18.04

R 3.15.2

Gimp 2.10.12


Looking at ?svg makes me think you should be looking at the Cairo 
Fonts section of ?X11.



Best;

David.




[-- signature and sessionInfo from the original message omitted --]



[Rd] R-Forge > GitHub?

2019-06-26 Thread Spencer Graves
Hello, All:


   What are the status and future plans for R-Forge?


   I ask primarily because a problem I reported May 15 and 17 via 
two different channels has yet to be fixed, and it prevents my 
development versions of the Ecdat and Ecfun packages from building -- 
because the Windows version cannot find "Matrix";  see below. 
Secondarily, the version of R that R-Forge tried to use earlier today 
was 3.5.3 -- NOT the current version.


   Assuming you recommend migrating to GitHub, do you have a 
preferred procedure?  I found 
"https://gist.github.com/friendly/7269490".  This says it was "Last 
active 2 years ago" but seems to be the most current advice I can find 
on this right now.  That looks complicated, but I assume it preserves 
the edit history on R-Forge -- is that right?


       Thanks,
       Spencer Graves


 Forwarded Message 
Subject:Error : package 'Ecfun' could not be loaded
Date:   Fri, 17 May 2019 18:41:12 -0500
From:   Spencer Graves 
To: r-fo...@r-project.org



Hello:


   Your Windows platform cannot find "Matrix" and other packages.  See:


https://r-forge.r-project.org/R/?group_id=1439&add_log=check_x86_64_windows&pkg=Ecdat&flavor=patched&type=00install.out


   I reported this to your Support tracker two days ago:


https://r-forge.r-project.org/tracker/?atid=194&group_id=34&func=browse


   Can someone please fix this?


   Or is it now the official policy of R-Forge to ask people to go 
someplace else, e.g., GitHub?


   From what I know, the basic design of R-Forge is vastly superior 
to GitHub for packages submitted to CRAN.  However, I've encountered 
numerous reliability problems with R-Forge in recent years.


   Thanks,
   Spencer Graves





Re: [Rd] R-Forge > GitHub?

2019-06-27 Thread Spencer Graves
  Thanks to Duncan, Lionel and Henrik for their quick replies. I 
have further questions:



        1.  Will GitHub automatically transfer the commits I made 
to R-Forge in the past couple of days?  R-Forge is now at Rev. 420, and 
GitHub is still at 418.  Will 419 and 420 be automatically mirrored onto 
"https://github.com/rforge/ecdat" sometime in the next 24 hours or so?  
Is there something easy I can do to force that update?



        2.  Is there a way to make this GitHub version the master?  
It currently says it is a 'Read-only mirror of "ecdat" from r-forge 
SVN.'  I can probably change "r-forge.r-project.org/projects/ecdat" so 
I'm the only one authorized to make changes there and then stop 
committing changes there.  However, before I do that, I'd want to make 
sure I can commit directly to the GitHub version, etc.



        3.  How can I make myself the owner and a contributor for 
the GitHub version?  I'm a "Project Admin" on the R-Forge version, but 
currently no one can make any changes to the GitHub version except via 
R-Forge.  There must be a recommended migration process.



  I could create a separate version of this package on GitHub, but 
all the history would be lost.



  Thanks again,
  Spencer Graves


On 2019-06-26 10:35, Lionel Henry wrote:

On 26 Jun 2019, at 17:25, Duncan Murdoch  wrote:

R-Forge is mirrored on Github; see https://github.com/rforge/ecdat, for 
example.  That shows 418 commits in its history; presumably that's the full 
R-forge history.  I think that's newer than Michael Friendly's gist.

So I suspect (but haven't tried to do this) that migration now is as simple as 
doing a Github fork to your own Github account, and then basically forgetting 
about the R-forge stuff, or deleting it (and I don't know how to do that).

I think it's better to avoid the Fork button in this case, because forks are
treated specially in the Github UI. In this case you'll want your repo to
appear as a main repo, and not a fork. AFAIK the only way to unfork a repo
is to ask the Github staff to do it.

So instead of forking, use the "+" button on github.com and select
"Import a repository". This supports both git and svn repos.

Best,
Lionel




Re: [Rd] R-Forge > GitHub?

2019-06-27 Thread Spencer Graves
  Thanks.  I'm still having problems:


  1.  I went to "github.com" and logged in with my standard 
GitHub account.


  2.  Then I clicked "+" in the upper right, just left of my 
GitHub ID icon, and selected "Import a repository", as Lionel suggested.


  3.  "Your old repository's clone URL" = 
"https://r-forge.r-project.org/projects/ecdat/" with "Name" = "Ecdat".


  ** >> This failed, first giving me a 500 failure 
code, then reporting "Repository creation failed."  When I tried it 
again, I got, "The repository Ecdat already exists on this account."


  What do you suggest I try next?


  Thanks,
  Spencer


On 2019-06-26 12:02, Lionel Henry wrote:
> I think all 3 issues are solved by:
>
> 1. Use the "+" button on github.com and select 
> "Import a repository".
> 2. Pass the URL of your SVN repo.
>
> Lionel
>
>> [-- quoted messages omitted; they repeat earlier messages in this thread --]




Re: [Rd] R-Forge > GitHub?

2019-06-27 Thread Spencer Graves
Hi, Henrik Singmann et al.:


   Thanks for the suggestions.  I tried again to pull 
"https://github.com/sbgraves237/Ecdat" from R-Forge, with the same 
"Error 500" as before.  Then I tried pulling from 
"https://github.com/rforge/ecdat", which seemed to work ... AND the copy 
I pulled was at the latest revisions I had posted to R-Forge (520), so 
that makes it easier going forward.


   What do you suggest I do next?  I'm thinking of the following:


         1.  Clone a copy of "https://github.com/sbgraves237/Ecdat" 
to my local computer and confirm that it works.


         2.  Modify "https://r-forge.r-project.org/projects/ecdat/" 
to make me the only remaining project member, if I can.


         3.  Contact GitHub support and ask them if they can delete 
"https://github.com/rforge/ecdat", because it is an orphan with 0 
contributors, and anyone who might want it should be referred to 
"https://github.com/sbgraves237/Ecdat".


      4.  Email all the previous project members on 
"https://r-forge.r-project.org/projects/ecdat/" to tell them what I've 
done, in case they want to do anything more with this in the future.


   I believe I know how to do 1, 2, and 4, and I can probably figure 
out 3.  However, before I start on this, I felt a need to thank everyone 
who contributed to this thread and invite comments, especially if 
someone thinks I might be better off doing something different.


   Spencer Graves


On 2019-06-26 16:34, Henrik Singmann wrote:
> Whereas it is true that one has to contact GitHub to detach a GitHub 
> repository, it really is no problem (or at least was no problem in 
> 2016). I wanted to do so when I took over the maintainer role of 
> LaplacesDemon which only remained on GitHub as a fork on some other 
> person's private account. So I forked and then contacted 
> GitHub support and simply asked them to remove the "forked form" 
> reference on my new repository. They then quickly detached my 
> repository. As you can see, the "forked from" is gone: 
> https://github.com/LaplacesDemonR/LaplacesDemon
>
> In their response to my request they used the phrasing "Fork is 
> detached." which suggests that this is their preferred term for this 
> step.
>
> Best,
> Henrik
>
>
>
> On Wed, 26 Jun 2019 at 16:38, Lionel Henry wrote:
>
> [-- quoted message omitted; it repeats Lionel's reply earlier in this thread --]
>
>
> -- 
> Dr. Henrik Singmann
> Assistant Professor, Department of Psychology
> University of Warwick, UK
> http://singmann.org




Re: [Rd] R-Forge > GitHub?

2019-06-28 Thread Spencer Graves
Thanks to Duncan, Henrik and Henrik, Brian, and Gábor:


   I created a local copy of the new GitHub version using the 
following:

git clone https://sbgraves237:mypassw...@github.com/sbgraves237/Ecdat.git



   That worked in the sense that I got a local copy.  However, after 
I rolled the version number and did "git commit" on the DESCRIPTION 
files, my "git push" command generated the following:


remote: Invalid username or password.
fatal: Authentication failed for 
'https://sbgraves237:mypassw...@github.com/sbgraves237/Ecdat.git/'


   What am I missing?  [Note:  I used my actual GitHub password in 
place of "mypassword", and the "Authentication failed" message echoed 
that password back.]


   Thanks,
   Spencer


p.s.  I'm doing this under macOS Mojave 10.14.5.  Also,  I added 
".onAttach" functions to the R-Forge versions as Brian G. Peterson 
suggested.  That seemed to work fine.


On 2019-06-28 07:13, Duncan Murdoch wrote:
> On 28/06/2019 6:26 a.m., Gábor Csárdi wrote:
>
>> Instead, you can do as Duncan suggested, and put a README in your 
>> R-Forge
>> repository, that points to *your* GitHub repositor(y/ies). Then the
>> https://github.com/rforge/ecdat read only mirror will pick this up 
>> and will
>> point there as well.
>
> Just for the record:  that was Henrik Singmann's suggestion, I just 
> agreed with it.
>
> Duncan Murdoch
>




Re: [Rd] R-Forge > GitHub?

2019-06-29 Thread Spencer Graves
Hi, Ott et al.:


   What's the best way to get "Travis CI" to build and test the two 
packages, Ecdat and Ecfun, that have long been combined in the Ecdat 
project?


   Following Ott's advice and studying Wickham's "R 
Packages" (http://r-pkgs.had.co.nz/), I was able to configure RStudio so 
it would sync using git with "GitHub.com/sbgraves237/Ecdat".  However, 
when I tried to configure "Travis CI", it said, "No DESCRIPTION file 
found, user must supply their own install and script steps".


       Earlier in this thread, I think someone suggested I make the 
Ecdat and Ecfun packages separate projects on GitHub (though I can't 
find that suggestion now).  This would not be an issue if it were all 
local without version control.  With RStudio managing my interface with 
GitHub, it now seems quite tricky.


   Suggestions?
   Thanks again to all who have offered suggestions so far.  This 
migration from R-Forge to GitHub seems complete except for the automatic 
tests provided via "Travis CI".


   Spencer
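
[Editorial note: the "No DESCRIPTION file found" message is Travis's R support saying it expects a DESCRIPTION at the repository root, while here both packages live under pkg/.  Once each package sits in its own repository with DESCRIPTION at the top level, a minimal configuration would be roughly the following -- a sketch, not from the thread:]

```yaml
# .travis.yml for an R package whose DESCRIPTION is at the repo root
language: r
cache: packages
```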


On 2019-06-28 22:25, Ott Toomet wrote:
> Apparently your username/password are wrong.  Can you clone/push from 
> other repos?
>
> You do not need authorization when cloning a public repo, so even 
> incorrect credentials may work (haven't tested this though).  But for 
> push you have to have that in order.
>
> I suggest you create ssh keys, upload those to GH, and use ssh 
> authorization instead of https.
>
> Cheers,
> Ott
>
> On Fri, Jun 28, 2019 at 8:18 PM Spencer Graves 
> mailto:spencer.gra...@prodsyse.com>> wrote:
>
> Thanks to Duncan, Henrik and Henrik, Brian, and Gábor:
>
>
>    I created a local copy of the new GitHub version using the
> following:
>
> git clone
> https://sbgraves237:mypassw...@github.com/sbgraves237/Ecdat.git
>
>
>
>    That worked in the sense that I got a local copy. However,
> after
> I rolled the version number and did "git commit" on the DESCRIPTION
> files, my "git push" command generated the following:
>
>
> remote: Invalid username or password.
> fatal: Authentication failed for
> 'https://sbgraves237:mypassw...@github.com/sbgraves237/Ecdat.git/'
>
>
>    What am I missing?  [Note:  I used my actual GitHub
> password in
> place of "mypassword" here, and this "Authentication failed" message
> reported the GitHub password I used here.]
>
>
>    Thanks,
>    Spencer
>
>
> p.s.  I'm doing this under macOS Mojave 10.14.5.  Also,  I added
> ".onAttach" functions to the R-Forge versions as Brian G. Peterson
> suggested.  That seemed to work fine.
>
>
> On 2019-06-28 07:13, Duncan Murdoch wrote:
> > On 28/06/2019 6:26 a.m., Gábor Csárdi wrote:
> >
> >> Instead, you can do as Duncan suggested, and put a README in your
> >> R-Forge
> >> repository, that points to *your* GitHub repositor(y/ies). Then the
> >> https://github.com/rforge/ecdat read only mirror will pick this up
> >> and will
> >> point there as well.
> >
> > Just for the record:  that was Henrik Singmann's suggestion, I just
> > agreed with it.
> >
> > Duncan Murdoch
> >
>
>
>         [[alternative HTML version deleted]]
>
> __
> R-devel@r-project.org <mailto:R-devel@r-project.org> mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel
>




Re: [Rd] R-Forge > GitHub?

2019-06-29 Thread Spencer Graves

Hi, Henrik et al.:


  What's your favorite documentation on how to make two GitHub 
projects from one containing two packages?



  Currently, "github.com/sbgraves237/Ecdat" consists primarily of a 
directory "pkg" with subdirectories "Ecdat" and "Ecfun" containing the 
two packages.  I need to know how to do the following:



        1.  Extract "github.com/sbgraves237/Ecdat/pkg/Ecfun" to 
create  "github.com/sbgraves237/Ecfun".



     2.  Elevate "github.com/sbgraves237/Ecdat/pkg/Ecdat" to 
"github.com/sbgraves237/Ecdat", discarding the other files in the 
original "github.com/sbgraves237/Ecdat/".



  This sounds like it could be accomplished relatively easily by 
someone with sufficient understanding of "git" and GitHub.  I could use 
suggestions on how to do this -- or at least pointers to relevant 
documentation.



      Thanks,
      Spencer
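
[Editorial note: not from the thread, but one way to do the split described above is with the built-in (though now deprecated) `git filter-branch`; the external `git filter-repo` tool is the modern replacement.  A sketch, assuming a fresh clone and that an empty target repository `sbgraves237/Ecfun` already exists on GitHub:]

```shell
# Work on a throwaway clone: filter-branch rewrites history in place.
git clone https://github.com/sbgraves237/Ecdat.git Ecfun-split
cd Ecfun-split

# Rewrite all refs so that pkg/Ecfun becomes the repository root,
# keeping only the commits that touched it.
FILTER_BRANCH_SQUELCH_WARNING=1 \
  git filter-branch -f --subdirectory-filter pkg/Ecfun -- --all

# Point the clone at the new, empty repository and push.
git remote set-url origin https://github.com/sbgraves237/Ecfun.git
git push -u origin master
```

The same recipe with `--subdirectory-filter pkg/Ecdat` yields the other package; the commit history is preserved in both.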


On 2019-06-29 14:09, Henrik Bengtsson wrote:

On Sat, Jun 29, 2019 at 9:43 AM Spencer Graves
 wrote:

Hi, Ott et al.:


What's the best way to get "Travis CI" to build and test the two
packages, Ecdat and Ecfun, that have long been combined in the Ecdat
project?


Following Ott's advice and studying Wickham's "R
Packages" (http://r-pkgs.had.co.nz/), I was able to configure RStudio so
it would sync using git with "GitHub.com/sbgraves237/Ecdat".  However,
when I tried to configure "Travis CI", it said, "No DESCRIPTION file
found, user must supply their own install and script steps".


Earlier in this thread, I think someone suggested I make the
Ecdat and Ecfun packages separate projects on GitHub (though I can't
find that suggestion now).  This would not be an issue if it were all
local without version control.  With RStudio managing my interface with
GitHub, it now seems quite tricky.

I'm 99.999% confident that your life will be much much easier if you
keep one R package per repository.  If you don't, you'll probably be
very lonely when it comes to tools etc.  There are built-in 'git'
commands, and also git utility tools, for extracting a subset of
folders/files from a git repository into new git repositories.  You'll
still preserve the commit history.  I would deal with this in the
terminal, using the 'git' client and possibly some extraction tool.

Also, while you spend time on this, have a look at the commit
authorship that I mentioned previously.  It's nice to have that in
place later.

After you get the above in place, .travis.yml and appveyor.yml are
pretty straightforward (might even be a copy'n'paste).

Finally, I saw you put your credentials in the URL when you cloned.  I
don't think that's safe, your GitHub credentials will be stored in the
./.git/config file.  Instead, just clone with:

git clone https://github.com/sbgraves237/Ecdat.git

You can then configure git to cache your HTTPS credentials for a
certain time, e.g. 120 minutes, so you don't have to enter them each
time you pull/push.  See https://git-scm.com/docs/git-credential-cache
for details.  That's what I tell new-comers to Git(Hub|Lab|...) to
use.  Personally, I add my public SSH key to GitHub and then clone
with the ssh protocol:

git clone g...@github.com:sbgraves237/Ecdat.git

That way I never have to worry about entering my credentials.

/Henrik
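
[Editorial note: Henrik's caching suggestion, as a concrete sketch; the two-hour timeout is an illustrative choice, not from the thread:]

```shell
# Cache HTTPS credentials in memory so git stops re-prompting;
# the timeout is in seconds (here, two hours).
git config --global credential.helper 'cache --timeout=7200'

# Confirm the setting took effect.
git config --global credential.helper
```

After this, the first pull/push still asks for the password; later ones within the timeout reuse it.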



[-- remainder of quoted messages omitted; they repeat earlier messages in this thread --]

Re: [Rd] R-Forge > GitHub?

2019-06-30 Thread Spencer Graves




On 2019-06-30 06:58, Joshua Ulrich wrote:



I imported both packages into separate repositories:
https://github.com/joshuaulrich/tmp-ecdat
https://github.com/joshuaulrich/tmp-ecfun

I changed your email address on your R-Forge commits to match your
GitHub email address, so R-Forge commits would be associated with your
GitHub account.  I also omitted the "move" commit from Ecdat, and the
"obsolete > GitHub" commits from both packages.  I've attached a file
with the commands I used, if anyone is interested.

You can use my repos by cloning them to your local machine, adding
your repos as new remotes, and pushing to them.  You would need to run
these commands (untested):

### clone my GitHub repo to your machine
git clone g...@github.com:joshuaulrich/tmp-ecfun.git Ecdat



Thanks so much.  Sadly, I'm still having troubles.  This "git clone ..." 
generates:



Enter passphrase for key '/Users/sbgraves/.ssh/id_rsa':


  I don't know the passphrase it's looking for here, and I 
don't know how to find what it's looking for.  Under GitHub > Settings > 
"SSH and GPG keys", I see an SSH key dated two days ago, when I cloned 
Ecdat from within RStudio.  And in "~/.ssh" I see files id_rsa and 
id_rsa.pub, both created two days ago.



  What do you suggest I try to get past this?


  Thanks again for all your help.


  Spencer Graves
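
[Editorial note: not advice from the thread, but two standard ways past a passphrase prompt with stock OpenSSH -- load the key into ssh-agent so the passphrase is entered only once per session, or generate a fresh key with an empty passphrase and register its .pub file with GitHub.  The key file names are assumptions:]

```shell
# Option 1: hold the decrypted key in ssh-agent for this session.
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_rsa            # prompts for the passphrase once

# Option 2: create a new key with no passphrase (-N ''), then paste
# ~/.ssh/id_ed25519_github.pub into GitHub > Settings > SSH and GPG keys.
ssh-keygen -t ed25519 -N '' -f ~/.ssh/id_ed25519_github
```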


cd Ecdat
### rename my GitHub repo remote from 'origin' to 'tmp'
git remote rename origin tmp
### add your GitHub repo remote as 'origin'
### NOTE: this should be a new, clean repo.
### Rename your existing 'Ecdat' so you don't overwrite it
git remote add origin https://github.com/sbgraves237/Ecdat
### push to your GitHub repo
git push -u origin master

Then you need to run similar commands for Ecfun.

Best,
Josh


Thanks,
Spencer


On 2019-06-29 14:09, Henrik Bengtsson wrote:

On Sat, Jun 29, 2019 at 9:43 AM Spencer Graves
 wrote:

Hi, Ott et al.:


 What's the best way to get "Travis CI" to build and test the two
packages, Ecdat and Ecfun, that have long been combined in the Ecdat
project?


 Following Ott's advice and studying studying Wickham's "R
Packages" (http://r-pkgs.had.co.nz/), I was able to configure RStudio so
it would sync using git with "GitHub.com/sbgraves237/Ecdat".  However,
when I tried to configure "Travis CI", it said, "No DESCRIPTION file
found, user must supply their own install and script steps".


 Earlier in this thread, I think someone suggested I make the
Ecdat and Ecfun packages separate projects on GitHub (though I can't
find that suggestion now).  This would not be an issue if it were all
local without version control.  With RStudio managing my interface with
GitHub, it now seems quite tricky.

I'm 99.999% confident that your life will be much much easier if you
keep one R package per repository.  If you don't, you'll probably be
very lonely when it comes to tools etc.  There are built-in 'git'
commands, but also git utility tools, for extracting a subset of
folders/files from git repository into new git repositories.  You'll
still preserve the commit history.  I would deal with this in the
terminal, using the 'git' client and possible some extraction tool.

Also, while you spend time on this, have a look at the commit
authorship that I mentioned previously.  It's nice to have that in
place later.

After you got the above in place, then .travis.yml and appveyor.yml is
pretty straightforward (might even be a copy'n'paste).

Finally, I saw you put your credentials in the URL when you cloned.  I
don't think that's safe, your GitHub credentials will be stored in the
./.git/config file.  Instead, just clone with:

git clone https://github.com/sbgraves237/Ecdat.git

You can then configure git to cache your HTTPS credentials for a
certain time, e.g. 120 minutes, so you don't have to enter them each
time you pull/push.  See https://git-scm.com/docs/git-credential-cache
for details.  That's what I tell new-comers to Git(Hub|Lab|...) to
use.  Personally, I add my public SSH key to GitHub and then clone
with the ssh protocol:

git clone g...@github.com:sbgraves237/Ecdat.git

That way my I never have to worry entering my credentials.

/Henrik
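The credential-cache suggestion above amounts to one config line (7200 seconds below is the 120 minutes mentioned; HOME is pointed at a scratch directory here only so no real ~/.gitconfig is touched):

```shell
# Hypothetical sketch: enable git's in-memory credential cache.
export HOME=$(mktemp -d)            # scratch HOME; drop this to change your real config
git config --global credential.helper 'cache --timeout=7200'
git config --global credential.helper    # prints: cache --timeout=7200
```

After this, git asks for the HTTPS username/password once and reuses them for subsequent pull/push until the timeout expires.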


 Suggestions?
 Thanks again to all who have offered suggestions so far.  This
migration from R-Forge to GitHub seems complete except for the automatic
tests provided via "Travis CI".


 Spencer


On 2019-06-28 22:25, Ott Toomet wrote:

Apparently your username/password are wrong.  Can you clone/push from
other repos?

You do not need authorization when cloning a public repo, so even
incorrect credentials may work (haven't tested this though).

Re: [Rd] R-Forge > GitHub?

2019-07-03 Thread Spencer Graves
   Thanks so much for your help.


   Now your "git push -u origin master" was rejected ("! [rejected]") 
after creating a new SSH key, even though your "git clone" and other 
"git remote rename ..." commands seemed to work:


$ git clone git@github.com:joshuaulrich/tmp-ecfun.git Ecdat
# Cloning into 'Ecdat'... done.

$ cd Ecdat/
$ git remote rename origin tmp
$ git remote add origin https://github.com/sbgraves237/Ecdat
$ git push -u origin master
#[Username & password OK]
To https://github.com/sbgraves237/Ecdat
  ! [rejected]    master -> master (fetch first)
error: failed to push some refs to 'https://github.com/sbgraves237/Ecdat'
hint: Updates were rejected because the remote contains work that you do
hint: not have locally. This is usually caused by another repository pushing
hint: to the same ref. You may want to first integrate the remote changes
hint: (e.g., 'git pull ...') before pushing again.
hint: See the 'Note about fast-forwards' in 'git push --help' for details.
SpenceravessMBP:Ecdat sbgraves$


   Suggestions?
   Thanks again,
   Spencer Graves
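The "fetch first" rejection above means the GitHub repository already holds commits (e.g. an auto-created README) that the local history lacks.  One hypothetical way out is to merge the unrelated histories and push again; the sketch below reproduces the situation with two throw-away local repositories standing in for GitHub and the R-Forge import:

```shell
# Hypothetical sketch; hub.git stands in for github.com/sbgraves237/Ecdat
# and "local" for the imported R-Forge history.
export GIT_AUTHOR_NAME=demo GIT_AUTHOR_EMAIL=demo@example.org
export GIT_COMMITTER_NAME=demo GIT_COMMITTER_EMAIL=demo@example.org
work=$(mktemp -d) && cd "$work"
git init -q --bare hub.git
git clone -q hub.git seed 2>/dev/null
(cd seed && git commit -q --allow-empty -m "GitHub-side README" \
         && git push -q origin HEAD:master)
git init -q local && cd local
git commit -q --allow-empty -m "imported R-Forge history"
git remote add origin ../hub.git
git push -q origin HEAD:master 2>/dev/null \
  || echo "rejected (fetch first), as in the post"
git fetch -q origin                     # integrate the remote commits,
git merge -q --allow-unrelated-histories -m "merge GitHub-side commits" origin/master
git push -q origin HEAD:master && echo "push succeeded"
```

If the remote commits are disposable (a freshly created repo), `git push --force` would overwrite them instead of merging, at the cost of discarding them.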


On 2019-07-01 01:05, Ott Toomet wrote:
> Apparently you created the id_rsa key pair with a passphrase.  A 
> passphrase is like an additional password protection layer on your ssh 
> key.  I don't know how you created it, but you can always create a new 
> one (delete the old one first) using the shell command 'ssh-keygen'.  
> It asks for a passphrase; just press Enter twice for an empty 
> passphrase.  You also have to update the ssh public key (id_rsa.pub) 
> on GitHub by supplying the new public key.
>
> There are some implications you should be aware of:
> * if you delete id_rsa*, you cannot use any ssh authorization that 
> relies on this key any more (that's why you have to update it on GH).  
> From what you write ("... created 2 days ago") I guess you do not 
> use these keys elsewhere, but I may be wrong.
> * if you supply empty passphrase, you bypass the optional extra 
> security layer.  I think this is OK for open source software 
> development on your personal computer but your preferences/situation 
> may differ.
> * You cannot use the same keys with a passphrase if they were created 
> without one.  This is likely not an issue, but if it turns out to be a 
> problem, you can either add a passphrase to the default keys or create 
> another set of keys, passphrase protected.
>
> Cheers,
> Ott
>
>
> On Sun, Jun 30, 2019 at 9:51 PM Spencer Graves wrote:
>
>
>
> On 2019-06-30 06:58, Joshua Ulrich wrote:
> 
>
> > I imported both packages into separate repositories:
> > https://github.com/joshuaulrich/tmp-ecdat
> > https://github.com/joshuaulrich/tmp-ecfun
> >
> > I changed your email address on your R-Forge commits to match your
> > GitHub email address, so R-Forge commits would be associated
> with your
> > GitHub account.  I also omitted the "move" commit from Ecdat,
> and the
> > "obsolete > GitHub" commits from both packages.  I've attached a
> file
> > with the commands I used, if anyone is interested.
> >
> > You can use my repos by cloning them to your local machine, adding
> > your repos as new remotes, and pushing to them.  You would need
> to run
> > these commands (untested):
> >
> > ### clone my GitHub repo to your machine
> > git clone git@github.com:joshuaulrich/tmp-ecfun.git Ecdat
>
>
> Thanks so much.  Sadly, I'm still having troubles.  This "git
> clone ..."
> generates:
>
>
> Enter passphrase for key '/Users/sbgraves/.ssh/id_rsa':
>
>
>    Sadly, I don't know the passphrase it's looking for here,
>     and I
> don't know how to find what it's looking for.  Under GitHub >
> Settings >
> "SSH and GPG keys", I see an SSH key dated two days ago, when I
> cloned
> Ecdat from within RStudio.  And in "~/.ssh" I see files id_rsa and
> id_rsa.pub, both created two days ago.
>
>
>    What do you suggest I try to get past this?
>
>
>    Thanks again for all your help.
>
>
>    Spencer Graves
>
> > cd Ecdat
> > ### rename my GitHub repo remote from 'origin' to 'tmp'
> > git remote rename origin tmp
> > ### add your GitHub repo remote as 'origin'
> > ### NOTE: this should be a new, clean repo.
>

[Rd] .travis.yml ... most likely included in error

2019-07-14 Thread Spencer Graves

Hello:


  Suggestions for whoever maintains "R CMD":


        1.  Can you change it so it doesn't complain about the 
presence of ".travis.yml", at least on GitHub?



        2.  What do you suggest people do to find error messages in 
the output?  I ask, because I'm getting "build failing" from 
travis-ci.org, but I can't see what failed in the following:



https://travis-ci.org/sbgraves237/Ecdat/builds/558528361?utm_medium=notification&utm_source=email


          https://api.travis-ci.org/v3/job/558528362/log.txt


  Or are these just Travis-CI problems?  If so, what would you 
suggest they do?



  Thanks,
  Spencer Graves

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] .travis.yml ... most likely included in error

2019-07-14 Thread Spencer Graves
Hi, Danny et al.:


   Thanks.  I found ".Rbuildignore" on my local copy that was, 
however, not tracked by git.  I added that, then committed and pushed 
the change.


   Obviously, I'm new to Travis and am still learning how to read 
and understand what it's saying -- especially how to find "warnings, 
treating as errors".


   Thanks again,
   Spencer Graves


On 2019-07-14 10:08, Danny Smith wrote:
> Hi Spencer,
>
> To get rid of the .travis.yml note, add a .Rbuildignore file with this 
> line:
> ^\.travis\.yml$
> This will exclude the file from the build.
>
> The build is failing because of a warning. As noted in the log, Travis 
> is treating a warning as an error:
> Found warnings, treating as errors
> It's a bit hard to find the warning in the logs because of all the 
> pdfTeX output but it's a warning about uncompressed datasets on line 3179:
> https://travis-ci.org/sbgraves237/Ecdat/builds/558528361#L3179
>
> You could try resaving your datasets, there are a couple of 
> suggestions here:
> https://stackoverflow.com/questions/32605623/how-to-compress-saves-in-r-package-build/47074811
>
> Cheers,
> Danny
>
>
> On Mon., 15 Jul. 2019, 00:32 Spencer Graves wrote:
>
> Hello:
>
>
>    Suggestions for whoever maintains "R CMD":
>
>
>      1.  Can you change it so it doesn't complain about the
> presence of ".travis.yml", at least on GitHub?
>
>
>      2.  What do you suggest people do to find error
> messages in
> the output?  I ask, because I'm getting "build failing" from
> travis-ci.org, but I can't see what failed
> in the following:
>
>
> 
> https://travis-ci.org/sbgraves237/Ecdat/builds/558528361?utm_medium=notification&utm_source=email
>
>
> https://api.travis-ci.org/v3/job/558528362/log.txt
>
>
> Or are these just Travis-CI problems?  If so, what would you
> suggest they do?
>
>
>    Thanks,
>    Spencer Graves
>
>





Re: [Rd] R-Forge > GitHub?

2019-07-14 Thread Spencer Graves
  Thanks to Ott and others, I now have separate GitHub 
repositories, one for each of the packages combined in the Ecdat R-Forge 
project.  In case it might help others in the future, I will summarize 
here key things I did to make this transition:



        1.  I first copied the "Ecfun" package into its own 
directory on my local computer and created a separate GitHub repository 
for that.  I lost the history in doing so, but I can live without that 
history.



        2.  I moved the contents of "~Ecdat/pkg/Ecdat" to "~Ecdat" 
and deleted the now-empty "pkg/Ecdat" subdirectory.  I first tried to 
do this in RStudio, but wasn't sure it was done correctly.  So I used 
"git reset --hard HEAD" to revert all that.  Then I copied the material 
in Finder on my Mac, so I could see what I was doing.  Then I did "git 
add" of the individual files and folders in a Terminal, plus "git rm -r 
pkg" to delete the old location.



        3.  Then I set up automatic checking for both packages 
using Travis CI as described by Hadley's "R Packages" 
(http://r-pkgs.had.co.nz/check.html).  [GitHub complained that 
".travis.yml" didn't belong there.  Danny Smith on Rd told me to add it 
to ".Rbuildignore".  I found it was already there, but ".Rbuildignore" 
was not part of the repository.  Now it is.]
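The .Rbuildignore entry in question, "^\.travis\.yml$", is a regular expression rather than a plain file name: the anchors and escaped dots make it match exactly ".travis.yml" and nothing else.  This can be illustrated with grep -E (R itself matches such patterns against file paths in the source tree; grep here is only an illustration):

```shell
# Illustration only: show what the .Rbuildignore regex does and does not match.
echo '.travis.yml'  | grep -E '^\.travis\.yml$'              # prints: .travis.yml
echo 'x.travis.yml' | grep -E '^\.travis\.yml$' || echo "no match"
```

Without the `^` anchor, the pattern would also exclude any file whose path merely ends in ".travis.yml".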



        4.  Along the way, GitHub kept asking for my username and 
password, even though I had established SSH authentication.  I traced 
the problem to the ".git/config" that said, "url = 
https://github.com/sbgraves237/Ecdat".  I changed that line to read "url 
= https://sbgraves237:passw...@github.com/sbgraves237/Ecdat" (where 
"password" is my GitHub password, which I had to change to make it work 
there, because it included "@" ;-)  oops.



        5.  I also had problems with "Warning: ‘inst/doc’ files":  
Those had not existed in the R-Forge versions but appeared somehow in 
migrating them to GitHub.  I deleted them in a Terminal with "git rm -r 
inst/doc".  After "git commit" and "git push", I found they had been 
deleted from the GitHub repository but not my local computer, so I 
deleted them locally -- without any apparent side effects.



  Thanks again,
  Spencer Graves


On 2019-07-03 23:30, Spencer Graves wrote:

    Thanks so much for your help.


    Now your "git push -u origin master" was rejected ("! [rejected]") 
after creating a new SSH key, even though your "git clone" and other 
"git remote rename ..." commands seemed to work:


$ git clone git@github.com:joshuaulrich/tmp-ecfun.git Ecdat
# Cloning into 'Ecdat'... done.

$ cd Ecdat/
$ git remote rename origin tmp
$ git remote add origin https://github.com/sbgraves237/Ecdat
$ git push -u origin master
#[Username & password OK]
To https://github.com/sbgraves237/Ecdat
   ! [rejected]    master -> master (fetch first)
error: failed to push some refs to 'https://github.com/sbgraves237/Ecdat'
hint: Updates were rejected because the remote contains work that you do
hint: not have locally. This is usually caused by another repository pushing
hint: to the same ref. You may want to first integrate the remote changes
hint: (e.g., 'git pull ...') before pushing again.
hint: See the 'Note about fast-forwards' in 'git push --help' for details.
SpenceravessMBP:Ecdat sbgraves$


    Suggestions?
    Thanks again,
    Spencer Graves


On 2019-07-01 01:05, Ott Toomet wrote:

Apparently you created the id_rsa key pair with a passphrase.  A
passphrase is like an additional password protection layer on your ssh
key.  I don't know how you created it, but you can always create a new
one (delete the old one first) using the shell command 'ssh-keygen'.
It asks for a passphrase; just press Enter twice for an empty
passphrase.  You also have to update the ssh public key (id_rsa.pub)
on GitHub by supplying the new public key.

There are some implications you should be aware of:
* if you delete id_rsa*, you cannot use any ssh authorization that
relies on this key any more (that's why you have to update it on GH).
From what you write ("... created 2 days ago") I guess you do not
use these keys elsewhere, but I may be wrong.

* if you supply empty passphrase, you bypass the optional extra
security layer.  I think this is OK for open source software
development on your personal computer but your preferences/situation
may differ.
* You cannot use the same keys with a passphrase if they were created
without one.  This is likely not an issue, but if it turns out to be a
problem, you can either add a passphrase to the default keys or create
another set of keys, passphrase protected.

Cheers,
Ott
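Ott's recipe, written out as commands.  The key is written to a scratch directory here so no real ~/.ssh/id_rsa is overwritten; drop -f and -N to be prompted interactively as he describes:

```shell
# Hypothetical sketch of the key regeneration described above.
demo=$(mktemp -d)
ssh-keygen -q -t rsa -N "" -f "$demo/id_rsa"     # -N "" = empty passphrase
head -c 7 "$demo/id_rsa.pub"; echo               # prints: ssh-rsa
# paste the full id_rsa.pub under GitHub > Settings > "SSH and GPG keys",
# then check the key with:  ssh -T git@github.com
```

With an empty passphrase, git over SSH stops prompting entirely; the trade-off is the weaker protection Ott describes.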

[Rd] GitHub passwords in .git/config?

2019-07-15 Thread Spencer Graves




On 2019-07-15 10:56, Dirk Eddelbuettel wrote:




Don't write passwords down like this. Your error is likely in expecting _ssh_
authentication over _https_ -- when it works only over ssh. Use the alternate
form for a remote, e.g. one that looks like git@github.com:emacs-ess/ESS.git



  I'm confused.  I changed that line to:


            url = https://git@github.com:sbgraves237/sbgraves237/Ecdat


  Then when I did "git pull" I got:


fatal: unable to access 
'https://git@github.com:sbgraves237/sbgraves237/Ecdat/': Port number 
ended with 's'



      ???
      Thanks,
      Spencer


Hth, Dirk





Re: [Rd] GitHub passwords in .git/config?

2019-07-15 Thread Spencer Graves

I'm diverging:  Now I get:


>>> git pull
ssh: Could not resolve hostname github.com:sbgraves237: nodename nor 
servname provided, or not known

fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.


  ** With .git/config as follows:


[core]
    repositoryformatversion = 0
    filemode = true
    bare = false
    logallrefupdates = true
    ignorecase = true
    precomposeunicode = true
[remote "origin"]
    url = ssh://git@github.com:sbgraves237/Ecdat.git
    fetch = +refs/heads/*:refs/remotes/origin/*
[branch "master"]
    remote = origin
    merge = refs/heads/master


  I have an SSH key on my GitHub account, which says it was "Added 
on Jul 3, 2019 Last used within the last 2 weeks — Read/write".



  Should I delete my current local copies and clone them fresh from 
GitHub?

  Spencer


On 2019-07-15 12:01, Brian G. Peterson wrote:

it would be:

ssh://git@github.com:sbgraves237/Ecdat.git


On Mon, 2019-07-15 at 11:41 -0500, Spencer Graves wrote:

On 2019-07-15 10:56, Dirk Eddelbuettel wrote:




Don't write passwords down like this. Your error is likely in
expecting _ssh_
authentication over _https_ -- when it works only over ssh. Use the
alternate
form for a remote, e.g. one that looks like git@github.com:emacs-ess/ESS.git

I'm confused.  I changed that line to:


  url =
https://git@github.com:sbgraves237/sbgraves237/Ecdat


Then when I did "git pull" I got:


fatal: unable to access
'https://git@github.com:sbgraves237/sbgraves237/Ecdat/': Port number
ended with 's'


???
Thanks,
Spencer


Hth, Dirk




Re: [Rd] GitHub passwords in .git/config?

2019-07-15 Thread Spencer Graves

Thanks, Marcel:


  That did it.


  My next challenge is to replicate it on a Windows 10 machine.


  Spencer


On 2019-07-15 12:54, Marcel Ramos wrote:

Hi Spencer,

The first line in the `[remote "origin"]` section should read:

```

url = git@github.com:sbgraves237/Ecdat.git

```

Generally, I add these configs by doing a clone on the command line such as:


git clone git@github.com:sbgraves237/Ecdat.git

so that I don't have to mess with the config file.


Best,

Marcel
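The remote forms that appear in this thread can be compared side by side on a throw-away repo.  The scp-like form uses ':' after the host; the ssh:// URL form must use '/' instead, because after ':' git treats what follows as a port number, which is what produced the "Port number ended with 's'" and "Could not resolve hostname" errors above:

```shell
# Illustration on a scratch repo; all three set-url calls below are
# syntactically valid remotes for the same repository.
cd "$(mktemp -d)" && git init -q demo && cd demo
git remote add origin git@github.com:sbgraves237/Ecdat.git              # scp-like ssh form
git remote set-url origin https://github.com/sbgraves237/Ecdat.git      # https form
git remote set-url origin ssh://git@github.com/sbgraves237/Ecdat.git    # ssh URL form: '/' after host
git remote -v
```

Mixing the two ssh spellings, e.g. `ssh://git@github.com:sbgraves237/...`, makes git parse "sbgraves237" as a port, hence the errors quoted in this thread.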

On 7/15/19 1:48 PM, Spencer Graves wrote:
I'm diverging:  Now I get:



git pull

ssh: Could not resolve hostname github.com:sbgraves237: nodename nor servname 
provided, or not known
fatal: Could not read from remote repository.

Please make sure you have the correct access rights
and the repository exists.


   ** With .git/config as follows:


[core]
 repositoryformatversion = 0
 filemode = true
 bare = false
 logallrefupdates = true
 ignorecase = true
 precomposeunicode = true
[remote "origin"]
 url = ssh://git@github.com:sbgraves237/Ecdat.git
 fetch = +refs/heads/*:refs/remotes/origin/*
[branch "master"]
 remote = origin
 merge = refs/heads/master


   I have an SSH key on my GitHub account, which says it was "Added on Jul 3, 
2019 Last used within the last 2 weeks — Read/write".


   Should I delete my current local copies and clone them fresh from GitHub?
   Spencer


On 2019-07-15 12:01, Brian G. Peterson wrote:
it would be:

ssh://git@github.com:sbgraves237/Ecdat.git


On Mon, 2019-07-15 at 11:41 -0500, Spencer Graves wrote:
On 2019-07-15 10:56, Dirk Eddelbuettel wrote:



Don't write passwords down like this. Your error is likely in
expecting _ssh_
authentication over _https_ -- when it works only over ssh. Use the
alternate
form for a remote, e.g. one that looks like git@github.com:emacs-ess/ESS.git
 I'm confused.  I changed that line to:


   url =
https://git@github.com:sbgraves237/sbgraves237/Ecdat


 Then when I did "git pull" I got:


fatal: unable to access
'https://git@github.com:sbgraves237/sbgraves237/Ecdat/': Port number
ended with 's'


 ???
 Thanks,
 Spencer

Hth, Dirk






Re: [Rd] New matrix function

2019-10-11 Thread Spencer Graves




On 2019-10-11 04:45, Duncan Murdoch wrote:

On 11/10/2019 6:44 a.m., Morgan Morgan wrote:

Hi All,

I was looking for a function to find a small matrix inside a larger
matrix in R, similar to the one described in the following link:

https://www.mathworks.com/matlabcentral/answers/194708-index-a-small-matrix-in-a-larger-matrix 



I couldn't find anything.

The above function can be seen as a "generalisation" of the "which"
function as well as the function described in the following post:

https://coolbutuseless.github.io/2018/04/03/finding-a-length-n-needle-in-a-haystack/ 



Would be possible to add such a function to base R?

I am happy to work with someone from the R core team (if you wish) and
suggest an implementation in C.


That seems like it would sometimes be a useful function, and maybe 
someone will point out a package that already contains it.  But if 
not, why would it belong in base R?



  The natural thing could be to add it to another existing package.


   A list of different search tools appears in the Wikiversity 
article on "Searching R packages".[1]  I especially like the "sos" 
package, which includes a vignette, [2] but I also use RDocumentation 
and occasionally Rseek.  Google Advanced Search[3] is also very good;  
I've used that for other things, but not searching for R packages.



  I've had modest luck suggesting additions to other packages if I 
write the function and documentation with good examples that tend to 
ensure quality.  Some maintainers reject my suggestions;  others have 
accepted them.



  Spencer Graves


[1]
https://en.wikiversity.org/wiki/Searching_R_Packages


[2] Caveat:  I wrote both that Wikiversity article and the "sos" 
package, so I'm biased.



[3]
https://www.google.com/advanced_search



Duncan Murdoch



[Rd] default col.names from data.frame not the same as as.data.frame

2019-12-09 Thread Spencer Graves

Hello, All:


  Consider:


> data.frame(matrix(1:2, 1))
  X1 X2
1  1  2
> as.data.frame(matrix(1:2, 1))
  V1 V2
1  1  2


  I ask, because I got different default names running the same 
numbers through BMA:::bic.glm.matrix and BMA:::bic.glm.data.frame, so I 
thought I'd ask this group what names you all prefer for examples like 
these.



  Thanks,
      Spencer Graves



[Rd] "simulate" does not include variability in parameter estimation

2019-12-26 Thread Spencer Graves

Hello, All:


  The default "simulate" method for lm and glm seems to ignore the 
sampling variance of the parameter estimates;  see the trivial lm and 
glm examples below.  Both these examples estimate a mean with formula = 
x~1.  In both cases, the variance of the estimated mean is 1.



        * In the lm example with x0 = c(-1, 1), var(x0) = 2, and 
var(unlist(simulate(lm(x0~1), 1, 1))) is 2.0064.  Shouldn't it be 3 
= var(mean(x0)) + var(x0) = (2/2) + 2?



        * In the glm example with x1=1, 
var(unlist(simulate(glm(x1~1, poisson), 1, 1))) = 1.006. Shouldn't 
it be 2 = var(glm estimate of the mean) + var(simulated Poisson 
distribution) = 1 + 1?



  I'm asking, because I've recently written "simulate" methods for 
objects of class stats::glm and BMA::bic.glm, where my primary interest 
was simulating the predicted mean with "newdata".  I'm doing this, so I 
can get Monte Carlo prediction intervals.  My current code for 
"simulate.glm" and "simulate.bic.glm" are available in the development 
version of the "Ecfun" package on GitHub:



https://github.com/sbgraves237/Ecfun


  Comparing my new code with "stats:::simulate.lm" raises the 
following questions in my mind regarding "simulate" of a fit object:



        1.  Shouldn't "simulate" start by simulating the random 
variability in the estimated parameters?  I need that for my current 
application.  If a generic "simulate" function should NOT include this, 
what should we call something that does include this?  And how does the 
current stats:::simulate.lm behavior fit with this?



    2.  Shouldn't "simulate" of a model fit include an option 
for "newdata"?  I need that for my application.



        3.  By comparing with "predict.glm", I felt I needed an 
argument 'type = c("link", "response")'.  "predict.glm" has an argument 
'type = c("link", "response", "terms")'.  I didn't need "terms", so I 
didn't take the time to code that.  However, a general "simulate" 
function should probably include that.



        4.  My application involves assumed Poisson counts.  I need 
to simulate those as well.  If I combined those with "simulate.glm", 
what would I call them?  I can't use the word "response", because that's 
already used with a different meaning. Might "observations" be the 
appropriate term?



  What do you think?
  Thanks,
  Spencer Graves


> x0 <- c(-1, 1)
> var(x0)
[1] 2
> fit0 <- lm(x0~1)
> vcov(fit0)
    (Intercept)
(Intercept)   1
> sim0 <- simulate(fit0, 1, 1)
> var(unlist(sim0))
[1] 2.006408
> x1 <- 1
> fit1 <- glm(x1~1, poisson)
> coef(fit1)
 (Intercept)
4.676016e-11
> exp(coef(fit1))
(Intercept)
  1
> vcov(fit1)
    (Intercept)
(Intercept)   0.903
> sim1 <- simulate(fit1, 1, 1)
> var(unlist(sim1))
[1] 1.00617
> sessionInfo()
R version 3.6.2 (2019-12-12)
Platform: x86_64-apple-darwin15.6.0 (64-bit)
Running under: macOS Catalina 10.15.2

Matrix products: default
BLAS: 
/System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib
LAPACK: 
/Library/Frameworks/R.framework/Versions/3.6/Resources/lib/libRlapack.dylib


locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8

attached base packages:
[1] stats graphics  grDevices utils datasets  methods base

loaded via a namespace (and not attached):
[1] compiler_3.6.2 tools_3.6.2



Re: [Rd] "simulate" does not include variability in parameter estimation

2019-12-27 Thread Spencer Graves




On 2019-12-27 04:34, Duncan Murdoch wrote:

On 26/12/2019 11:14 p.m., Spencer Graves wrote:

Hello, All:


    The default "simulate" method for lm and glm seems to ignore the
sampling variance of the parameter estimates;  see the trivial lm and
glm examples below.  Both these examples estimate a mean with formula =
x~1.  In both cases, the variance of the estimated mean is 1.


That's how it's documented to operate.  Nothing in the help page 
suggests it would try to simulate parameter values.  Indeed, it 
doesn't have enough information on the distribution to sample from:  
the appropriate distribution to simulate from if you want to include 
uncertainty in the parameter estimates is the posterior distribution, 
but lm and glm take a classical point of view, not a Bayesian point of 
view, so they have no concept of a posterior.



  Thanks for the reply.  What do you suggest for someone who wants 
confidence, prediction and tolerance intervals for newdata for a general 
fit object?



  For a glm object, one could get confidence intervals starting 
with predicted mean and standard errors
from predict(glm(...), newdata, type='link', se.fit=TRUE), then linkinv 
to get the confidence intervals on the scale of expected values of the 
random variables.  From that one could compute tolerance intervals.



  Is there a way to get more standard prediction intervals from a 
glm object, other than the Bayesian approach coded into 
Ecfun:::simulate.glm?  And that still doesn't answer the question re. 
confidence intervals for a more general fit object like BMA::bic.glm.



      Comments?
  Thanks,
  Spencer Graves




          * In the lm example with x0 = c(-1, 1), var(x0) = 2, and
var(unlist(simulate(lm(x0~1), 1, 1))) is 2.0064.  Shouldn't it be 3
= var(mean(x0)) + var(x0) = (2/2) + 2?


That calculation ignores the uncertainty in the estimation of sigma.

Duncan Murdoch




          * In the glm example with x1=1,
var(unlist(simulate(glm(x1~1, poisson), 1, 1))) = 1.006. Shouldn't
it be 2 = var(glm estimate of the mean) + var(simulated Poisson
distribution) = 1 + 1?


    I'm asking, because I've recently written "simulate" methods for
objects of class stats::glm and BMA::bic.glm, where my primary interest
was simulating the predicted mean with "newdata".  I'm doing this, so I
can get Monte Carlo prediction intervals.  My current code for
"simulate.glm" and "simulate.bic.glm" are available in the development
version of the "Ecfun" package on GitHub:


https://github.com/sbgraves237/Ecfun


    Comparing my new code with "stats:::simulate.lm" raises the
following questions in my mind regarding "simulate" of a fit object:


          1.  Shouldn't "simulate" start by simulating the random
variability in the estimated parameters?  I need that for my current
application.  If a generic "simulate" function should NOT include this,
what should we call something that does include this?  And how does the
current stats:::simulate.lm behavior fit with this?


      2.  Shouldn't "simulate" of a model fit include an option
for "newdata"?  I need that for my application.


          3.  By comparing with "predict.glm", I felt I needed an
argument 'type = c("link", "response")'.  "predict.glm" has an argument
'type = c("link", "response", "terms")'.  I didn't need "terms", so I
didn't take the time to code that.  However, a general "simulate"
function should probably include that.


          4.  My application involves assumed Poisson counts.  I 
need

to simulate those as well.  If I combined those with "simulate.glm",
what would I call them?  I can't use the word "response", because that's
already used with a different meaning. Might "observations" be the
appropriate term?


    What do you think?
    Thanks,
    Spencer Graves


  > x0 <- c(-1, 1)
  > var(x0)
[1] 2
  > fit0 <- lm(x0~1)
  > vcov(fit0)
      (Intercept)
(Intercept)   1
  > sim0 <- simulate(fit0, 1, 1)
  > var(unlist(sim0))
[1] 2.006408
  > x1 <- 1
  > fit1 <- glm(x1~1, poisson)
  > coef(fit1)
   (Intercept)
4.676016e-11
  > exp(coef(fit1))
(Intercept)
    1
  > vcov(fit1)
      (Intercept)
(Intercept)   0.903
  > sim1 <- simulate(fit1, 1, 1)
  > var(unlist(sim1))
[1] 1.00617
  > sessionInfo()
R version 3.6.2 (2019-12-12)
Platform: x86_64-apple-darwin15.6.0 (64-bit)
Running under: macOS Catalina 10.15.2

Matrix products: default
BLAS:
/System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/

[Rd] rpois(9, 1e10)

2020-01-19 Thread Spencer Graves

Hello, All:


  Consider:


Browse[2]> set.seed(1)
Browse[2]> rpois(9, 1e10)
[1] NA NA NA NA NA NA NA NA NA
Warning message:
In rpois(9, 1e+10) : NAs produced


  Should this happen?


  I think that for, say, lambda>1e6, rpois should return rnorm(., 
lambda, sqrt(lambda)).



  For my particular Monte Carlo, I have replaced my call to rpois 
with a call to the following:



 rpois. <- function(n, lambda){
 ## As with rpois, "n" is the number of draws, unless length(n) > 1,
 ## in which case length(n) draws are made
  n2 <- if(length(n) > 1) length(n) else n
  lambda <- rep_len(lambda, n2)
 ## use a normal approximation where rpois would overflow to NA;
 ## NB:  the result is then numeric, not integer
  big <- (lambda > 1e6)
  out <- numeric(n2)
  out[big] <- rnorm(sum(big), lambda[big], sqrt(lambda[big]))
  out[!big] <- rpois(sum(!big), lambda[!big])
  out
  }


  Comments?
  Thanks,
  Spencer Graves



Re: [Rd] rpois(9, 1e10)

2020-01-19 Thread Spencer Graves




On 2020-01-19 09:34, Benjamin Tyner wrote:


Hello, All:


    Consider:


Browse[2]> set.seed(1)
Browse[2]> rpois(9, 1e10)
[1] NA NA NA NA NA NA NA NA NA
Warning message:
In rpois(9, 1e+10) : NAs produced


    Should this happen?


    I think that for, say, lambda>1e6, rpois should return rnorm(.,
lambda, sqrt(lambda)).
But need to implement carefully; rpois should always return a 
non-negative integer, whereas rnorm always returns numeric...




  Thanks for the reply.


  However, I think it's not acceptable to get an NA from a number 
that cannot be expressed as an integer.  Whenever a randomly generated 
number would exceed .Machine$integer.max, the choice is between 
returning NA or a non-integer numeric.  Consider:



> 2*.Machine$integer.max
[1] 4294967294
> as.integer(2*.Machine$integer.max)
[1] NA
Warning message:
NAs introduced by coercion to integer range


  I'd rather have the non-integer numeric.


  Spencer



Re: [Rd] rpois(9, 1e10)

2020-01-19 Thread Spencer Graves
   This issue arose for me in simulations to estimate confidence, 
prediction, and tolerance intervals from glm(., family=poisson) fits 
embedded in a BMA::bic.glm fit using a simulate.bic.glm function I added 
to the development version of Ecfun, available at 
"https://github.com/sbgraves237/Ecfun".  This is part of a vignette I'm 
developing, available at 
"https://github.com/sbgraves237/Ecfun/blob/master/vignettes/time2nextNuclearWeaponState.Rmd";.
 
This includes a simulated mean of a mixture of Poissons that exceeds 
2e22.  It doesn't seem unreasonable to me to have rpois output a 
numerics rather than integers when a number simulated exceeds 
.Machine$integer.max.  And it does seem to make less sense in such cases 
to return NAs.


    Alternatively, might it make sense to add another argument to 
rpois to give the user the choice?  E.g., an argument "bigOutput" with 
(I hope) default = "numeric" and "NA" as a second option.  Or NA could 
remain the default, so no code that relied on that feature of the 
current code would be broken by the change.  If someone wanted to use 
arbitrary-precision arithmetic, they could write their own version of 
this function with "arbitraryPrecision" as an optional value for the 
"bigOutput" argument.


   Comments?
   Thanks,
   Spencer Graves


On 2020-01-19 10:28, Avraham Adler wrote:
> Technically, lambda can always be numeric. It is the observations 
> which must be integral.
>
> Would hitting everything larger than maxint or maxlonglong with floor 
> or round fundamentally change the distribution? Well, yes, but enough 
> that it would matter over process risk?
>
> Avi
>
> On Sun, Jan 19, 2020 at 11:20 AM Benjamin Tyner  <mailto:bty...@gmail.com>> wrote:
>
> So imagine rpois is changed, such that the storage mode of its return
> value is sometimes integer and sometimes numeric. Then imagine the
> case
> where lambda is itself a realization of a random variable. Do we
> really
> want the storage mode to inherit that randomness?
>
>
> On 1/19/20 10:47 AM, Avraham Adler wrote:
> > Maybe there should be code for 64 bit R to use long long or the
> like?
> >
> > On Sun, Jan 19, 2020 at 10:45 AM Spencer Graves
> >  <mailto:spencer.gra...@prodsyse.com>
> <mailto:spencer.gra...@prodsyse.com
> <mailto:spencer.gra...@prodsyse.com>>> wrote:
> >
> >
> >
> >     On 2020-01-19 09:34, Benjamin Tyner wrote:
> >     >>
> >
>  
> >     >> Hello, All:
> >     >>
> >     >>
> >     >>     Consider:
> >     >>
> >     >>
> >     >> Browse[2]> set.seed(1)
> >     >> Browse[2]> rpois(9, 1e10)
> >     >> NAs produced[1] NA NA NA NA NA NA NA NA NA
> >     >>
> >     >>
> >     >>     Should this happen?
> >     >>
> >     >>
> >     >>     I think that for, say, lambda>1e6, rpois should
> return
> >     rnorm(.,
> >     >> lambda, sqrt(lambda)).
> >     > But need to implement carefully; rpois should always return a
> >     > non-negative integer, whereas rnorm always returns numeric...
> >     >
> >
> >        Thanks for the reply.
> >
> >
> >        However, I think it's not acceptable to get an NA from a
> >     number
> >     that cannot be expressed as an integer.  Whenever a randomly
> >     generated
> >     number would exceed .Machine$integer.max, the choice is between
> >     returning NA or a non-integer numeric.  Consider:
> >
> >
> >      > 2*.Machine$integer.max
> >     [1] 4294967294
> >      > as.integer(2*.Machine$integer.max)
> >     [1] NA
> >     Warning message:
> >     NAs introduced by coercion to integer range
> >
> >
> >        I'd rather have the non-integer numeric.
> >
> >
> >        Spencer
> >
> >
> > --
> > Sent from Gmail Mobile
>
> -- 
> Sent from Gmail Mobile





Re: [Rd] rpois(9, 1e10)

2020-01-19 Thread Spencer Graves



On 2020-01-19 13:01, Avraham Adler wrote:
> Crazy thought, but being that a sum of Poissons is Poisson in the sum, 
> can you break your “big” simulation into the sum of a few smaller 
> ones? Or is the order of magnitude difference just too great?


   I don't perceive that as feasible.  Once I found what was 
generating NAs, it was easy to code a function to return pseudo-random 
numbers using the standard normal approximation to the Poisson for those 
extreme cases.  [For a Poisson with mean = 1e6, for example, the 
skewness (third standardized moment) is 0.001.  At least for my 
purposes, that should be adequate.][1]


   What are the negative consequences of having rpois return 
numerics that are always nonnegative?


   Spencer


[1]  In the code I reported before, I just changed the threshold of 1e6 
to 0.5*.Machine$integer.max.  On my Mac, .Machine$integer.max = 
2147483647 = 2^31 - 1 > 1e9.  That still means that a Poisson-distributed 
pseudo-random number with mean just under that threshold would have to 
be over 32,000 standard deviations (roughly sqrt(lambda)) above the mean 
to exceed .Machine$integer.max.
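
These figures can be checked directly (a quick sketch; for a Poisson the skewness is 1/sqrt(lambda) and the standard deviation is sqrt(lambda)):

```r
1 / sqrt(1e6)                      # Poisson skewness at lambda = 1e6: 0.001
lam <- 0.5 * .Machine$integer.max  # the threshold described above
## z-score a draw with mean lam would need to exceed .Machine$integer.max:
(.Machine$integer.max - lam) / sqrt(lam)   # about 32768, i.e. sqrt(lam)
```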

>
> On Sun, Jan 19, 2020 at 1:58 PM Spencer Graves 
> mailto:spencer.gra...@prodsyse.com>> wrote:
>
>   This issue arose for me in simulations to estimate
> confidence, prediction, and tolerance intervals from glm(.,
> family=poisson) fits embedded in a BMA::bic.glm fit using a
> simulate.bic.glm function I added to the development version of
> Ecfun, available at "https://github.com/sbgraves237/Ecfun"
> <https://github.com/sbgraves237/Ecfun>. This is part of a vignette
> I'm developing, available at
> 
> "https://github.com/sbgraves237/Ecfun/blob/master/vignettes/time2nextNuclearWeaponState.Rmd"
> 
> <https://github.com/sbgraves237/Ecfun/blob/master/vignettes/time2nextNuclearWeaponState.Rmd>.
> This includes a simulated mean of a mixture of Poissons that
> exceeds 2e22.  It doesn't seem unreasonable to me to have rpois
> output a numerics rather than integers when a number simulated
> exceeds .Machine$integer.max.  And it does seem to make less sense
> in such cases to return NAs.
>
>
>    Alternatively, might it make sense to add another argument
> to rpois to give the user the choice?  E.g., an argument
> "bigOutput" with (I hope) default = "numeric" and "NA" as a second
> option.  Or NA is the default, so no code that relied that feature
> of the current code would be broken by the change.  If someone
> wanted to use arbitrary precision arithmetic, they could write
> their own version of this function with "arbitraryPrecision" as an
> optional value for the "bigOutput" argument.
>
>
>   Comments?
>   Thanks,
>   Spencer Graves
>
>
>
> On 2020-01-19 10:28, Avraham Adler wrote:
>> Technically, lambda can always be numeric. It is the observations
>> which must be integral.
>>
>> Would hitting everything larger than maxint or maxlonglong with
>> floor or round fundamentally change the distribution? Well, yes,
>> but enough that it would matter over process risk?
>>
>> Avi
>>
>> On Sun, Jan 19, 2020 at 11:20 AM Benjamin Tyner > <mailto:bty...@gmail.com>> wrote:
>>
>> So imagine rpois is changed, such that the storage mode of
>> its return
>> value is sometimes integer and sometimes numeric. Then
>> imagine the case
>> where lambda is itself a realization of a random variable. Do
>> we really
>> want the storage mode to inherit that randomness?
>>
>>
>> On 1/19/20 10:47 AM, Avraham Adler wrote:
>> > Maybe there should be code for 64 bit R to use long long or
>> the like?
>> >
>> > On Sun, Jan 19, 2020 at 10:45 AM Spencer Graves
>> > > <mailto:spencer.gra...@prodsyse.com>
>> <mailto:spencer.gra...@prodsyse.com
>> <mailto:spencer.gra...@prodsyse.com>>> wrote:
>> >
>> >
>> >
>> >     On 2020-01-19 09:34, Benjamin Tyner wrote:
>> >     >>
>> >
>>  
>> 
>> >     >> Hello, All:
>> >     >>
>> >     >>
>> >     >>     Consider:
>> >     >>
>> >     >>
>> 

Re: [Rd] rpois(9, 1e10)

2020-01-19 Thread Spencer Graves
On my Mac:


str(.Machine)
...
 $ integer.max       : int 2147483647
 $ sizeof.long       : int 8
 $ sizeof.longlong   : int 8
 $ sizeof.longdouble : int 16
 $ sizeof.pointer    : int 8


   On a Windows 10 machine I have, $ sizeof.long : int 4; otherwise 
the same as on my Mac.


   Am I correct that $ sizeof.long = 4 means 4 bytes = 32 bits? 
log2(.Machine$integer.max) = 31.  Then 8 bytes is what used to be called 
double precision (2 words of 4 bytes each)?  And $ sizeof.longdouble = 
16 = 4 words of 4 bytes each?


   Spencer


On 2020-01-19 15:41, Avraham Adler wrote:
> Floor (maybe round) of non-negative numerics, though. Poisson should 
> never have anything after decimal.
>
> Still think it’s worth allowing long long for R64 bit, just for purity 
> sake.
>
> Avi
>
> On Sun, Jan 19, 2020 at 4:38 PM Spencer Graves 
> mailto:spencer.gra...@prodsyse.com>> wrote:
>
>
>
> On 2020-01-19 13:01, Avraham Adler wrote:
>> Crazy thought, but being that a sum of Poissons is Poisson in the
>> sum, can you break your “big” simulation into the sum of a few
>> smaller ones? Or is the order of magnitude difference just too great?
>
>
>   I don't perceive that as feasible.  Once I found what was
> generating NAs, it was easy to code a function to return
> pseudo-random numbers using the standard normal approximation to
> the Poisson for those extreme cases.  [For a Poisson with mean =
> 1e6, for example, the skewness (third standardized moment) is
> 0.001.  At least for my purposes, that should be adequate.][1]
>
>
>   What are the negative consequences of having rpois return
> numerics that are always nonnegative?
>
>
>   Spencer
>
>
> [1]  In the code I reported before, I just changed the threshold
> of 1e6 to 0.5*.Machine$integer.max.  On my Mac,
> .Machine$integer.max = 2147483647 = 2^31 > 1e9. That still means
> that a Poisson distributed pseudo-random number just under that
> would have to be over 23000 standard deviations above the mean to
> exceed .Machine$integer.max.
>
>>
>> On Sun, Jan 19, 2020 at 1:58 PM Spencer Graves
>> > <mailto:spencer.gra...@prodsyse.com>> wrote:
>>
>>   This issue arose for me in simulations to estimate
>> confidence, prediction, and tolerance intervals from glm(.,
>> family=poisson) fits embedded in a BMA::bic.glm fit using a
>> simulate.bic.glm function I added to the development version
>> of Ecfun, available at "https://github.com/sbgraves237/Ecfun"
>> <https://github.com/sbgraves237/Ecfun>. This is part of a
>> vignette I'm developing, available at
>> 
>> "https://github.com/sbgraves237/Ecfun/blob/master/vignettes/time2nextNuclearWeaponState.Rmd"
>> 
>> <https://github.com/sbgraves237/Ecfun/blob/master/vignettes/time2nextNuclearWeaponState.Rmd>.
>> This includes a simulated mean of a mixture of Poissons that
>> exceeds 2e22.  It doesn't seem unreasonable to me to have
>> rpois output a numerics rather than integers when a number
>> simulated exceeds .Machine$integer.max.  And it does seem to
>> make less sense in such cases to return NAs.
>>
>>
>>    Alternatively, might it make sense to add another
>> argument to rpois to give the user the choice?  E.g., an
>> argument "bigOutput" with (I hope) default = "numeric" and
>> "NA" as a second option.  Or NA is the default, so no code
>>     that relied that feature of the current code would be broken
>> by the change.  If someone wanted to use arbitrary precision
>> arithmetic, they could write their own version of this
>> function with "arbitraryPrecision" as an optional value for
>> the "bigOutput" argument.
>>
>>
>>   Comments?
>>   Thanks,
>>   Spencer Graves
>>
>>
>>
>> On 2020-01-19 10:28, Avraham Adler wrote:
>>> Technically, lambda can always be numeric. It is the
>>> observations which must be integral.
>>>
>>> Would hitting everything larger than maxint or maxlonglong
>>> with floor or round fundamentally change the distribution?
>>> Well, yes, but enough that it would matter over process risk?
>>>
>>> Avi
>>>
>>> On Sun, Jan 19, 2020 at

Re: [Rd] [External] Re: rpois(9, 1e10)

2020-01-19 Thread Spencer Graves
Thanks to Luke and Avi for their comments.  I wrapped "round" around the 
call to "rnorm" inside my "rpois.".  For "lambda" really big, that 
"round" won't do anything.  However, it appears to give integers in 
floating point representation that are larger than 
.Machine$integer.max.  That sounds very much like what someone would 
want.  Spencer
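

The effect of that rounding can be seen in isolation (a small sketch; `rpois.` itself is Spencer's local function and is not shown here):

```r
set.seed(1)
x <- round(rnorm(5, 1e10, sqrt(1e10)))  # the normal approximation, rounded
typeof(x)                      # "double": numeric storage, not integer
all(x == floor(x))             # TRUE: integer-valued nonetheless
all(x > .Machine$integer.max)  # TRUE: beyond the 32-bit integer range
```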



On 2020-01-19 21:00, Tierney, Luke wrote:

R uses the C 'int' type for its integer data, and that is pretty much
universally 32-bit these days. In fact R won't compile if it is not.
That means the range for integer data is the integers in [-2^31,
+2^31).

It would be good to allow for a larger integer range for R integer
objects, and several of us are thinking about how we might get there.
But it isn't easy to get right, so it may take some time. I doubt
anything can happen for R 4.0.0 this year, but 2021 may be possible.

A few notes inline below:

On Sun, 19 Jan 2020, Spencer Graves wrote:


On my Mac:


str(.Machine)
...
$ integer.max  : int 2147483647
  $ sizeof.long  : int 8
  $ sizeof.longlong  : int 8
  $ sizeof.longdouble    : int 16
  $ sizeof.pointer   : int 8


   On a Windows 10 machine I have, $ sizeof.long : int 4; otherwise
the same as on my Mac.

One of many annoyances of Windows -- done for compatibility with
ancient Windows apps.


   Am I correct that $ sizeof.long = 4 means 4 bytes = 32 bits?
log2(.Machine$integer.max) = 31.  Then 8 bytes is what used to be called
double precision (2 words of 4 bytes each)?  And $ sizeof.longdouble =
16 = 4 words of 4 bytes each?

double precision is a floating point concept, not related to integers.

If you want to figure out whether you are running a 32-bit or 64-bit R,
look at sizeof.pointer -- 4 means 32 bits, 8 means 64 bits.
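
The points above can be illustrated from within R (a small sketch):

```r
.Machine$sizeof.long * 8        # bits in a C 'long': 32 on Windows, 64 on macOS/Linux
log2(.Machine$integer.max + 1)  # 31: R integers are 32-bit, two's complement
## Luke's rule of thumb for the build:
c("32-bit", "64-bit")[1 + (.Machine$sizeof.pointer == 8)]
```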

Best,

luke




   Spencer


On 2020-01-19 15:41, Avraham Adler wrote:

Floor (maybe round) of non-negative numerics, though. Poisson should
never have anything after decimal.

Still think it’s worth allowing long long for R64 bit, just for purity
sake.

Avi

On Sun, Jan 19, 2020 at 4:38 PM Spencer Graves
mailto:spencer.gra...@prodsyse.com>> wrote:



 On 2020-01-19 13:01, Avraham Adler wrote:

 Crazy thought, but being that a sum of Poissons is Poisson in the
 sum, can you break your “big” simulation into the sum of a few
 smaller ones? Or is the order of magnitude difference just too great?


   I don't perceive that as feasible.  Once I found what was
 generating NAs, it was easy to code a function to return
 pseudo-random numbers using the standard normal approximation to
 the Poisson for those extreme cases.  [For a Poisson with mean =
 1e6, for example, the skewness (third standardized moment) is
 0.001.  At least for my purposes, that should be adequate.][1]


   What are the negative consequences of having rpois return
 numerics that are always nonnegative?


   Spencer


 [1]  In the code I reported before, I just changed the threshold
 of 1e6 to 0.5*.Machine$integer.max.  On my Mac,
 .Machine$integer.max = 2147483647 = 2^31 > 1e9. That still means
 that a Poisson distributed pseudo-random number just under that
 would have to be over 23000 standard deviations above the mean to
 exceed .Machine$integer.max.


 On Sun, Jan 19, 2020 at 1:58 PM Spencer Graves
 mailto:spencer.gra...@prodsyse.com>> wrote:

   This issue arose for me in simulations to estimate
 confidence, prediction, and tolerance intervals from glm(.,
 family=poisson) fits embedded in a BMA::bic.glm fit using a
 simulate.bic.glm function I added to the development version
 of Ecfun, available at "https://github.com/sbgraves237/Ecfun"
 <https://github.com/sbgraves237/Ecfun>. This is part of a
 vignette I'm developing, available at
 
"https://github.com/sbgraves237/Ecfun/blob/master/vignettes/time2nextNuclearWeaponState.Rmd"
 
<https://github.com/sbgraves237/Ecfun/blob/master/vignettes/time2nextNuclearWeaponState.Rmd>.
 This includes a simulated mean of a mixture of Poissons that
 exceeds 2e22.  It doesn't seem unreasonable to me to have
 rpois output a numerics rather than integers when a number
 simulated exceeds .Machine$integer.max.  And it does seem to
 make less sense in such cases to return NAs.


    Alternatively, might it make sense to add another
 argument to rpois to give the user the choice?  E.g., an
 argument "bigOutput" with (I hope) default = "numeric" and
 "NA" as a second option.  Or NA is the default, so no code
 that relied that feature of the current code would be broken
 by the change.  If someone wanted to use arbitrary precision

Re: [Rd] [External] Re: rpois(9, 1e10)

2020-01-22 Thread Spencer Graves




On 2020-01-22 02:54, Martin Maechler wrote:

Martin Maechler
 on Tue, 21 Jan 2020 09:25:19 +0100 writes:
Ben Bolker
 on Mon, 20 Jan 2020 12:54:52 -0500 writes:

 >> Ugh, sounds like competing priorities.

 > indeed.

 >> * maintain type consistency
 >> * minimize storage (= current version, since 3.0.0)
 >> * maximize utility for large lambda (= proposed change)
 >> * keep user interface, and code, simple (e.g., it would be easy enough
 >> to add a switch that provided user control of int vs double return value)
 >> * backward compatibility

 > Last night, it came to my mind that we should do what we have
 > been doing in quite a few places in R, the last couple of years:

 > Return integer when possible, and switch to return double when
 > integers don't fit.

 > We've been doing so even for  1:N  (well, now with additional ALTREP wrapper),
 > seq(), and even the fundamental  length()  function.

 > So I sat down and implemented it .. and it seemed to work
 > perfectly:  Returning the same random numbers as now, but
 > switching to use double (instead of returning NAs) when the
 > values are too large.

 > I'll probably commit that to R-devel quite soonish.
 > Martin
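
A hedged R-level sketch of the rule Martin describes (the actual change is in R's C internals; this function is only an illustrative analogue with an assumed name):

```r
## Keep integer storage when every value fits; otherwise promote the
## whole vector to double rather than producing NAs.
asIntegerIfPossible <- function(x) {
  if (all(abs(x) <= .Machine$integer.max)) as.integer(x) else x
}
typeof(asIntegerIfPossible(c(1, 2, 3)))     # "integer"
typeof(asIntegerIfPossible(c(1, 2, 3e10)))  # "double"
```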

Committed in svn rev 77690; this is really very advantageous, as
in some cases / applications, or even just limit cases, you'd
easily get into overflow situations.

The new R 4.0.0 behavior is IMO  "the best of" being memory
efficient (integer storage) in most cases (back compatible to R 3.x.x) and
returning desired random numbers in large cases (compatible to R <= 2.x.x).

Martin



Wunderbar!  Sehr gut gemacht!  ("Wonderful!  Very well done!") Thanks, 
Spencer


 >> On 2020-01-20 12:33 p.m., Martin Maechler wrote:
  Benjamin Tyner
  on Mon, 20 Jan 2020 08:10:49 -0500 writes:
 >>>
 >>> > On 1/20/20 4:26 AM, Martin Maechler wrote:
 >>> >> Coming late here -- after enjoying a proper weekend ;-) --
 >>> >> I have been agreeing (with Spencer, IIUC) on this for a long
 >>> >> time (~ 3 yrs, or more?), namely that I've come to see it as a
 >>> >> "design bug" that  rpois() {and similar} must return typeof() "integer".
 >>> >>
 >>> >> More strongly, I'm actually pretty convinced they should return
 >>> >> (integer-valued) double instead of NA_integer_   and for that
 >>> >> reason should always return double:
 >>> >> Even if we have (hopefully) a native 64bit integer in R,
 >>> >> 2^64 is still teeny tiny compared .Machine$double.max
 >>> >>
 >>> >> (and then maybe we'd have .Machine$longdouble.max  which would
 >>> >> be considerably larger than double.max unless on Windows, where
 >>> >> the wise men at Microsoft decided to keep their workload simple
 >>> >> by defining "long double := double" - as 'long double'
 >>> >> unfortunately is not well defined by C standards)
 >>> >>
 >>> >> Martin
 >>> >>
 >>> > Martin if you are in favor, then certainly no objection from me! ;-)
 >>>
 >>> > So now what about other discrete distributions e.g. could a similar
 >>> > enhancement apply here?
 >>>
 >>>
 >>> >> rgeom(10L, 1e-10)
 >>> >  [1] NA 1503061294 NA NA 1122447583 NA
 >>> >  [7] NA NA NA NA
 >>> > Warning message:
 >>> > In rgeom(10L, 1e-10) : NAs produced
 >>>
 >>> yes, of course there are several such distributions.
 >>>
 >>> It's really something that should be discussed (possibly not
 >>> here, .. but then I've started it here ...).
 >>>
 >>> The  NEWS  for R 3.0.0 contain (in NEW FEATURES) :
 >>>
 >>> * Functions rbinom(), rgeom(), rhyper(), rpois(), rnbinom(),
 >>> rsignrank() and rwilcox() now return integer (not double)
 >>> vectors.  This halves the storage requirements for large
 >>> simulations.
 >>>
 >>> and what I've been suggesting is to revert this change
 >>> (svn rev r60225-6) which was purposefully and diligently done by
 >>> a fellow R core member, so indeed must be debatable.
 >>>
 >>> Martin
 >>>



[Rd] matplot.Date & matplot.POSIXct

2020-01-24 Thread Spencer Graves



Hello, All:


  Roughly a decade ago, I added "matplot.Date" and 
"matplot.POSIXct" to the "fda" package, so we could get reasonable 
labeling of the horizontal axis when "x" was class "Date" or "POSIXct".  
I also added a local version of "matplot.default" that just changes the 
defaults for "xlab" and "ylab".



  Would anyone care to comment on this?


  In particular, might there be any interest among the R Core Team 
of adding "matplot.Date" and "matplot.POSIXct" to the "graphics" package?



  Secondarily, might anyone have any thoughts about the defaults 
for "xlab" and "ylab" in "graphics::matplot"?



  I ask because Jim Ramsay, Giles Hooker, and I are preparing a 
new release of "fda", and Jim asked me if we needed to have "matplot" 
masking "graphics::matplot".  Rather than answer that question, I 
thought I would ask a larger question of this group.



  Thanks,
  Spencer Graves



Re: [Rd] matplot.Date & matplot.POSIXct

2020-01-27 Thread Spencer Graves

  Thanks for the reply.


On 2020-01-27 19:56, Abby Spurdle wrote:

Maybe I'm missing something really obvious here, but I was unable to
create a matrix out of POSIXct object(s).
Perhaps that deserves a separate discussion...?



  Can you provide an example?


  The standard matplot application that concerns me is with 
matplot(x, y, ...) where x has class Date or POSIXct and y is a matrix.  
The "fda" package on CRAN includes a "matplot" help page with examples 
that worked when I tested them recently.



  If you have an example that you think should work but doesn't I'd 
like to know.  Maybe it should be added to the examples in 
fda::matplot.Rd file, then the code should be modified until it works.


Regarding your other comments/questions:
(1) You should *NOT* mask functions from the graphics package (or
base, stats, etc), except possibly for personal use.
(2) The xlab and ylab are fine.



      In most situations, I agree with your comment that, "You should 
*NOT* mask functions from the graphics package (or base, stats, etc)".



  However, when the behavior of the function in graphics, base, or 
stats seems patently inappropriate and not adequately considered, then I 
think that someone should mask the function in the core distribution 
with one whose behavior seems more consistent with what most users would 
most likely want.



  Ten or twelve years ago, I concluded that the behavior of 
graphics::matplot(x, y, ...) was inappropriate when x is either of class 
Date or POSIXct.  Specifically, it labeled the horizontal axis the same 
as graphics::matplot(as.numeric(x), y, ...).  I think it should instead 
be labeled the same as graphics::plot(x, y[,1], ...) in such cases.  To 
fix this problem, I made fda::matplot generic; graphics::matplot is not 
generic.  And a coded methods for x of class numeric, matrix, Date and 
POSIXct plus a default.  Each calls either graphics::matplot or matlines 
as appropriate after first setting up the horizontal axis properly if x 
is of class Date or POSIXct.
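
A minimal skeleton of that design (hypothetical names; the actual implementation is fda::matplot and its methods, which also handle POSIXct and set xlab/ylab defaults):

```r
## S3 generic whose Date method sets up a date-aware axis via plot()
## (which dispatches to plot.Date), then overlays the columns of y
## with matlines().
matplotG <- function(x, ...) UseMethod("matplotG")
matplotG.default <- function(x, y, ...) graphics::matplot(x, y, ...)
matplotG.Date <- function(x, y, type = "p", ...) {
  plot(range(x), range(y, na.rm = TRUE), type = "n",
       xlab = "", ylab = "")  # plot.Date labels the x-axis as Dates
  graphics::matlines(x, y, type = type, ...)
}
```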



  For specific examples, consider the following taken from 
fda::matplot.Rd:



invasion1 <- as.Date('1775-09-04')
invasion2 <- as.Date('1812-07-12')
earlyUS.Canada <- c(invasion1, invasion2)
Y <- matrix(1:4, 2, 2)
graphics::matplot(earlyUS.Canada, Y)
# horizontal axis labeled per as.numeric(earlyUS.Canada),
# NOT as Dates
fda::matplot(earlyUS.Canada, Y)
# problem fixed.


# POSIXct
AmRev.ct <- as.POSIXct1970(c('1776-07-04', '1789-04-30'))
graphics::matplot(AmRev.ct, Y)
# horizontal axis labeled per as.numeric(AmRev.ct),
# NOT as POSIXct
fda::matplot(AmRev.ct, Y)
# problem fixed.


  Comments?
  Thanks again for the reply.
  Spencer Graves


B.




Re: [Rd] matplot.Date & matplot.POSIXct

2020-01-28 Thread Spencer Graves




On 2020-01-28 05:13, Martin Maechler wrote:

Spencer Graves
 on Mon, 27 Jan 2020 23:02:28 -0600 writes:





Still, as Abby mentioned, turning a simple function into the
default method of an S3 generic is easy to do, but comes with a
bit of cost: not just S3 dispatch, which typically is negligible in
graphics, but a bit of maintenance cost and, mostly in this case,
the cost of breaking back compatibility by the improvement.
How many plots will change where people have already relied on
the current   as.numeric(x)   behavior?
If we'd change this in R's graphics, it will be
- me and/or the CRAN team who have to contact CRAN package
   maintainer about problems
   (maybe none, as the change may not break any checks)

- Users of matplot() {& matlines() & matpoints()}  who may have to
   adapt their calls to these functions {I'm pretty sure all
   three would have to change for consistency}.

- and then, there are quite a few other changes,  bug
   assignments to which I have committed which should be
   dealt with rather before this.

If you'd turn this into a proper "wishlist"  "bug" report
on R's bugzilla, *and* you or another volunteer provided a patch
to the R sources (including changes to man/*.Rd, NAMESPACE, ..)
which then can be tested to pass 'make check-all',
then I'd definitely commit to this
(possibly too late for R 4.0.0;  teaching starts here soon, etc).



  1.  What do you suggest I do to get acceptable copies of 
~man/matplot.Rd and ~R/matplot.R -- and preferably the entire "graphics" 
package, so I can do R CMD build, check, etc., as I've done for 15 years 
or so with other R packages?



  2.  Then you'd like me to revise matplot.Rd to include 
appropriate examples that work fine with fda::matplot but malfunction 
with graphics::matplot, then revise matplot.R so it fixes the 
problem?  And you want a fix that does NOT convert "matplot" to generic, 
and retains the current "as.numeric(x)" step except when inherits(x, 
"Date") or inherits(x, "POSIXct")?



  3.  Then you want me to submit a "wishlist" "bug" report to 
"https://bugs.r-project.org/bugzilla/index.cgi" including all changes to 
matplot.Rd and matplot.R?  If I don't convert "matplot" to generic, then 
there should be no need for changes to NAMESPACE, correct?



  An answer to question "1" with "yes" to questions "2" and "3" 
should get me started.



  Thanks,
  Spencer Graves


Best,
Martin

 >   Thanks again for the reply.
 >   Spencer Graves
 >>
 >> B.




[Rd] ":::" operator doesn't work with data object Ecdat:::Crime

2020-03-16 Thread Spencer Graves
  The ":::" operator doesn't work for me with "Ecdat:::Crime" on 
either macOS 10.15.3 or Windows 10.



  A different but related issue is that "plm::Crime" says "Error: 
'Crime' is not an exported object from 'namespace:plm'", even though 
"library(plm); data(Crime); Crime" works.  I would naively think a user 
should be able to compare "Crime" objects documented in different 
packages using the "::" and ":::" operators, even if a package 
maintainer chooses not to "export" data objects.



  What do you think?


  Thanks,
  Spencer Graves
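
For background, datasets in a package that uses LazyData live in the package's lazy-data database rather than in its namespace; `::` falls back to that database (which is why Ecdat::Crime works), but `:::` only looks in the namespace (which is why Ecdat:::Crime fails). A hedged sketch of a programmatic accessor that works either way (the function name is an assumption for illustration):

```r
## Fetch a dataset from a package via data(), sidestepping whether it
## is exported, lazy-loaded, or shipped as an .rda loaded on demand.
getDataset <- function(name, pkg) {
  e <- new.env()
  data(list = name, package = pkg, envir = e)
  get(name, envir = e)
}
## e.g. dim(getDataset("Crime", "Ecdat"))  # 630 24, matching Ecdat::Crime
```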


*** The following is from my Mac;  I could give you the comparable 
results from Windows 10 if you want it.



> dim(Ecdat::Crime)
[1] 630  24
> Ecdat:::Crime
Error in get(name, envir = asNamespace(pkg), inherits = FALSE) :
  object 'Crime' not found
> sessionInfo()
R version 3.6.3 (2020-02-29)
Platform: x86_64-apple-darwin15.6.0 (64-bit)
Running under: macOS Catalina 10.15.3

Matrix products: default
BLAS: 
/System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib
LAPACK: 
/Library/Frameworks/R.framework/Versions/3.6/Resources/lib/libRlapack.dylib


locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8

attached base packages:
[1] stats graphics  grDevices utils datasets
[6] methods   base

loaded via a namespace (and not attached):
 [1] Rcpp_1.0.3 lattice_0.20-40
 [3] mvtnorm_1.1-0  BMA_3.18.12
 [5] Ecdat_0.3-7    rrcov_1.5-2
 [7] MASS_7.3-51.5  leaps_3.1
 [9] grid_3.6.3 pcaPP_1.9-73
[11] stats4_3.6.3   TeachingDemos_2.10
[13] Ecfun_0.2-4    robustbase_0.93-5
[15] xml2_1.2.5 Matrix_1.2-18
[17] splines_3.6.3  tools_3.6.3
[19] DEoptimR_1.0-8 jpeg_0.1-8.1
[21] survival_3.1-11    compiler_3.6.3
[23] inline_0.3.15  fda_2.4.8.1


[1] The six "Crime" objects I found were in the following packages: 
Ecdat,  BSDA, plm, mosaicModel, statisticalModeling, and gpk.




[Rd] status of Java & rJava?

2020-03-28 Thread Spencer Graves

Hello, All:


  Is Java being deprecated for R?


  I ask, because I've been unable to get rJava 0.9-11 to work under 
either macOS 10.15 or Windows 10, and I can't get rJava 0.9-12 to 
install -- and my Ecfun package uses it:   I can't get "R CMD build 
Ecfun" to work on my Mac nor "R CMD check Ecfun_0.2-4" under Windows.  
Travis CI builds "https://github.com/sbgraves237/Ecfun" just fine.



  The rJava maintainer, Simon Urbanek, has kindly responded to two 
of my three emails on this since 2020-03-20, but I've so far been unable 
to translate his suggestions into fixes for these problems.



  Should I remove rJava from Ecfun and see what breaks, then see if 
I can work around that?  Should I provide the error messages I get for 
rJava from "update.packages()" and / or library(rJava) on both machines, 
with sessionInfo() to this list or to Stack Exchange or Stack Overflow?



  Since I'm getting so many problems with rJava on under both macOS 
and Windows 10, that suggests to me that potential users could have 
similar problems, and I should try to remove rJava from Ecfun.



  What do you think?
  Thanks,
  Spencer Graves
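
One middle road, rather than removing rJava outright (a sketch of common practice, not specific to Ecfun): list rJava under Suggests and test for a working JVM at run time, so the package installs and runs even where Java is broken. The function names here are illustrative assumptions.

```r
## Returns TRUE only if rJava is installed AND a JVM can be initialized.
haveJava <- function() {
  requireNamespace("rJava", quietly = TRUE) &&
    tryCatch({ rJava::.jinit(); TRUE }, error = function(e) FALSE)
}
myFeature <- function() {
  if (haveJava()) {
    ## Java-backed implementation would go here
  } else {
    message("rJava/JVM unavailable; using a pure-R fallback")
    ## pure-R fallback would go here
  }
}
```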



Re: [Rd] [BULK] Re: status of Java & rJava?

2020-03-28 Thread Spencer Graves

Hi, Simon et al.:


  My attempts to install rJava 0.9-12 from source failed under both 
macOS 10.15.4 and Windows 10.



  Below please find what I got just now trying "update.packages()" 
and selecting "install from sources" on both computers followed by 
"sessionInfo()" in each case.



  Thanks for your help.
  Spencer Graves


 update.packages()
rJava :
 Version 0.9-11 installed in 
/Library/Frameworks/R.framework/Versions/3.6/Resources/library

 Version 0.9-12 available at https://cran.rstudio.com
Update? (Yes/no/cancel) y
sf :
 Version 0.8-1 installed in 
/Library/Frameworks/R.framework/Versions/3.6/Resources/library

 Version 0.9-0 available at https://cran.rstudio.com
Update? (Yes/no/cancel) n
XLConnect :
 Version 0.2-15 installed in 
/Library/Frameworks/R.framework/Versions/3.6/Resources/library

 Version 1.0.1 available at https://cran.rstudio.com
Update? (Yes/no/cancel) n

  There is a binary version available but the
  source version is later:
  binary source needs_compilation
rJava 0.9-11 0.9-12  TRUE

Do you want to install from sources the package which needs compilation? 
(Yes/no/cancel) y

installing the source package ‘rJava’

trying URL 'https://cran.rstudio.com/src/contrib/rJava_0.9-12.tar.gz'
Content type 'application/x-gzip' length 1103629 bytes (1.1 MB)
==
downloaded 1.1 MB

* installing *source* package ‘rJava’ ...
** package ‘rJava’ successfully unpacked and MD5 sums checked
** using staged installation
checking for gcc... clang
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables...
checking whether we are cross compiling... configure: error: in 
`/private/var/folders/mh/mrm_14nx19g13lsnj9zmvwjrgn/T/Rtmpm3rVc5/R.INSTALL5d09696e941d/rJava':

configure: error: cannot run C compiled programs.
If you meant to cross compile, use `--host'.
See `config.log' for more details
ERROR: configuration failed for package ‘rJava’
* removing 
‘/Library/Frameworks/R.framework/Versions/3.6/Resources/library/rJava’
* restoring previous 
‘/Library/Frameworks/R.framework/Versions/3.6/Resources/library/rJava’


The downloaded source packages are in
‘/private/var/folders/mh/mrm_14nx19g13lsnj9zmvwjrgn/T/RtmpbxyWRI/downloaded_packages’
Warning message:
In install.packages(update[instlib == l, "Package"], l, repos = repos,  :
  installation of package ‘rJava’ had non-zero exit status
> sessionInfo()
R version 3.6.3 (2020-02-29)
Platform: x86_64-apple-darwin15.6.0 (64-bit)
Running under: macOS Catalina 10.15.4

Matrix products: default
BLAS: 
/System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib
LAPACK: 
/Library/Frameworks/R.framework/Versions/3.6/Resources/lib/libRlapack.dylib


locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8

attached base packages:
[1] stats graphics  grDevices utils datasets
[6] methods   base

loaded via a namespace (and not attached):
[1] compiler_3.6.3 tools_3.6.3



> update.packages()
rJava :
 Version 0.9-11 installed in C:/Program Files/R/R-3.6.3/library
 Version 0.9-12 available at https://cran.rstudio.com

  There is a binary version available but
  the source version is later:
  binary source needs_compilation
rJava 0.9-11 0.9-12  TRUE

installing the source package 'rJava'

trying URL 'https://cran.rstudio.com/src/contrib/rJava_0.9-12.tar.gz'
Content type 'application/x-gzip' length 1103629 bytes (1.1 MB)
downloaded 1.1 MB

* installing *source* package 'rJava' ...
** package 'rJava' successfully unpacked and MD5 sums checked
** using staged installation
Generate Windows-specific files (src/jvm-w32) ...
make: Entering directory 
'/Users/spenc/AppData/Local/Temp/RtmpQbnYkA/R.INSTALL8ec5478248a/rJava/src/jvm-w32'
c:/Rtools/mingw_64/bin/dlltool --as c:/Rtools/mingw_64/bin/as 
--input-def jvm64.def --kill-at --dllname jvm.dll --output-lib libjvm.dll.a

c:/Rtools/mingw_64/bin/gcc  -O2 -c -o findjava.o findjava.c
c:/Rtools/mingw_64/bin/gcc  -s -o findjava.exe findjava.o
make: Leaving directory 
'/Users/spenc/AppData/Local/Temp/RtmpQbnYkA/R.INSTALL8ec5478248a/rJava/src/jvm-w32'

Find Java...
  JAVA_HOME=C:/PROGRA~1/Java/JRE18~1.0_2
=== Building JRI ===
  JAVA_HOME=C:/PROGRA~1/Java/JRE18~1.0_2
  R_HOME=C:/PROGRA~1/R/R-36~1.3
JDK has no javah.exe - using javac -h . instead
Creating Makefiles ...
Configuration done.
make -C src JRI.jar
make[1]: Entering directory 
'/Users/spenc/AppData/Local/Temp/RtmpQbnYkA/R.INSTALL8ec5478248a/rJava/jri/src'
C:/PROGRA~1/Java/JRE18~1.0_2/bin/javac -h . -d . ../RList.java 
../RBool.java ../RVector.java ../RMainLoopCallbacks.java 
../RConsoleOutputStream.java ../Mutex.java ../Rengine.java ../REXP.java 
../RFactor.ja

Re: [Rd] status of Java & rJava?

2020-03-28 Thread Spencer Graves




On 2020-03-28 23:07, Prof Brian Ripley wrote:

On 29/03/2020 04:07, Simon Urbanek wrote:

Spencer,

you could argue that Java is dead since Oracle effectively killed it 
by removing all public downloads, but if you manage to get hold of a 
Java installation then it works just fine with R. To my best 
knowledge there has never been an issue if you installed rJava from 
source. macOS Catalina has made binary distributions impossible due 
to additional restrictions on run-time, but even that has been now 
solved with the release of rJava 0.9-12, so please make sure you use 
the latest rJava. In most cases that I have seen issues were caused 
by incorrect configuration (setting JAVA_HOME incorrectly [do NOT set 
it unless you know what you're doing!], not installing Java for the 
same architecture as R etc.). If you have any issues feel free to 
report them. rJava 0.9-12 has quite a few changes that try to detect 
user errors better and report them so I strongly suggest users to 
upgrade.


There is OpenJDK, and https://adoptopenjdk.net provides binaries for 
macOS, including the preferred Java 11 LTS.  I just re-checked that, 
and after


env 
JAVA_HOME=/Library/Java/JavaVirtualMachines/adoptopenjdk-11.jdk/Contents/Home 
R CMD javareconf


I was able to install from source and check rJava 0.9-12 in 4.0.0 
alpha.  For the CRAN binary of 3.6.3 I had to make sure I was using 
clang 7: 'clang' defaults to that in the Apple CLT which does not 
support -fopenmp -- but the binary package just worked.


[All on Catalina.]


Thanks.  That worked on Catalina.  When installing OpenJDK on Windows 
10, the default for "Set JAVA_HOME" was 'X'; I changed that so JAVA_HOME 
would be set.  It didn't work at first, but did after I rebooted.



Thanks again to both Simon Urbanek and Prof. Ripley.  Spencer Graves



Re: [Rd] status of Java & rJava?

2020-03-29 Thread Spencer Graves
  I spoke too soon in saying that everything worked with OpenJDK: 
"R CMD check Ecfun_0.2-4.tar.gz" using 
"https://github.com/sbgraves237/Ecfun" worked fine on my Mac but failed 
with "error: DLL 'rJava' not found: maybe not installed for this 
architecture?" under Windows 10.  "00install.out" and 
"Sys.getenv('PATH')" follow.  "library(rJava)" seemed to work, and 
"help(pac='rJava')" displays 0.9-12.  Suggestions?  Thanks, Spencer Graves



* installing *source* package 'Ecfun' ...
** using staged installation
** R
** inst
** byte-compile and prepare package for lazy loading
** help
*** installing help indices
** building package indices
** installing vignettes
** testing if installed package can be loaded from temporary location
*** arch - i386
Error: package or namespace load failed for 'Ecfun':
 .onLoad failed in loadNamespace() for 'rJava', details:
  call: library.dynam("rJava", pkgname, libname)
  error: DLL 'rJava' not found: maybe not installed for this architecture?
Error: loading failed
Execution halted
*** arch - x64
ERROR: loading failed for 'i386'
* removing 'C:/Users/spenc/Documents/R/Ecfun/Ecfun.Rcheck/Ecfun'

##

> Sys.getenv('PATH')
[1] "C:\\Program Files\\R\\R-3.6.3\\bin\\x64;C:\\Program 
Files\\AdoptOpenJDK\\jdk-11.0.6.10-hotspot\\bin;C:\\Program 
Files\\Java\\jre1.8.0_241;C:\\Rtools\\bin;C:\\WINDOWS\\system32;C:\\WINDOWS;C:\\WINDOWS\\System32\\Wbem;C:\\WINDOWS\\System32\\WindowsPowerShell\\v1.0\\;C:\\WINDOWS\\System32\\OpenSSH\\;C:\\Program 
Files\\Microsoft VS Code\\bin;C:\\Program Files\\Git\\cmd;C:\\Program 
Files\\TortoiseSVN\\bin;c:\\programFiles\\ffmpeg\\ffmpeg-4.1\\;C:\\Program 
Files\\Pandoc\\;C:\\Program Files\\MiKTeX 
2.9\\miktex\\bin\\x64\\;C:\\Users\\spenc\\AppData\\Local\\Microsoft\\WindowsApps;C:\\Users\\spenc\\AppData\\Local\\GitHubDesktop\\bin;C:\\FFmpeg\\bin;C:\\RBuildTools\\3.5\\bin;C:\\Program 
Files\\R\\R-3.6.3\\bin;C:\\Users\\spenc\\AppData\\Local\\Microsoft\\WindowsApps;C:\\RBuildTools\\3.5\\;"




On 2020-03-28 23:07, Prof Brian Ripley wrote:

On 29/03/2020 04:07, Simon Urbanek wrote:

Spencer,

you could argue that Java is dead since Oracle effectively killed it 
by removing all public downloads, but if you manage to get hold of a 
Java installation then it works just fine with R. To my best 
knowledge there has never been an issue if you installed rJava from 
source. macOS Catalina has made binary distributions impossible due 
to additional restrictions on run-time, but even that has been now 
solved with the release of rJava 0.9-12, so please make sure you use 
the latest rJava. In most cases that I have seen issues were caused 
by incorrect configuration (setting JAVA_HOME incorrectly [do NOT set 
it unless you know what you're doing!], not installing Java for the 
same architecture as R etc.). If you have any issues feel free to 
report them. rJava 0.9-12 has quite a few changes that try to detect 
user errors better and report them so I strongly suggest users to 
upgrade.


There is OpenJDK, and https://adoptopenjdk.net provides binaries for 
macOS, including the preferred Java 11 LTS.  I just re-checked that, 
and after


env 
JAVA_HOME=/Library/Java/JavaVirtualMachines/adoptopenjdk-11.jdk/Contents/Home 
R CMD javareconf


I was able to install from source and check rJava 0.9-12 in 4.0.0 
alpha.  For the CRAN binary of 3.6.3 I had to make sure I was using 
clang 7: 'clang' defaults to that in the Apple CLT which does not 
support -fopenmp -- but the binary package just worked.


[All on Catalina.]


Thanks.  That worked on Catalina.  When installing OpenJDK on Windows 
10, the default for "Set JAVA_HOME" was 'X'; I changed that so JAVA_HOME 
would be set.  It didn't work at first, but did after I rebooted.



Thanks again to both Simon Urbanek and Prof. Ripley.  Spencer Graves



Re: [Rd] status of Java & rJava?

2020-03-30 Thread Spencer Graves
  Tomas Kalibera kindly suggested I might have both 32- and 64-bit 
Java installed, and it might be accessing the 32-bit.  He further 
suggested:



   R CMD check Ecfun_0.2-4.tar.gz --no-multiarch


  That worked.  Thanks, Tomas.


  Spencer

On 2020-03-29 08:03, Spencer Graves wrote:
I spoke too soon in saying that everything worked with OpenJDK: "R CMD 
check Ecfun_0.2-4.tar.gz" using "https://github.com/sbgraves237/Ecfun" 
worked fine on my Mac but failed with "error: DLL 'rJava' not found: 
maybe not installed for this architecture?" under Windows 10.  
"00install.out" and "Sys.getenv('PATH')" follow.  "library(rJava)" 
seemed to work, and "help(pac='rJava')" displays 0.9-12.  Suggestions?  
Thanks, Spencer Graves



* installing *source* package 'Ecfun' ...
** using staged installation
** R
** inst
** byte-compile and prepare package for lazy loading
** help
*** installing help indices
** building package indices
** installing vignettes
** testing if installed package can be loaded from temporary location
*** arch - i386
Error: package or namespace load failed for 'Ecfun':
 .onLoad failed in loadNamespace() for 'rJava', details:
  call: library.dynam("rJava", pkgname, libname)
  error: DLL 'rJava' not found: maybe not installed for this 
architecture?

Error: loading failed
Execution halted
*** arch - x64
ERROR: loading failed for 'i386'
* removing 'C:/Users/spenc/Documents/R/Ecfun/Ecfun.Rcheck/Ecfun'

##

> Sys.getenv('PATH')
[1] "C:\\Program Files\\R\\R-3.6.3\\bin\\x64;C:\\Program 
Files\\AdoptOpenJDK\\jdk-11.0.6.10-hotspot\\bin;C:\\Program 
Files\\Java\\jre1.8.0_241;C:\\Rtools\\bin;C:\\WINDOWS\\system32;C:\\WINDOWS;C:\\WINDOWS\\System32\\Wbem;C:\\WINDOWS\\System32\\WindowsPowerShell\\v1.0\\;C:\\WINDOWS\\System32\\OpenSSH\\;C:\\Program 
Files\\Microsoft VS Code\\bin;C:\\Program Files\\Git\\cmd;C:\\Program 
Files\\TortoiseSVN\\bin;c:\\programFiles\\ffmpeg\\ffmpeg-4.1\\;C:\\Program 
Files\\Pandoc\\;C:\\Program Files\\MiKTeX 
2.9\\miktex\\bin\\x64\\;C:\\Users\\spenc\\AppData\\Local\\Microsoft\\WindowsApps;C:\\Users\\spenc\\AppData\\Local\\GitHubDesktop\\bin;C:\\FFmpeg\\bin;C:\\RBuildTools\\3.5\\bin;C:\\Program 
Files\\R\\R-3.6.3\\bin;C:\\Users\\spenc\\AppData\\Local\\Microsoft\\WindowsApps;C:\\RBuildTools\\3.5\\;" 





On 2020-03-28 23:07, Prof Brian Ripley wrote:

On 29/03/2020 04:07, Simon Urbanek wrote:

Spencer,

you could argue that Java is dead since Oracle effectively killed it 
by removing all public downloads, but if you manage to get hold of a 
Java installation then it works just fine with R. To my best 
knowledge there has never been an issue if you installed rJava from 
source. macOS Catalina has made binary distributions impossible due 
to additional restrictions on run-time, but even that has been now 
solved with the release of rJava 0.9-12, so please make sure you use 
the latest rJava. In most cases that I have seen issues were caused 
by incorrect configuration (setting JAVA_HOME incorrectly [do NOT 
set it unless you know what you're doing!], not installing Java for 
the same architecture as R etc.). If you have any issues feel free 
to report them. rJava 0.9-12 has quite a few changes that try to 
detect user errors better and report them so I strongly suggest 
users to upgrade.


There is OpenJDK, and https://adoptopenjdk.net provides binaries for 
macOS, including the preferred Java 11 LTS.  I just re-checked that, 
and after


env 
JAVA_HOME=/Library/Java/JavaVirtualMachines/adoptopenjdk-11.jdk/Contents/Home 
R CMD javareconf


I was able to install from source and check rJava 0.9-12 in 4.0.0 
alpha.  For the CRAN binary of 3.6.3 I had to make sure I was using 
clang 7: 'clang' defaults to that in the Apple CLT which does not 
support -fopenmp -- but the binary package just worked.


[All on Catalina.]


Thanks.  That worked on Catalina.  When installing OpenJDK on Windows 
10, the default for "Set JAVA_HOME" was 'X'; I changed that so JAVA_HOME 
would be set.  It didn't work at first, but did after I rebooted.



Thanks again to both Simon Urbanek and Prof. Ripley.  Spencer Graves



[Rd] How to find detritus rejected by "R CMD check" on Debian?

2020-04-17 Thread Spencer Graves

Hello:


  How can someone help me find and fix the following, contained in 
00check.log on Debian for "https://github.com/JamesRamsay5/fda":



* checking for detritus in the temp directory ... NOTE
Found the following files/directories:
  ‘fdaMatlabPath.m’


  See:


https://win-builder.r-project.org/incoming_pretest/fda_5.1.3_20200416_225207/Debian/00check.log


  Thanks,
      Spencer Graves



[Rd] "not a valid win32 application" with rtools40-x86_64.exe on Windows 10

2020-04-29 Thread Spencer Graves

Hello, All:


  "00install.out" from "R CMD check Ecfun_0.2-4.tar.gz" includes:


Error:  package or namespace load failed for 'Ecfun':
 .onLoad failed in loadNamespace() for 'rJava', details:
  call: inDL(x, as.logical(local), as.logical(now), ...)
  error:  unable to load shared object 'c:/Program 
Files/R/R-4.0.0/library/rJava/libs/i386/rJava.dll':

  LoadLibrary failure: %1 is not a valid win32 application


  This was after installing R 4.0.0 and "rtools40-x86_64.exe" under 
Windows 10 Pro 64-bit.



  Suggestions?
  Thanks,
  Spencer Graves


sessionInfo()
R version 4.0.0 (2020-04-24)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 10 x64 (build 18362)

Matrix products: default

locale:
[1] LC_COLLATE=English_United States.1252
[2] LC_CTYPE=English_United States.1252
[3] LC_MONETARY=English_United States.1252
[4] LC_NUMERIC=C
[5] LC_TIME=English_United States.1252

attached base packages:
[1] stats   graphics   grDevices  utils   datasets   methods   base

loaded via a namespace (and not attached):
[1] compiler_4.0.0



Re: [Rd] "not a valid win32 application" with rtools40-x86_64.exe on Windows 10

2020-04-30 Thread Spencer Graves

Hi, Jeroen et al.:


On 2020-04-30 03:15, Jeroen Ooms wrote:

On Thu, Apr 30, 2020 at 6:38 AM Spencer Graves
 wrote:

Hello, All:


"00install.out" from "R CMD check Ecfun_0.2-4.tar.gz" includes:


Error:  package or namespace load failed for 'Ecfun':
   .onLoad failed in loadNamespace() for 'rJava', details
call: inDL(x, as.logical(local), as.logical(now), ...)
error:  unable to load shared object 'c:/Program
Files/R/R-4.0.0/library/rJava/libs/i386/rJava.dll':
LoadLibrary failure: %1 is not a valid win32 application


This is an error in loading the rJava package, so it is not related to
rtools40, and probably inappropriate for this mailing list.

As Simon suggested, you may have to install the 32-bit Java JDK. See
also this faq: 
https://github.com/r-windows/docs/blob/master/faq.md#how-to-install-rjava-on-windows



  In fact I had both 32- and 64-bit Java installed but only the 
64-bit was in the path.  I added the 32-bit, but that did not fix the 
problem.  The last 2.5 lines in the section "How to install rJava on 
Windows?" to which you referred me read:



to build rJava from source, you need the --merge-multiarch flag:

install.packages('rJava', type = 'source', INSTALL_opts='--merge-multiarch')


  When I tried that, I got:


Warning in system("sh ./configure.win") : 'sh' not found

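(The "'sh' not found" warning usually indicates that rtools40's Unix tools are not on R's PATH. One documented remedy, assuming rtools40 is installed in its default location, is to add them via ~/.Renviron and restart R before retrying the source install:)

```shell
# Make rtools40's Unix tools (including 'sh') visible to R by appending
# the documented PATH line to ~/.Renviron; then restart R and retry
# install.packages('rJava', type='source', INSTALL_opts='--merge-multiarch')
echo 'PATH="${RTOOLS40_HOME}\usr\bin;${PATH}"' >> ~/.Renviron
```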

*** ON THE OTHER HAND:  The error message above says 'c:/Program
Files/R/R-4.0.0/library/rJava/libs/i386/rJava.dll':
   LoadLibrary failure: %1 is not a valid win32 application


>>>> Is "rJava.dll" a valid win32 application?


  Suggestions?
  Thanks,
  Spencer Graves


p.s.  A similar problem with rJava a month ago was fixed by installing 
64-bit Java.  Now with the upgrade to R 4.0.0 and rtools40, this no 
longer works.




Re: [Rd] Tips for debugging: R CMD check examples

2010-06-30 Thread Spencer Graves

Hi, Hadley:


  1.  Is your workspace clear when you run the examples successfully?  
I.e., start with "rm(list=objects())", then try the examples that work 
interactively but fail in "R CMD check".



  2.  Also, does "R CMD check" give any warnings, e.g., an undefined 
global, in addition to the error message?  If yes, fixing those might 
also fix the error message.



  Deepayan Sarkar's suggestions are excellent, but the two things I 
suggest are more targeted and might work, so I'd try them first.



  Hope this helps.
  Spencer


On 6/30/2010 1:05 AM, Deepayan Sarkar wrote:

On Wed, Jun 30, 2010 at 3:26 AM, Hadley Wickham  wrote:
   

Hi all,

Does anyone have any suggestions for debugging the execution of
examples by R CMD check?  The examples work fine when I run them from
a live R prompt, but I get errors when they are run by R CMD check.
 

'R CMD check pkg' will produce a pkg.Rcheck/pkg-Ex.R file that
collects the examples into a single file (and also does other things).
You could try running that file interactively to see if the error is
reproduced.

-Deepayan
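Concretely, the workflow described above might look like the following (the package name "mypkg" is hypothetical):

```shell
# 'R CMD check' collects all of a package's examples into
# mypkg.Rcheck/mypkg-Ex.R; running that file in a fresh, clean session
# often reproduces an error that does not show up interactively.
R CMD check mypkg_1.0.tar.gz
R --vanilla -f mypkg.Rcheck/mypkg-Ex.R
```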

   



--
Spencer Graves, PE, PhD
President and Chief Operating Officer
Structure Inspection and Monitoring, Inc.
751 Emerson Ct.
San José, CA 95126
ph:  408-655-4567



[Rd] perl.exe has stopped working

2010-08-18 Thread Spencer Graves

 Hello:


  I just installed 14 security updates for Vista x64, and now "R 
CMD build packagename" terminates, saying, "perl.exe has stopped 
working".  I reinstalled Rtools211 using the latest version after 
uninstalling the version I installed on 4/3/2010.



  What do you suggest?  I can install the latest version of perl 
from "www.perl.org" (5.12.1), but I thought I'd ask here first.



  Thanks,
  Best Wishes,
  Spencer Graves



Re: [Rd] c.POSIXct

2010-08-18 Thread Spencer Graves
   I'm with Gabor on this.  I naively would not expect c() to strip 
attributes generally, and I've been surprised more than once to find the 
time zone attribute stripped when I did not expect that.



  Might it make sense to add an argument like 
"keepAttributes=FALSE" to the "c" function?  Then people like Gabor and 
me would know that we would have to specify "keepAttributes = TRUE" if 
we wanted attributes to be kept.  Having this in the documentation would 
also help advertise the default behavior.  I would expect that 
attributes like "dim" and "dimnames" should always be dropped, 
regardless of the value of "keepAttributes".  With "keepAttributes = 
TRUE", "names" would be concatenated, and other attributes would be 
taken from the first argument with other attributes of later arguments 
ignored.
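A minimal sketch of the proposal above (hypothetical code, not part of base R): concatenate with c(), then optionally restore the first argument's attributes, always dropping "dim" and "dimnames"; names are already handled by c() itself.

```r
## Hypothetical "keepAttributes" concatenation, as proposed above.
ca <- function(..., keepAttributes = FALSE) {
  out <- c(...)
  if (keepAttributes) {
    a <- attributes(..1)                       # attributes of first argument
    a[c("dim", "dimnames", "names")] <- NULL   # always dropped / already handled
    for (nm in names(a)) attr(out, nm) <- a[[nm]]
  }
  out
}

x <- Sys.time()
attr(x, "tzone") <- "GMT"
attr(ca(x, x, keepAttributes = TRUE), "tzone")  # "GMT"
```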



QUESTIONS:


  1.  With POSIXct, isn't the numeric part always in GMT, 
regardless of time zone?  Then the "tzone" attribute only affects the 
display?  Consider the following:



> (z <- Sys.time())
[1] "2010-08-18 21:16:38 PDT"
> as.numeric(z)
[1] 1282191399
> attr(z, 'tzone') <- 'GMT'
> as.numeric(z)
[1] 1282191399
> z
[1] "2010-08-19 04:16:38 GMT"


  2.  How can one specify a time zone other than "GMT" and the 
default local time zone?


> attr(z, 'tzone') <- Sys.timezone()
> z
[1] "2010-08-19 04:16:38 GMT"
Warning message:
In as.POSIXlt.POSIXct(x, tz) : unknown timezone 'PDT'

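(For what it's worth, the "tzone" attribute accepts Olson-style zone names such as "America/Los_Angeles", which avoids the "unknown timezone" warning that abbreviations like 'PDT' produce; a minimal sketch:)

```r
## Olson-style names work where abbreviations such as "PDT" do not;
## changing "tzone" changes only the display, not the stored instant.
z <- as.POSIXct("2010-08-18 21:16:38", tz = "GMT")
attr(z, "tzone") <- "America/Los_Angeles"
z                # same instant, shown in Pacific time
as.numeric(z)    # unchanged by the tzone switch
```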

  Thanks,
  Spencer Graves


On 8/18/2010 7:53 PM, Gabor Grothendieck wrote:

On Wed, Aug 18, 2010 at 10:34 PM, Simon Urbanek
  wrote:

On Aug 18, 2010, at 6:23 PM, Gabor Grothendieck wrote:


No one answered this so I submitted it to the bugs system and there I
got the response that it is documented behavior; however, whether its
documented or not is hardly the point -- its undesirable that tzone is
lost when using c.


That's one man's opinion - from docs

  if you want to specify an object in a particular timezone but
   to be printed in the current timezone you may want to remove the
   ‘"tzone"’ attribute (e.g. by ‘c(x)’).

so apparently that is a design choice and hence I doubt it can be changed as it would 
break code that uses that feature. As many things in R whether it was a good choice is up 
for debate but it has been made already. (Think about how you would combine different 
times with different tzones - there is no "right" way to do so and thus 
stripping is very plausible and consistent)


I did already address the ambiguity point in the suggested code that I
posted.  It only maintains the tzone if there is no ambiguity.

Note that there have been significant changes in POSIXt relatively
recently, namely switching POSIXt and POSIXct class order, so it seems
that such changes are not beyond possibility.

At any rate, the underlying objective of getting time zones to work in
the expected way still seems desirable.  If backward compatibility is
to be invoked (even though it wasnt in the case I just cited) then it
would still be possible to address this with a new core class or
subclass.








Re: [Rd] c.POSIXct

2010-08-19 Thread Spencer Graves

 Hi, Gabor, Paul, et al.:


  For classes that did not supply a "ca" method, I'd rather see the 
default be to start with the corresponding "c" method, followed by an 
effort to preserve attributes to the maximum extent feasible.  I'm not 
sure of the best defaults, but at the moment I would expect that 
attributes like "dim" and "dimnames" should always be dropped.  The 
current default "c" method preserves "names" appropriately.



  What do you think about including another argument to 
"checkAttributes"?  I'm not sure what the options should be nor what 
should be the default, but one option should throw an error if any 
contradiction was found while another would take all attributes from the 
first argument and ignore others.



  Best Wishes,
  Spencer Graves


On 8/19/2010 7:32 AM, Gabor Grothendieck wrote:

On Thu, Aug 19, 2010 at 10:16 AM, Paul Gilbert
  wrote:

I used to get caught by this c() behaviour often, but now I do expect it to 
drop attributes. I think it would break many things if you change it, and force 
people to write different code when they really do want to drop attributes. 
When you want new behaviour it is usually better to define a new function, ca() 
maybe?


That would work if ca defaulted to c for those classes that did not
supply a ca method.








Re: [Rd] c.POSIXct

2010-08-19 Thread Spencer Graves

 Hi, Gabor, et al.:


  I'm suggesting adding "checkAttributes" to "ca", NOT to "c".


  Spencer


On 8/19/2010 8:50 AM, Gabor Grothendieck wrote:

On Thu, Aug 19, 2010 at 11:43 AM, Spencer Graves
  wrote:

  Hi, Gabor, Paul, et al.:


  For classes that did not supply a "ca" method, I'd rather see the
default being to start with the corresponding  "c" method followed by an
effort to preserve attributes to the maximum extent feasible.  I'm not sure
the best defaults, but at the moment, I would expect that attributes like
"dim" and "dimnames" should always be dropped.  The current default "c"
method preserves "names" appropriately.


  What do you think about including another argument to
"checkAttributes"?  I'm not sure what the options should be nor what should
be the default, but one option should throw and error if any contradiction
was found while another would take all attributes from the first argument
and ignore others.



I think ca is easier to use.  It would work consistently and simply
across classes whereas fixing up c with checkAttributes is harder to
use.







Re: [Rd] perl.exe has stopped working: Fixed.

2010-08-19 Thread Spencer Graves
   To complete this thread for anyone encountering a similar 
problem, the fix I said I'd try (below) worked.  Specifically, I 
installed the latest Strawberry Perl, modified the path so the first 
reference points to the new Perl, and R CMD build, check, INSTALL, and 
"INSTALL --build" all seemed to work properly.  Of course, there is 
always a chance of some deeply hidden problem, but the obvious first 
tests seemed to function exactly as I have come to expect.



  Best Wishes,
  Spencer Graves


On 8/18/2010 6:00 PM, Spencer Graves wrote:

 Hello:


  I just installed 14 security updates for Vista x64, and now "R 
CMD build packagename" terminates, saying, "perl.exe has stopped 
working".  I reinstalled Rtools211 using the latest version after 
uninstalling the version I installed on 4/3/2010.



  What do you suggest?  I can install the latest version of perl 
from "www.perl.org" (5.12.1), but I thought I'd ask here first.



      Thanks,
  Best Wishes,
  Spencer Graves



[Rd] No RTFM?

2010-08-19 Thread Spencer Graves
 What do you think about adding a "No RTFM" policy to the R mailing 
lists? Per, "http://en.wikipedia.org/wiki/RTFM":



The Ubuntu Forums and LinuxQuestions.org, for instance, have instituted 
"no RTFM" policies to promote a welcoming atmosphere.[8][9]


RTFM [and] "Go look on google" are two inappropriate responses to a 
question. If you don't know the answer or don't wish to help, please say 
nothing instead of brushing off someone's question. Politely showing 
someone how you searched or obtained the answer to a question is 
acceptable, even encouraged.

...

If you wish to remind a user to use search tools or other resources when 
they have asked a question you feel is basic or common, please be very 
polite. Any replies for help that contain language disrespectful towards 
the user asking the question, i.e. "STFU" or "RTFM" are unacceptable and 
will not be tolerated. —Ubuntu Forums



Gavin Simpson and I recently provided examples answering a question from 
"r.ookie" that had previously elicited responses, "You want us to read 
the help page to you?" and "It yet again appears that you are asking us 
to read the help pages for you."



I can appreciate the sentiment in fortunes('rtfm'). In this case, 
however, "r.ookie" had RTFM (and said so), but evidently the manual was 
not sufficiently clear.



Best Wishes,
Spencer Graves



Re: [Rd] No RTFM?

2010-08-20 Thread Spencer Graves

 Hi, Gabor, et al.:


  Can anyone comment on the experience of the Ubuntu Forums and 
LinuxQuestions.org, mentioned in the Wikipedia article I cited?



  Gabor makes an interesting point.  However, logic without data is 
a poor tool for decision making: plausible-sounding assumptions have 
often led to counterproductive conclusions.  People with experience 
with the Ubuntu Forums and 
LinuxQuestions.org should be able to provide some insight here.



  Best Wishes,
  Spencer


On 8/20/2010 11:37 AM, Gabor Grothendieck wrote:

On Fri, Aug 20, 2010 at 2:12 PM, Paul Johnson  wrote:

On Thu, Aug 19, 2010 at 7:08 PM, Spencer Graves
  wrote:

  What do you think about adding a "No RTFM" policy to the R mailing lists?
Per, "http://en.wikipedia.org/wiki/RTFM":


I think this is a great suggestion.

I notice the R mailing list already has a gesture in this direction:
"Rudeness and ad hominem comments are not acceptable. Brevity is OK."

But the people who behave badly don't care about policies like this
and they will keep doing what they do.

Although it may seem hard to justify rudeness, it's often the case that
even the most bizarre behavior makes sense if you view it from the
perspective of that person.   In the case of the R list there is a
larger potential demand for free help than resources to answer and
without the usual monetary economics to allocate resources I believe
that the functional purpose of rudeness here is to ration those
resources and minimize duplication of questions.  If that is correct
one can predict that if civility were to become the norm on this list
then other rationing mechanisms would arise to replace it.

For example, it might become the norm that most questions are not
answered or are answered less thoroughly or the list might be replaced
as the de facto goto medium for R questions by some other list or web
site so we have to be careful about unintended consequences.







Re: [Rd] No RTFM?

2010-08-21 Thread Spencer Graves

 Hello, All:


  I think there is a logic to Gabor's perspective, especially 
regarding unintended consequences.



  For example, if, as a result of changing policy, our most 
creative and substantive contributors decide to reduce their level of 
contribution and are not effectively replaced by others, then it would 
be a great loss for humanity.



  This group, especially the R Core team and the R-devel community 
more generally, has been incredibly productive.  The result is a 
substantive contribution to humanity.  It would be a loss if any change 
reduced that.  However, if rudeness is driving away potential 
contributors as was claimed, then this community might be more 
productive with a "no RTFM" policy.



  I accept that the experience of the Ubuntu Forums and 
LinuxQuestions.org may not be perfectly relevant to R, but I think they 
could provide some insight:  I would expect them to have some of the 
same "rationing" problems as experienced on the R help lists.



  The exchange that generated my original comment on this was a 
question from "r.ookie" to R-Help.  I don't know why this person chose 
to hide their real identity, but I was subsequently informed off line 
that the RTFM comment I saw was a response to an apparently rude reply 
by "r.ookie" to a previous suggestion by a regular contributor.  I still 
think a better response is not to escalate:  Either ignore the post or 
say something like, "I don't understand your question.  Please provide a 
self-contained minimal example as suggested in the Posting Guide ... ."



  Best Wishes,
  Spencer


On 8/21/2010 2:08 AM, Simone Giannerini wrote:

Dear Gabor,

I do not agree with your claim

"In the case of the R list there is a
larger potential demand for free help than resources to answer and
without the usual monetary economics to allocate resources I believe
that the functional purpose of rudeness here is to ration those
resources and minimize duplication of questions"

In fact, apart from the fact that rudeness should never be justified, I was
amazed at the amount of time dedicated by some people to giving unhelpful
replies to dumb (and less dumb) questions (at least on R-devel). In my
opinion this behaviour does some damage to the whole R project, for at
least two reasons:

1. On the bug-report side, if you want a good percentage of true
positive reports, you have to allow for a high percentage of false positive
reports.  But if people are scared to post, you will lose the true positives
together with the false ones.
2. People who are potentially willing to contribute are discouraged from
doing so.

Kind regards

Simone

On Fri, Aug 20, 2010 at 8:37 PM, Gabor Grothendieck
wrote:
On Fri, Aug 20, 2010 at 2:12 PM, Paul Johnson
wrote:

On Thu, Aug 19, 2010 at 7:08 PM, Spencer Graves
  wrote:

  What do you think about adding a "No RTFM" policy to the R mailing

lists?

Per, "http://en.wikipedia.org/wiki/RTFM":


I think this is a great suggestion.

I notice the R mailing list already has a gesture in this direction:
"Rudeness and ad hominem comments are not acceptable. Brevity is OK."

But the people who behave badly don't care about policies like this
and they will keep doing what they do.

Although it may seem hard to justify rudeness, it's often the case that
even the most bizarre behavior makes sense if you view it from the
perspective of that person.  In the case of the R list there is a
larger potential demand for free help than resources to answer and
without the usual monetary economics to allocate resources I believe
that the functional purpose of rudeness here is to ration those
resources and minimize duplication of questions.  If that is correct
one can predict that if civility were to become the norm on this list
then other rationing mechanisms would arise to replace it.

For example, it might become the norm that most questions are not
answered or are answered less thoroughly, or the list might be replaced
as the de facto go-to medium for R questions by some other list or web
site, so we have to be careful about unintended consequences.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] No RTFM?

2010-08-21 Thread Spencer Graves
 I've answered many email posts by copying and editing the email 
footer.  That's much more friendly, informative and effective than just 
RTFM.  (As previously noted in this thread, it's often hard to know 
which FMTR.)



Spencer


On 8/21/2010 6:08 PM, Gabor Grothendieck wrote:

On Sat, Aug 21, 2010 at 8:59 PM, Hadley Wickham  wrote:

Regarding length, the portion at the end of every r-help message (but
this does not appear at the end of r-devel messages or the messages
of other lists concerning R):

   "provide commented, minimal, self-contained, reproducible code."

It was intended to provide a one line synopsis of the key part of the posting
guide that could be readily pointed to.  Although we have to be careful about
making that too verbose, as well, it might not be too onerous to add

But no one reads email footers...


I would expect that a lot more people read that than the posting guide.

It's also useful as something to point to that is more accessible than
the posting guide, since it's right there.

One can be sure it's been received, since every message contains it.

Finally it gives the key info that someone needs to effectively use r-help.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel




--
Spencer Graves, PE, PhD
President and Chief Operating Officer
Structure Inspection and Monitoring, Inc.
751 Emerson Ct.
San José, CA 95126
ph:  408-655-4567

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] reliability of R-Forge? (moving to r-Devel)

2010-08-26 Thread Spencer Graves

 Hello:


  Can anyone comment on plans for R-Forge?  Please see thread below.


  Ramsay, Hooker and I would like to release a new version of "fda" 
to CRAN.  We committed changes for it last Friday.  I'd like to see 
reports of their "daily checks", then submit to CRAN from R-Forge.  
Unfortunately, it seems to be down now, saying "R-Forge Could Not 
Connect to Database:".  I just tried 'install.packages("fda", 
repos="http://R-Forge.R-project.org")' and got the previous version, 
which indicates that my changes from last Friday have not been built 
yet.  Also, a few days ago, I got an error from 
'install.packages("pfda", repos="http://R-Forge.R-project.org")' (a 
different package, 'pfda', NOT 'fda').  I don't remember the error 
message, but this same command worked for me just now.



  I infer from this that I should consider submitting the latest 
version of 'fda' to CRAN manually, not waiting for the R-Forge 
[formerly] "daily" builds and checks.



  R-Forge is an incredibly valuable resource.  It would be even 
more valuable if it were more reliable.  I very much appreciate the work 
of the volunteers who maintain it;  I am unfortunately not in a position 
to volunteer to do more for the R-Project generally and R-Forge in 
particular than I already do.



  Thanks,
  Spencer Graves


On 8/26/2010 1:07 AM, Jari Oksanen wrote:

David Kane  kanecap.com>  writes:


How reliable is R-Forge? http://r-forge.r-project.org/

It is down now (for me). Reporting "R-Forge Could Not Connect to Database: "

I have just started to use it for a project. It has been down for
several hours (at least) on different occasions over the last couple
of days. Is that common? Will it be more stable soon?

Apologies if this is not an appropriate question for R-help.


Dave,

This is rather a subject for R-devel. Ignoring this inappropriateness: yes,
indeed, R-Forge has been flaky lately. The database was disconnected for the
whole weekend, came back on Monday, and is gone again. It seems that mailing
lists and email alerts of commits were not working even when the basic R-Forge
was up.

I have sent two messages to r-fo...@r-project.org on these problems. I haven't
got any response, but soon after the first message the Forge woke up, and soon
after the second message it went down. Since I'm not Bayesian, I don't know
what to say about the effect of my messages.

Cheers, Jari Oksanen

__
r-h...@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.


--
Spencer Graves, PE, PhD
President and Chief Operating Officer
Structure Inspection and Monitoring, Inc.
751 Emerson Ct.
San José, CA 95126
ph:  408-655-4567

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] reliability of R-Forge? (moved to R-Devel)

2010-08-26 Thread Spencer Graves
   I earlier moved a part of this thread to R-Devel, and got some 
replies there.



  At least one page on R-Forge says, "We are currently adapting the 
R-packages-plugin in order to work together with the new FusionForge 
infrastructure. Some services are thus not yet available."  I don't know 
if R-Forge is accepting new volunteers, but it looks like they could use 
help.  Unfortunately, I'm not in a position to volunteer.



  Best Wishes,
  Spencer Graves


On 8/26/2010 8:28 AM, R P Herrold wrote:

On Thu, 26 Aug 2010, Gavin Simpson wrote:


On Thu, 2010-08-26 at 02:30 -0400, David Kane wrote:

How reliable is R-Forge? http://r-forge.r-project.org/

It is down now (for me). Reporting "R-Forge Could Not Connect to 
Database: "


late to chime in, so had tossed the first piece.  As this relates to 
'reliability of R-Forge' in the sense of possible process issues, 
rather than availability of the archive, I wanted to 'tag into' this 
thread


I 'mirror' r-forge, so I have not seen this ...

One thing I note, mirroring r-forge, and processing 'diffs' between 
successive days, is that the md5sums of some packages regularly change 
without version number bumps.  From this morning's report in my email:


Thu Aug 26 04:30:01 EDT 2010

--- /tmp/rforge-pre.txt 2010-08-26 04:30:33.0 -0400
+++ /tmp/rforge-post.txt2010-08-26 04:38:03.0 -0400
@@ -8,18 +8,18 @@
 AquaEnv_1.0-1.tar.gz   615059a5369d1aba149e6142fedffdde
 ArvoRe_0.1.6.tar.gzc955ae7c64c4270740172ad2219060ff
 BB_2010.7-1.tar.gz 4f85093ab24fac5c0b91539ec6efb8b7
-BCE_2.0.tar.gz 5a3fe3ecabbe2b2e278f6a48fc19d18d
-BIOMOD_1.1-5.tar.gzd2f74f21bc8858844f8d71627fd8e687
+BCE_2.0.tar.gz 65a968c586e729a1c1ca34a37f5c293a
+BIOMOD_1.1-5.tar.gz6929e5ad6a14709de7065286ec684942
 ...
-BTSPAS_2010.08.tar.gz  16b8f265846a512c329f0b52ba1924ab
+BTSPAS_2010.08.tar.gz  809a96b11f1094e95b217af113abd0ac
 ...
-BayesR_0.1-1.tar.gz72bd41c90845032eb9d15c4c6d086dec
+BayesFactorPCL_0.5.tar.gz  173ab741c399309314eff240a4c3cd6f
+BayesR_0.1-1.tar.gz9560b511f1b955a60529599672d58fea
 ...
-BiplotGUI_0.0-6.tar.gz 594b3a275cde018eaa74e1ef974dd522
+BiplotGUI_0.0-6.tar.gz 857a484fdba6cb97be4e42e38bb6d0fd
 ...
-IsoGene_1.0-18.tar.gz  679a5aecb7182474ed6a870fa52ca2e3
+IsoGene_1.0-18.tar.gz  f37572957b2a9846a8d738ec88ac8690

and so forth.  I've not taken the time to understand why seemingly 
new versions are appearing without version bumps yet.


Is anyone aware of explanations, other than a release process that 
does not require unique versioning of differing content? [it seems 
pretty basic to me that a 'receiver' of new content could do the 
checks I do, and decline to push conflicting md5sums over an 
identically named prior candidate in archive]
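[Editor's note: the manifest comparison described above can be sketched in R itself. The snapshot directory names below are hypothetical, and this is only an illustration of the check, not the actual mirroring script:]

```r
## Compare md5 manifests from two mirror snapshots (hypothetical paths) and
## report tarballs whose content changed although the file name -- and hence
## the version number embedded in it -- did not.
pre  <- tools::md5sum(Sys.glob("mirror-pre/*.tar.gz"))
post <- tools::md5sum(Sys.glob("mirror-post/*.tar.gz"))
names(pre)  <- basename(names(pre))
names(post) <- basename(names(post))
common  <- intersect(names(pre), names(post))
changed <- common[pre[common] != post[common]]
print(changed)   # these should have had a version bump
```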


-- Russ herrold

__
r-h...@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide 
http://www.R-project.org/posting-guide.html

and provide commented, minimal, self-contained, reproducible code.



--
Spencer Graves, PE, PhD
President and Chief Operating Officer
Structure Inspection and Monitoring, Inc.
751 Emerson Ct.
San José, CA 95126
ph:  408-655-4567

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] reliability of R-Forge? (moved to R-Devel)

2010-08-26 Thread Spencer Graves

Hi, Russ:


	  As noted by Brian Peterson in a separate email, R-Forge has a 
"Revision" number in addition to the version number.  For example, the 
'fda' package is currently at version 2.2.3 with Rev.: 484 on R-Forge. 
Each SVN commit increments the Rev. number [after a successful build, I 
think], but the version number only changes if that commit includes a 
change to the version number in the DESCRIPTION file.  I don't know for 
sure, but I assume that the md5sums probably change with each Rev.



  If this is NOT correct, I hope someone who knows will clarify this.


  Best Wishes,
  Spencer



   I earlier moved a part of this thread to R-Devel, and got some 
replies there.



  At least one page on R-Forge says, "We are currently adapting the 
R-packages-plugin in order to work together with the new FusionForge 
infrastructure. Some services are thus not yet available."  I don't know 
if R-Forge is accepting new volunteers, but it looks like they could use 
help.  Unfortunately, I'm not in a position to volunteer.



  Best Wishes,
  Spencer Graves


On 8/26/2010 8:28 AM, R P Herrold wrote:

On Thu, 26 Aug 2010, Gavin Simpson wrote:


On Thu, 2010-08-26 at 02:30 -0400, David Kane wrote:

How reliable is R-Forge? http://r-forge.r-project.org/

It is down now (for me). Reporting "R-Forge Could Not Connect to
Database: "


late to chime in, so had tossed the first piece.  As this relates to
'reliability of R-Forge' in the sense of possible process issues,
rather than availability of the archive, I wanted to 'tag into' this
thread

I 'mirror' r-forge, so I have not seen this ...

One thing I note, mirroring r-forge, and processing 'diffs' between
successive days, is that the md5sums of some packages regularly change
without version number bumps.  From this morning's report in my email:

Thu Aug 26 04:30:01 EDT 2010

--- /tmp/rforge-pre.txt 2010-08-26 04:30:33.0 -0400
+++ /tmp/rforge-post.txt2010-08-26 04:38:03.0 -0400
@@ -8,18 +8,18 @@
 AquaEnv_1.0-1.tar.gz   615059a5369d1aba149e6142fedffdde
 ArvoRe_0.1.6.tar.gzc955ae7c64c4270740172ad2219060ff
 BB_2010.7-1.tar.gz 4f85093ab24fac5c0b91539ec6efb8b7
-BCE_2.0.tar.gz 5a3fe3ecabbe2b2e278f6a48fc19d18d
-BIOMOD_1.1-5.tar.gzd2f74f21bc8858844f8d71627fd8e687
+BCE_2.0.tar.gz 65a968c586e729a1c1ca34a37f5c293a
+BIOMOD_1.1-5.tar.gz6929e5ad6a14709de7065286ec684942
 ...
-BTSPAS_2010.08.tar.gz  16b8f265846a512c329f0b52ba1924ab
+BTSPAS_2010.08.tar.gz  809a96b11f1094e95b217af113abd0ac
 ...
-BayesR_0.1-1.tar.gz72bd41c90845032eb9d15c4c6d086dec
+BayesFactorPCL_0.5.tar.gz  173ab741c399309314eff240a4c3cd6f
+BayesR_0.1-1.tar.gz9560b511f1b955a60529599672d58fea
 ...
-BiplotGUI_0.0-6.tar.gz 594b3a275cde018eaa74e1ef974dd522
+BiplotGUI_0.0-6.tar.gz 857a484fdba6cb97be4e42e38bb6d0fd
 ...
-IsoGene_1.0-18.tar.gz  679a5aecb7182474ed6a870fa52ca2e3
+IsoGene_1.0-18.tar.gz  f37572957b2a9846a8d738ec88ac8690

and so forth.  I've not taken the time to understand why seemingly
new versions are appearing without version bumps yet.

Is anyone aware of explanations, other than a release process that
does not require unique versioning of differing content? [it seems
pretty basic to me that a 'receiver' of new content could do the
checks I do, and decline to push conflicting md5sums over an
identically named prior candidate in archive]

-- Russ herrold

__
r-h...@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide
http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.



--
Spencer Graves, PE, PhD
President and Chief Operating Officer
Structure Inspection and Monitoring, Inc.
751 Emerson Ct.
San José, CA 95126
ph:  408-655-4567

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Wiki entries on "package development process" and "software repository"

2010-09-08 Thread Spencer Graves
   I hereby invite anyone to make or suggest improvements to the 
Wikipedia entries on "package development process" and "software 
repository".



  Parts of these entries were created by Sundar Dorai-Raj and me:  
We believe that the procedures of the R community in these areas provide 
positive examples that could be profitably considered for people writing 
and sharing work in other languages.  I'm scheduled to speak about this 
next Tuesday to the San Francisco Bay Area chapter of the Association 
for Computing Machinery (ACM;  http://www.sfbayacm.org/?p=1962), but my 
interest in this extends beyond next Tuesday.  I may later send a note to 
the "Communications of the ACM" referencing these entries and inviting 
further input.



  Your help with this would be greatly appreciated, because I don't 
know enough to talk authoritatively about "package development process" 
and "software repository" for languages other than R.  If you know other 
people who might contribute perspectives for other languages, please 
feel free to forward this request to them.



  If you are a Wikipedian, feel free to change the entries 
directly.  Otherwise, I'd be pleased to hear your comments, suggested 
improvements, etc., via email.



  Thanks,
  Spencer Graves

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] How to connect R to Mysql?

2010-09-17 Thread Spencer Graves



  I've recently been through that with some success.  I don't 
remember all the details, but I first looked at 'help(package=RMySQL)'.  
This told me that the maintainer was Jeffrey Horner.  Google told me he 
was at Vanderbilt.  Eventually I found 
"http://biostat.mc.vanderbilt.edu/wiki/Main/RMySQL", which told me that 
I needed to build the package myself so it matches my version of 
MySQL, operating system, etc.  I did that.



  Does the MySQL database already exist?  I created a MySQL 
database and tables using MySQL server 5.1.50-win32.  (Which version of 
MySQL do you have?)



  help('RMySQL-package') includes "A typical usage".  That helped 
me get started, except that I needed to write to that database, not just 
query it.  For this, I think I got something like the following to work:



d <- dbReadTable(con, "WL")
dbWriteTable(con, "WL2", a.data.frame)  ## table from a data.frame
dbWriteTable(con, "test2", "~/data/test2.csv") ## table from a file


  Hope this helps.
  Spencer


On 9/17/2010 7:55 AM, Arijeet Mukherjee wrote:

I installed the RMySql package in R 2.11.1 64 bit
Now how can I connect R with MySql?
I am using a windows 7 64 bit version.
Please help ASAP.




--
Spencer Graves, PE, PhD
President and Chief Operating Officer
Structure Inspection and Monitoring, Inc.
751 Emerson Ct.
San José, CA 95126
ph:  408-655-4567

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] How to set up an own package repository

2010-09-17 Thread Spencer Graves
 This is also discussed in "Creating R Packages, Using CRAN, R-Forge, 
And Local R Archive Networks And Subversion (SVN) Repositories" by 
Spencer Graves and Sundar Dorai-Raj, available from CRAN -> contributed 
documentation: 
"http://cran.fhcrc.org/doc/contrib/Graves+DoraiRaj-RPackageDevelopment.pdf". 



Hope this helps. Spencer


On 9/17/2010 9:27 AM, Janko Thyson wrote:

Thank you very much for the advice!

Cheers,
Janko


-Ursprüngliche Nachricht-
Von: Friedrich Leisch [mailto:friedrich.lei...@stat.uni-muenchen.de]
Gesendet: Freitag, 17. September 2010 16:39
An: Janko Thyson
Cc: r-de...@r-project.org
Betreff: Re: [Rd] How to set up an own package repository


On Fri, 17 Sep 2010 12:16:47 +0200,
Janko Thyson (JT) wrote:

   >  Dear List,
   >  I'd like to set up a package repository so I can use
install.packages() on
   >  it for home-grown packages. I set up an AMPP infrastructure on a
windows box
   >  already, but I'm pretty lost with respect to what to do next as I
didn't do
   >  any web-programming/admin yet. Could anyone recommend some r-
specific
   >  tutorials or has a couple of suggestions for me? I've had a look at
the
   >  official R manual, but it just describes the required repository
structure,
   >  but not how to implement that. I'd also be willing to dive into SVN
and
   >  alikes if you think that's best practice.

If all machines involved can mount the repository as a network drive
you need no webserver at all, just use a file:/path/to/repository URL
for the repository.
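[Editor's note: a minimal sketch of such a file-based repository, with hypothetical directory names. The layout follows the repository structure described in the R manuals, and tools::write_PACKAGES() creates the index files that install.packages() needs:]

```r
## Create the source-package layout that install.packages() expects,
## then index any tarballs copied into it.
repo    <- file.path(tempdir(), "myrepo")
contrib <- file.path(repo, "src", "contrib")
dir.create(contrib, recursive = TRUE)

## Copy home-grown source tarballs (e.g. mypkg_0.1.tar.gz) into 'contrib',
## then build the PACKAGES index:
tools::write_PACKAGES(contrib, type = "source")

## From any machine that can see this path:
## install.packages("mypkg", repos = paste0("file:", repo), type = "source")
```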

If you want a full featured web frontend you may want to have a look
at the Bioconductor scripts for generating repositories:

http://bioconductor.org/packages/2.7/bioc/html/biocViews.html

and especially

http://bioconductor.org/packages/2.7/bioc/vignettes/biocViews/inst/doc/
createReposHtml.pdf

The scripts for CRAN are also in R but very specific for CRANs needs
...

Best,
Fritz

--
---
Prof. Dr. Friedrich Leisch

Institut für Statistik  Tel: (+49 89) 2180 3165
Ludwig-Maximilians-Universität  Fax: (+49 89) 2180 5308
Ludwigstraße 33
D-80539 München http://www.statistik.lmu.de/~leisch
---
Journal Computational Statistics --- http://www.springer.com/180
   Münchner R Kurse --- http://www.statistik.lmu.de/R


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel





--
Spencer Graves, PE, PhD
President and Chief Operating Officer
Structure Inspection and Monitoring, Inc.
751 Emerson Ct.
San José, CA 95126
ph:  408-655-4567

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] How to connect R to Mysql?

2010-09-17 Thread Spencer Graves

 Hi, Thomas:


  You use RODBC to connect to MySQL?


  Thanks, Spencer


On 9/17/2010 9:26 AM, Thomas Etheber wrote:

I also had problems connecting via RMysql on Windows several weeks ago.
I decided to skip the package and now use RODBC, which runs stable out 
of the box. Perhaps you should have a look at this package.


Hth
Thomas

Am 17.09.2010 17:50, schrieb Spencer Graves:



  I've recently been through that with some success.  I don't 
remember all the details, but I first looked at 'help(package=RMySQL)'.  
This told me that the maintainer was Jeffrey Horner.  Google told me 
he was at Vanderbilt.  Eventually I found 
"http://biostat.mc.vanderbilt.edu/wiki/Main/RMySQL", which told me 
that I needed to build the package myself so it matches my version 
of MySQL, operating system, etc.  I did that.



  Does the MySQL database already exist?  I created a MySQL 
database and tables using MySQL server 5.1.50-win32.  (Which version 
of MySQL do you have?)



  help('RMySQL-package') includes "A typical usage".  That helped 
me get started, except that I needed to write to that database, not 
just query it.  For this, I think I got something like the following 
to work:



d <- dbReadTable(con, "WL")
dbWriteTable(con, "WL2", a.data.frame)  ## table from a data.frame
dbWriteTable(con, "test2", "~/data/test2.csv") ## table from a file


  Hope this helps.
  Spencer


On 9/17/2010 7:55 AM, Arijeet Mukherjee wrote:

I installed the RMySql package in R 2.11.1 64 bit
Now how can I connect R with MySql?
I am using a windows 7 64 bit version.
Please help ASAP.






______
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel





--
Spencer Graves, PE, PhD
President and Chief Operating Officer
Structure Inspection and Monitoring, Inc.
751 Emerson Ct.
San José, CA 95126
ph:  408-655-4567

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] R-Forge Downtime

2010-11-08 Thread Spencer Graves
Hi, Stefan:  Thanks for all your hard work to make R-Forge available for 
the rest of us.  Best Wishes, Spencer



On 11/8/2010 4:59 AM, Stefan Theussl wrote:

To all R-Forge developers/users:

Please note the short (<30min.) R-Forge downtime today at 18:00 CET.

Details:

We need to extend disk space of the system hosting the core R-Forge
components. This will lead to a short (<30min.) downtime of R-Forge
today at 18:00 CET. This is necessary due to the unreliability of
R-Forge especially on weekends, when log files have to be copied to a
backup location. Sometimes the system got unresponsive for a couple of
hours from sunday late evening to monday morning.

We are sorry for the inconvenience caused.

Best regards,
The R-Forge Administration and Development Team

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel





--
Spencer Graves, PE, PhD
President and Chief Operating Officer
Structure Inspection and Monitoring, Inc.
751 Emerson Ct.
San José, CA 95126
ph:  408-655-4567

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] how to store package options over sessions?

2010-11-20 Thread Spencer Graves
  If you want the information to be kept within a specific working 
directory, and forgotten when the user changes the working directory, 
then you can store the results in getwd() using something like the 
following:



save(object1toSave, object2toSave, ...,
     file = "MyPackageName_saveFile.rda")



When the package is loaded, e.g., library(MyPackageName), you can use 
something like the following to restore it:



if (file.exists("MyPackageName_saveFile.rda"))
    load("MyPackageName_saveFile.rda")



  I haven't tried it, but it looks like it should work AND be 
platform independent (AND would not require SQLite).



  Dirk's solution seems better if you do NOT want the saved 
information to be lost when the user executes library(MyPackageName) 
within a different working directory.
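[Editor's note: combining these ideas, here is a hedged sketch of how a package might persist its own options per working directory via a load hook. The package name, option names, and file name are all hypothetical:]

```r
## In a hypothetical package 'MyPackageName':
saveFile <- "MyPackageName_saveFile.rda"

saveMyOptions <- function() {
  ## keep only this package's options, e.g. options(MyPackageName.verbose = TRUE)
  opts   <- options()
  myOpts <- opts[grep("^MyPackageName\\.", names(opts))]
  save(myOpts, file = saveFile)
}

.onLoad <- function(libname, pkgname) {
  ## restore saved options when loading from the same working directory
  if (file.exists(saveFile)) {
    load(saveFile)       # brings back 'myOpts'
    options(myOpts)      # options() accepts a named list
  }
}
```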



  Hope this helps.
  Spencer


On 11/20/2010 9:51 AM, Dirk Eddelbuettel wrote:

On 20 November 2010 at 17:12, Mark Heckmann wrote:
| Hi,
| I posted this a week ago on r-help but did not get an answer. So I hope that 
someone here can help me:
| I want to define some options for my package the user may change.
| It would be convenient if the changes could be saved when terminating
| an R session and recovered automatically on the next package load.
|
| Is that possible, and if yes, what is the standard way to implement it?

First off:

R>  fortunes:::fortune("yoda")

Evelyn Hall: I would like to know how (if) I can extract some of the
information from the summary of my nlme.
Simon Blomberg: This is R. There is no if. Only how.
   -- Evelyn Hall and Simon 'Yoda' Blomberg
  R-help (April 2005)

R>

Secondly, what you ask is necessarily rather OS-dependent. So of course it
can be done, but probably in way that depend on your local circumstance.

Thirdly, and to make this a little more helpful, I frequently use the RSQLite
package to cache data across sessions and invocations. For a large example,
consider the CRANberries feed (http://dirk.eddelbuettel.com/cranberries/)
which stores in SQLite what the state of the (CRAN) world was the last time it
got woken up by crontab.  I also have a few smaller ad-hoc things at work
that do similar things.

Depending on your local circumstances you may make this 'cache' of stateful
information read-only or read/write, make it a file on NFS or CIFS or WebDAV,
make it a database that can be read as a file or over sockets, and so on.

And Yoda still rules.

Dirk




--
Spencer Graves, PE, PhD
President and Chief Operating Officer
Structure Inspection and Monitoring, Inc.
751 Emerson Ct.
San José, CA 95126
ph:  408-655-4567

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] R Tools & Vista_x64: Problem compiling RMySQL?

2010-11-26 Thread Spencer Graves

Hello:


	  What do I need to do to compile R packages (especially RMySQL) for 
the 64-bit version of R 2.12.0 under Vista_x64?



	  I upgraded to "Rtools212.exe" yesterday but immediately got errors 
from "R CMD check RMySQL_0.7-5.tar.gz".  After the first error, I 
installed it a second time, then got "undefined reference to" roughly 50 
different names beginning "mysql_";  copied below.  I see two possible 
sources for this problem:



(1) RTools212 may not be installed properly.


		(2) RMySQL may be incompatible with R x64 2.12.0, especially with 
a 32-bit version of MySQL.



	  NOTE:  RMySQL worked with R2.11.1 (and MySQL 5.1.50-community server) 
before I installed R2.12.0.  I'm reasonably sure that my local 
installation of MySQL is only 32-bit.



  What do you suggest?  Should I use the 32-bit version of R 2.12.0?


  Thanks,
  Spencer
#
* installing *source* package 'RMySQL' ...
checking for $MYSQL_HOME... C:/PROGRA~2/MySQL/MYSQLS~1.1/
cygwin warning:
  MS-DOS style path detected: C:/PROGRA~2/MySQL/MYSQLS~1.1/
  Preferred POSIX equivalent is: /cygdrive/c/PROGRA~2/MySQL/MYSQLS~1.1/
  CYGWIN environment variable option "nodosfilewarning" turns off this 
warning.

  Consult the user's guide for more details about POSIX paths:
http://cygwin.com/cygwin-ug-net/using.html#using-pathnames
** libs
Warning: this package has a non-empty 'configure.win' file,
so building only the main architecture

cygwin warning:
  MS-DOS style path detected: C:/Users/sgraves/R/R-212~1.0/etc/x64/Makeconf
  Preferred POSIX equivalent is: 
/cygdrive/c/Users/sgraves/R/R-212~1.0/etc/x64/Makeconf
  CYGWIN environment variable option "nodosfilewarning" turns off this 
warning.

  Consult the user's guide for more details about POSIX paths:
http://cygwin.com/cygwin-ug-net/using.html#using-pathnames
x86_64-w64-mingw32-gcc -I"C:/Users/sgraves/R/R-212~1.0/include" 
-I"C:/PROGRA~2/MySQL/MYSQLS~1.1/"/include -O2 -Wall -std=gnu99 
-c RS-DBI.c -o RS-DBI.o
x86_64-w64-mingw32-gcc -I"C:/Users/sgraves/R/R-212~1.0/include" 
-I"C:/PROGRA~2/MySQL/MYSQLS~1.1/"/include -O2 -Wall -std=gnu99 
-c RS-MySQL.c -o RS-MySQL.o
x86_64-w64-mingw32-gcc -shared -s -static-libgcc -o RMySQL.dll tmp.def 
RS-DBI.o RS-MySQL.o C:/PROGRA~2/MySQL/MYSQLS~1.1//bin/libmySQL.dll 
-LC:/Users/sgraves/R/R-212~1.0/bin/x64 -lR
RS-MySQL.o:RS-MySQL.c:(.text+0xb0): undefined reference to 
`mysql_more_results'

RS-MySQL.o:RS-MySQL.c:(.text+0x2c6): undefined reference to `mysql_init'
RS-MySQL.o:RS-MySQL.c:(.text+0x2d9): undefined reference to `mysql_options'
RS-MySQL.o:RS-MySQL.c:(.text+0x2ef): undefined reference to `mysql_options'
RS-MySQL.o:RS-MySQL.c:(.text+0x305): undefined reference to `mysql_options'
RS-MySQL.o:RS-MySQL.c:(.text+0x338): undefined reference to 
`mysql_real_connect'

RS-MySQL.o:RS-MySQL.c:(.text+0x399): undefined reference to `mysql_close'
RS-MySQL.o:RS-MySQL.c:(.text+0x3d1): undefined reference to `mysql_error'
RS-MySQL.o:RS-MySQL.c:(.text+0x7a2): undefined reference to `mysql_close'
RS-MySQL.o:RS-MySQL.c:(.text+0x80f): undefined reference to 
`mysql_fetch_fields'
RS-MySQL.o:RS-MySQL.c:(.text+0x823): undefined reference to 
`mysql_field_count'
RS-MySQL.o:RS-MySQL.c:(.text+0xae7): undefined reference to 
`mysql_next_result'
RS-MySQL.o:RS-MySQL.c:(.text+0xb0b): undefined reference to 
`mysql_use_result'
RS-MySQL.o:RS-MySQL.c:(.text+0xb16): undefined reference to 
`mysql_field_count'
RS-MySQL.o:RS-MySQL.c:(.text+0xbff): undefined reference to 
`mysql_affected_rows'
RS-MySQL.o:RS-MySQL.c:(.text+0xd27): undefined reference to 
`mysql_fetch_row'
RS-MySQL.o:RS-MySQL.c:(.text+0xd3d): undefined reference to 
`mysql_fetch_lengths'

RS-MySQL.o:RS-MySQL.c:(.text+0xf2e): undefined reference to `mysql_errno'
RS-MySQL.o:RS-MySQL.c:(.text+0x1093): undefined reference to `mysql_errno'
RS-MySQL.o:RS-MySQL.c:(.text+0x109e): undefined reference to `mysql_error'
RS-MySQL.o:RS-MySQL.c:(.text+0x1114): undefined reference to 
`mysql_fetch_row'
RS-MySQL.o:RS-MySQL.c:(.text+0x1121): undefined reference to 
`mysql_free_result'

RS-MySQL.o:RS-MySQL.c:(.text+0x11f0): undefined reference to `mysql_query'
RS-MySQL.o:RS-MySQL.c:(.text+0x1200): undefined reference to 
`mysql_use_result'
RS-MySQL.o:RS-MySQL.c:(.text+0x120b): undefined reference to 
`mysql_field_count'

RS-MySQL.o:RS-MySQL.c:(.text+0x128f): undefined reference to `mysql_query'
RS-MySQL.o:RS-MySQL.c:(.text+0x12ac): undefined reference to `mysql_error'
RS-MySQL.o:RS-MySQL.c:(.text+0x133c): undefined reference to 
`mysql_affected_rows'
RS-MySQL.o:RS-MySQL.c:(.text+0x1539): undefined reference to 
`mysql_get_client_info'
RS-MySQL.o:RS-MySQL.c:(.text+0x172d): undefined reference to 
`mysql_get_host_info'
RS-MySQL.o:RS-MySQL.c:(.text+0x174b): undefined reference to 
`mysql_get_server_info'
RS-MySQL.o:RS-MySQL.c:(.text+0x176d): undefined reference to 
`mysql_get_proto_info'
RS-MySQL.o:RS-MySQL.c:(.text+0x177c): undefined reference

Re: [Rd] R Tools & Vista_x64: Problem compiling RMySQL?

2010-11-26 Thread Spencer Graves

Thanks for the reminder not to cross post;  by now, I  should know better.


Thanks to Duncan for his reply, which helped convince me that I should 
compile for i386 only.  The problem seemed to disappear after I modified 
the path.



Spencer


On 11/26/2010 12:44 PM, Prof Brian Ripley wrote:
I've removed R-sig-db.  PLEASE don't cross-post, not least because the 
R-sig-db moderator (me) ends up having to approve all the 
non-subscribed replies such as Duncan's.


On Fri, 26 Nov 2010, Duncan Murdoch wrote:


On 26/11/2010 1:06 PM, Spencer Graves wrote:

Hello:


  What do I need to do to compile R packages (especially RMySQL) 
for

the 64-bit version of R 2.12.0 under Vista_x64?


The symptoms you're seeing are because the linker can't find functions in

libmySQL.dll

which it is looking for in the somewhat strange path

C:/PROGRA~2/MySQL/MYSQLS~1.1//bin/libmySQL.dll


Does that file exist?  Is it a 64 bit dll, compatible with MinGW?  Is 
it compiled under the same convention as R, where no underscores are 
used in external names?  (The latter two questions can probably be 
answered by looking at "objdump -x libmySQL.dll".  objdump.exe is 
distributed as part of the MinGW distribution in Rtools.)


For x64, you need (or at least, it is more correct to use) 
x86_64-w64-mingw32-objdump.  But I think he has pretty much told us 
that it is a 32-bit DLL.  To install RMySQL on x64 Windows you need 
the 64-bit client DLLs.  The standard MySQL installers do not allow 
you to install 32-bit and 64-bit MySQL on the same machine, but once 
you manage that, 32-bit RMySQL can talk to a 64-bit MySQL server, and vice versa.


BTW, the underscore convention does not matter for DLLs, only object 
files (.o, .a).
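
As a sketch of the objdump check suggested above, the inspection can also be driven from within R. This assumes the Rtools objdump is on the PATH; the DLL path is the one from the build log below and may not match a given installation:

```r
## Hypothetical check of whether libmySQL.dll is a 32- or 64-bit PE file
dll <- "C:/PROGRA~2/MySQL/MYSQLS~1.1/bin/libmySQL.dll"
out <- system(paste("x86_64-w64-mingw32-objdump -f", shQuote(dll)),
              intern = TRUE)
grep("file format", out, value = TRUE)  # "pe-i386" = 32-bit, "pe-x86-64" = 64-bit
```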


This does all work (I passed patches back to the maintainer so that it 
does).  I do have a bi-arch Windows install of RMySQL on my machine 
talking to a 64-bit MySQL server.  But we've had far too much 
frustration with (even very minor) MySQL version mismatches to even 
think about distributing such a build.



Duncan Murdoch



I upgraded to "Rtools212.exe" yesterday but immediately got errors
from "R CMD check RMySQL_0.7-5.tar.gz".  After the first error, I
installed it a second time, then got "undefined reference to" 
roughly 50

different names beginning "mysql_";  copied below.  I see two possible
sources for this problem:


(1) RTools212 may not be installed properly.


(2) RMySQL may be incompatible with R x64 2.12.0, 
especially with

a 32-bit version of MySQL.


  NOTE:  RMySQL worked with R2.11.1 (and MySQL 5.1.50-community 
server)


But you failed to tell us the 'at a minimum' information we asked for 
about either version of R.  If that was i386 R, then yes, your MySQL 
is 32-bit.
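
For reference, a running R session can report its own architecture; a minimal sketch:

```r
## Which architecture is this R session?
R.version$arch            # "i386" for 32-bit R, "x86_64" for 64-bit R
.Machine$sizeof.pointer   # 4 under 32-bit R, 8 under 64-bit R
```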



before I installed R2.12.0.  I'm reasonably sure that my local
installation of MySQL is only 32-bit.


  What do you suggest?  Should I use the 32-bit version of R 2.12.0?


  Thanks,
  Spencer
#
* installing *source* package 'RMySQL' ...
checking for $MYSQL_HOME... C:/PROGRA~2/MySQL/MYSQLS~1.1/
cygwin warning:
MS-DOS style path detected: C:/PROGRA~2/MySQL/MYSQLS~1.1/
Preferred POSIX equivalent is: 
/cygdrive/c/PROGRA~2/MySQL/MYSQLS~1.1/
CYGWIN environment variable option "nodosfilewarning" turns off 
this

warning.
Consult the user's guide for more details about POSIX paths:
  http://cygwin.com/cygwin-ug-net/using.html#using-pathnames
** libs
Warning: this package has a non-empty 'configure.win' file,
so building only the main architecture

cygwin warning:
MS-DOS style path detected: 
C:/Users/sgraves/R/R-212~1.0/etc/x64/Makeconf

Preferred POSIX equivalent is:
/cygdrive/c/Users/sgraves/R/R-212~1.0/etc/x64/Makeconf
CYGWIN environment variable option "nodosfilewarning" turns off 
this

warning.
Consult the user's guide for more details about POSIX paths:
  http://cygwin.com/cygwin-ug-net/using.html#using-pathnames
x86_64-w64-mingw32-gcc -I"C:/Users/sgraves/R/R-212~1.0/include"
-I"C:/PROGRA~2/MySQL/MYSQLS~1.1/"/include -O2 -Wall  -std=gnu99
-c RS-DBI.c -o RS-DBI.o
x86_64-w64-mingw32-gcc -I"C:/Users/sgraves/R/R-212~1.0/include"
-I"C:/PROGRA~2/MySQL/MYSQLS~1.1/"/include -O2 -Wall  -std=gnu99
-c RS-MySQL.c -o RS-MySQL.o
x86_64-w64-mingw32-gcc -shared -s -static-libgcc -o RMySQL.dll tmp.def
RS-DBI.o RS-MySQL.o C:/PROGRA~2/MySQL/MYSQLS~1.1//bin/libmySQL.dll
-LC:/Users/sgraves/R/R-212~1.0/bin/x64 -lR
RS-MySQL.o:RS-MySQL.c:(.text+0xb0): undefined reference to `mysql_more_results'
RS-MySQL.o:RS-MySQL.c:(.text+0x2c6): undefined reference to `mysql_init'
RS-MySQL.o:RS-MySQL.c:(.text+0x2d9): undefined reference to `mysql_options'
RS-MySQL.o:RS-MySQL.c:(.text+0x2ef): undefined reference to `mysql_options'

Re: [Rd] [Rcpp-devel] GPL and R Community Policies (Rcpp)

2010-12-01 Thread Spencer Graves

Hi, Dominick, et al.:


  I know nothing about Rcpp, its history, or the contributions of 
Dominick and anyone else.  I think everyone should be 
appropriately recognized for their contributions.



  However, I feel compelled to briefly outline personal experiences 
with collaborators who were so concerned that their contribution be 
properly recognized that it limited our success.  To successfully 
commercialize the ideas, we needed the collaboration of others.  
However, my collaborators' excessive concern about getting "their share" 
made it exceedingly and unreasonably difficult to obtain the extra help 
we needed.



  A famous example of this was the Wright Brothers.  They  invented 
the airplane and spent much of the rest of their lives trying to defend 
their patent.  Wilbur was dead long before it was settled, and Orville 
got so little from it that it was clearly a massive waste of their 
time.  Moreover, "The legal threat suppressed development of the U.S. 
aviation industry." 
(http://en.wikipedia.org/wiki/The_Wright_brothers_patent_war)



  I sincerely hope that this present discussion can be settled in a 
way that does not damage the incredibly productive collaboration that 
has made R the overwhelming success it is.  The future of humanity is 
brighter because R makes it easier (a) for scientists to better 
understand the things they study and (b) for common people to better 
understand and manage the problems they face.



      Best Wishes,
  Spencer Graves


On 12/1/2010 4:20 PM, Dominick Samperi wrote:

On Wed, Dec 1, 2010 at 6:37 PM, Gabor Grothendieck
wrote:


On Wed, Dec 1, 2010 at 5:18 PM, Hadley Wickham  wrote:

Perhaps a wider community of R users can weigh in on a
policy decision that was implicitly deemed acceptable on this
thread. Namely, that it is fine to arbitrarily and
for no reason deprecate the contributions of past
authors, and as more progress is made, even more
disparaging remarks can be added.

What is disparaging about saying "a small portion of the code is based
on code written during 2005 and 2006 by Dominick Samperi"? I read this
as a factual statement saying that the current version of Rcpp is
based on, in a small way, your earlier work.

For reference, a disparaging comment would be something like: "This
package was based on code written by Hadley Wickham that made my eyes
bleed", or "The development of this package was driven by the godawful
code that Hadley wrote".



It's very difficult to truly assess relative contributions when you mix
in design, coding, level of effort, promotion, etc.   I would not
focus on the single word "disparaging".  I think the poster simply
used the wrong word and perhaps what he meant was more along the lines
of: as the creator of the package he presumably set the design (or
significant elements of the design) for all subsequent work and in
that respect even if it's true that the number of lines he generated is
relatively small compared to the current package, that phrase gives
the misleading impression that his contribution was also small.  There
is a difference between something that is true and non-misleading and
something that is true and misleading.


There is an important element of this discussion that is being overlooked,
namely, the timing. If indeed my contributions were minimal (and they
were not for the reasons you suggest) then why was it decided now,
for this particular release, to update my status? Why not the last
release? What changed? There were only a few new features added
to this release. What made the difference?

More importantly, as I suggested in my original post, this practice
sets an absurd precedent, one that motivated Stallman to write
the GNU manifesto (where he used the oxygen mask metaphor).
Should we reevaluate all contributors, present or past, and
adjust the level of deprecation on the
author line appropriately before each release?

I suspect that I have contributed far more than some of the
people listed on the author line. Does this mean that their
contributions should be discounted accordingly? If not,
why not?

Thanks for your courage. People who send supportive comments
tend to send them off-list, not wanting to state them publicly.

Dominick



--
Statistics&  Software Consulting
GKX Group, GKX Associates Inc.
tel: 1-877-GKX-GROUP
email: ggrothendieck at gmail.com


[[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel



__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] [Rcpp-devel] GPL and R Community Policies (Rcpp)

2010-12-01 Thread Spencer Graves

Hi, Dominick, et al.:


  I know nothing about Rcpp, its history, or the contributions of 
Dominick and anyone else.  I think everyone should be 
appropriately recognized for their contributions.



  However, I feel compelled to briefly outline personal experiences 
with collaborators who were so concerned that their contribution be 
properly recognized that it limited our success.  To successfully 
commercialize the ideas, we needed the collaboration of others.  
However, my collaborators' excessive concern about getting "their share" 
made it exceedingly and unreasonably difficult to obtain the extra help 
we needed.



  A famous example of this was the Wright Brothers.  They  invented 
the airplane and spent much of the rest of their lives trying to defend 
their patent.  Wilbur was dead long before it was settled, and Orville 
got so little from it that it was clearly a massive waste of their 
time.  Moreover, "The legal threat suppressed development of the U.S. 
aviation industry." 
(http://en.wikipedia.org/wiki/The_Wright_brothers_patent_war)



  I sincerely hope that this present discussion can be settled in a 
way that does not damage the incredibly productive collaboration that 
has made R the overwhelming success it is.  The future of humanity is 
brighter because R makes it easier (a) for scientists to better 
understand the things they study and (b) for common people to better 
understand and manage the problems they face.



      Best Wishes,
  Spencer Graves


On 12/1/2010 4:55 PM, Gabor Grothendieck wrote:

On Wed, Dec 1, 2010 at 7:20 PM, Dominick Samperi  wrote:

On Wed, Dec 1, 2010 at 6:37 PM, Gabor Grothendieck
wrote:

On Wed, Dec 1, 2010 at 5:18 PM, Hadley Wickham  wrote:

Perhaps a wider community of R users can weigh in on a
policy decision that was implicitly deemed acceptable on this
thread. Namely, that it is fine to arbitrarily and
for no reason deprecate the contributions of past
authors, and as more progress is made, even more
disparaging remarks can be added.

What is disparaging about saying "a small portion of the code is based
on code written during 2005 and 2006 by Dominick Samperi"? I read this
as a factual statement saying that the current version of Rcpp is
based on, in a small way, your earlier work.

For reference, a disparaging comment would be something like: "This
package was based on code written by Hadley Wickham that made my eyes
bleed", or "The development of this package was driven by the godawful
code that Hadley wrote".



It's very difficult to truly assess relative contributions when you mix
in design, coding, level of effort, promotion, etc.   I would not
focus on the single word "disparaging".  I think the poster simply
used the wrong word and perhaps what he meant was more along the lines
of: as the creator of the package he presumably set the design (or
significant elements of the design) for all subsequent work and in
that respect even if it's true that the number of lines he generated is
relatively small compared to the current package, that phrase gives
the misleading impression that his contribution was also small.  There
is a difference between something that is true and non-misleading and
something that is true and misleading.

There is an important element of this discussion that is being overlooked,
namely, the timing. If indeed my contributions were minimal (and they
were not for the reasons you suggest) then why was it decided now,
for this particular release, to update my status? Why not the last
release? What changed? There were only a few new features added
to this release. What made the difference?

More importantly, as I suggested in my original post, this practice
sets an absurd precedent, one that motivated Stallman to write
the GNU manifesto (where he used the oxygen mask metaphor).
Should we reevaluate all contributors, present or past, and
adjust the level of deprecation on the
author line appropriately before each release?

I suspect that I have contributed far more than some of the
people listed on the author line. Does this mean that their
contributions should be discounted accordingly? If not,
why not?

Thanks for your courage. People who send supportive comments
tend to send them off-list, not wanting to state them publicly.


Just to be clear I have never used the package and am not truly
commenting on this particular case but only the general ideas in this
thread.  Also I was not suggesting that the comments in the code were
purposefully misleading, only that they might be misleading since they
could be interpreted in terms of contribution even though they are
stated in terms of lines of code.  The author of the phrase may very
well have felt that the current team had done a lot of work to add
design ideas and develop and promote the software but perhaps the
unfortunate way in how it was expressed in that phrase that came out
as a 

Re: [Rd] GPL and R Community Policies (Rcpp)

2010-12-02 Thread Spencer Graves
y, the fact that the word "copyright" is profoundly misleading in
the context of GPL is not a new idea, and the word "copyleft" is
sometimes used instead. But copyleft is not used in source files
because this would unlink GPL from the well-established legal
framework associated with "copyright", making it more difficult for
the FSF to enforce its principles (the critical link is provided by
the copyright holders or "deputies").

A final clarification: authors of original works do retain a legal
copyright on  their original work in the sense that they are free
to modify this work and release it as non-free software (or
under a different free license), but this has no effect on the
version that was released under GPL. The latter version and
all of its progeny belong to the public (or to the FSF from
a legal point of view).

Please feel free to express your opinion on these matters.

Thanks,
Dominick

[[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


---
This message and its attachments are strictly confidenti...{{dropped:8}}

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel






--
Spencer Graves, PE, PhD
President and Chief Operating Officer
Structure Inspection and Monitoring, Inc.
751 Emerson Ct.
San José, CA 95126
ph:  408-655-4567

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] 0.5 != integrate(dnorm,0,20000) = 0

2010-12-06 Thread Spencer Graves

Hello:


  The example "integrate(dnorm,0,20000)" says it "fails on many 
systems".  I just got 0 from it, when I should have gotten either an 
error or something close to 0.5.  I got this with R 2.12.0 under both 
Windows Vista_x64 and Linux (Fedora 13);  see the results from Windows 
below.  I thought you might want to know.
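
A minimal sketch of the symptom and two cross-checks, assuming the stock integrate() and pnorm():

```r
integrate(dnorm, 0, 20000)  # may report 0 with absolute error < 0
integrate(dnorm, 0, Inf)    # the infinite bound triggers a transformation: 0.5
pnorm(20000) - pnorm(0)     # closed-form answer for comparison: 0.5
```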



  Thanks for all your work in creating and maintaining R.


  Best Wishes,
  Spencer Graves
###

integrate(dnorm,0,20000) ## fails on many systems
0 with absolute error < 0
> sessionInfo()
R version 2.12.0 (2010-10-15)
Platform: i386-pc-mingw32/i386 (32-bit)

locale:
[1] LC_COLLATE=English_United States.1252
[2] LC_CTYPE=English_United States.1252
[3] LC_MONETARY=English_United States.1252
[4] LC_NUMERIC=C
[5] LC_TIME=English_United States.1252

attached base packages:
[1] stats graphics  grDevices utils datasets  methods   base

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Suggested change to integrate.Rd (was: Re: 0.5 != integrate(dnorm, 0, 20000) = 0)

2010-12-07 Thread Spencer Graves
  What do you think about changing the verbiage with that example 
in "integrate.Rd" from "fails on many systems" to something like

"gives wrong answer without warning on many systems"?


  If I had write access to the core R code, I'd change this 
myself:  I'm probably not the only user who might think that saying 
something "fails" suggests it gives an error message.  Many contributions 
on this thread make it clear that it will never be possible to write an 
integrate function that won't give a "wrong answer without warning" in 
some cases.



  Thanks,
  Spencer


#
On 12/7/2010 7:02 AM, John Nolan wrote:

Putting in Inf for the upper bound does not work in general:
all 3 of the following should give 0.5


integrate( dnorm, 0, Inf )

0.5 with absolute error<  4.7e-05


integrate( dnorm, 0, Inf, sd=1e5 )

Error in integrate(dnorm, 0, Inf, sd = 1e+05) :
   the integral is probably divergent


integrate( dnorm, 0, Inf, sd=1000 )

5.570087e-05 with absolute error<  0.00010

Numerical quadrature methods look at a finite number of
points, and you can find examples that will confuse any
algorithm.  Rather than hope a general method will solve
all problems, users should look at their integrand and
pick an appropriate region of integration.

John Nolan, American U.


-r-devel-boun...@r-project.org wrote: -
To: r-devel@r-project.org
From: Pierre Chausse
Sent by: r-devel-boun...@r-project.org
Date: 12/07/2010 09:46AM
Subject: Re: [Rd] 0.5 != integrate(dnorm,0,20000) = 0

   The warning about "absolute error == 0" would not be sufficient
because if you do
  >  integrate(dnorm, 0, 5000)
2.326323e-06 with absolute error<  4.6e-06

We get reasonable absolute error and wrong answer. For very high upper
bound, it seems more stable to use "Inf". In that case, another
.External is used which seems to be optimized for high or low bounds:

  >  integrate(dnorm, 0,Inf)
0.5 with absolute error<  4.7e-05


On 10-12-07 8:38 AM, John Nolan wrote:

I have wrestled with this problem before.  I think correcting
the warning to "absolute error ~<= 0" is a good idea, and printing
a warning if subdivisions==1 is also helpful.  Also, including
a simple example like the one that started this thread on the
help page for integrate might make the issue more clear to users.

But min.subdivisions is probably not.  On the example with dnorm( ),
I doubt 3 subdivisions would work.  The problem isn't that
we aren't subdividing enough; the problem is that the integrand
is 0 (in double precision) on most of the region and the
algorithm isn't designed to handle this.  There is no way to
determine how many subdivisions are necessary to get a reasonable
answer without a detailed analysis of the integrand.

I've gotten useful results with integrands that are monotonic on
the tail with a "self-trimming integration" routine
like the following:


right.trimmed.integrate<- function( f, lower, upper, epsilon=1e-100, 
min.width=1e-10, ... ) {

+ # trim the region of integration on the right until f(x)>   epsilon
+
+ a<- lower; b<- upper
+ while ( (b-a>min.width)&&   (f(b)
right.trimmed.integrate( dnorm, 0, 20000 )  # test

0.5 with absolute error<   9.2e-05
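
The listing above was mangled in transit: the body of the while loop is missing. The following is a hedged reconstruction that matches the quoted output; the bisection step `b <- (a + b) / 2` is an assumption, not recovered text:

```r
right.trimmed.integrate <- function(f, lower, upper,
                                    epsilon = 1e-100, min.width = 1e-10, ...) {
  ## trim the region of integration on the right until f(b) > epsilon
  a <- lower; b <- upper
  while ((b - a > min.width) && (f(b) < epsilon))
    b <- (a + b) / 2
  integrate(f, lower, b, ...)
}
```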

This can be adapted to left trim or (left and right) trim, abs(f(x)-c)>epsilon,
etc.  Setting the tolerances epsilon and min.width is an issue,
but an explicit discussion of these values could encourage people to
think about the problem in their specific case.  And of course, none
of this guarantees a correct answer, especially if someone tries this
on non-monotonic integrands with complicated 0 sets.  One could write
a somewhat more user-friendly version where the user has to specify
some property (or set of properties) of the integrand, e.g. "right-tail
decreasing to 0", etc. and have the algorithm try to do smart
trimming based on this.  But perhaps this is getting too involved.

In the end, there is no general solution because any solution
depends on the specific nature of the integrand.  Clearer messages,
warnings in suspicious cases like subdivisions==1, and a simple
example explaining what the issue is in the help page would help
some people.

John

   ...

   John P. Nolan
   Math/Stat Department
   227 Gray Hall
   American University
   4400 Massachusetts Avenue, NW
   Washington, DC 20016-8050

   jpno...@american.edu
   202.885.3140 voice
   202.885.3155 fax
   http://academic2.american.edu/~jpnolan
   ...

-r-devel-boun...@r-project.org wrote: -
To: r-devel@r-project.org, Prof Brian Ripley
From: Martin Maechler
Sent by: r-devel-boun...@r-project.org
Date: 12/07/2010 03:29AM
Subject: Re:

Re: [Rd] Suggested change to integrate.Rd

2010-12-08 Thread Spencer Graves

Hi, John:


Maybe change it to something like "gives wrong answer without warning on 
many systems (see 'Note' above)", as the 'Note' does provide more detail.



Thanks,
Spencer


On 12/7/2010 8:08 PM, John Nolan wrote:

R developers understand intimately how things work, and terse
descriptions are sufficient.  However, most typical R users
would benefit from clearer documentation.  In multiple places
I've found the R documentation to be correct and understandable
AFTER I've figured a function out.

And to be fair, this problem with integrate( ) isn't really R's
fault: the QUADPACK routines that R uses are very good algorithms,
but neither they nor any other package can handle all cases.

I would support reasonable changes in the documentation for
integrate( ).   Just saying it "gives wrong answer without
warning on many systems" seems misleading (it works fine in
many cases) and it doesn't help a user understand how to use
integrate( ) correctly/carefully.  IMO a simple example like
this one w/ dnorm would catch people's attention and a couple
lines of explanation/warning would then make more sense.

John Nolan, American U


-Spencer Graves  wrote: -
To: John Nolan
From: Spencer Graves
Date: 12/07/2010 07:58PM
Cc: pchau...@uwaterloo.ca, r-devel@r-project.org
Subject: Suggested change to integrate.Rd (was: Re: [Rd] 0.5 != 
integrate(dnorm,0,20000) = 0)

What do you think about changing the verbiage with that example
in "integrate.Rd" from "fails on many systems" to something like
"gives wrong answer without warning on many systems"?


If I had write access to the core R code, I'd change this
myself:  I'm probably not the only user who might think that saying
something "fails" suggests it gives an error message.  Many contributions
on this thread make it clear that it will never be possible to write an
integrate function that won't give a "wrong answer without warning" in
some cases.


Thanks,
Spencer


#
On 12/7/2010 7:02 AM, John Nolan wrote:

Putting in Inf for the upper bound does not work in general:
all 3 of the following should give 0.5


integrate( dnorm, 0, Inf )

0.5 with absolute error<   4.7e-05


integrate( dnorm, 0, Inf, sd=1e5 )

Error in integrate(dnorm, 0, Inf, sd = 1e+05) :
the integral is probably divergent


integrate( dnorm, 0, Inf, sd=1000 )

5.570087e-05 with absolute error<   0.00010

Numerical quadrature methods look at a finite number of
points, and you can find examples that will confuse any
algorithm.  Rather than hope a general method will solve
all problems, users should look at their integrand and
pick an appropriate region of integration.

John Nolan, American U.


-r-devel-boun...@r-project.org wrote: -
To: r-devel@r-project.org
From: Pierre Chausse
Sent by: r-devel-boun...@r-project.org
Date: 12/07/2010 09:46AM
Subject: Re: [Rd] 0.5 != integrate(dnorm,0,20000) = 0

The warning about "absolute error == 0" would not be sufficient
because if you do
   >   integrate(dnorm, 0, 5000)
2.326323e-06 with absolute error<   4.6e-06

We get reasonable absolute error and wrong answer. For very high upper
bound, it seems more stable to use "Inf". In that case, another
.External is used which seems to be optimized for high or low bounds:

   >   integrate(dnorm, 0,Inf)
0.5 with absolute error<   4.7e-05


On 10-12-07 8:38 AM, John Nolan wrote:

I have wrestled with this problem before.  I think correcting
the warning to "absolute error ~<= 0" is a good idea, and printing
a warning if subdivisions==1 is also helpful.  Also, including
a simple example like the one that started this thread on the
help page for integrate might make the issue more clear to users.

But min.subdivisions is probably not.  On the example with dnorm( ),
I doubt 3 subdivisions would work.  The problem isn't that
we aren't subdividing enough; the problem is that the integrand
is 0 (in double precision) on most of the region and the
algorithm isn't designed to handle this.  There is no way to
determine how many subdivisions are necessary to get a reasonable
answer without a detailed analysis of the integrand.

I've gotten useful results with integrands that are monotonic on
the tail with a "self-trimming integration" routine
like the following:


right.trimmed.integrate<- function( f, lower, upper, epsilon=1e-100, 
min.width=1e-10, ... ) {

+ # trim the region of integration on the right until f(x)>epsilon
+
+ a<- lower; b<- upper
+ while ( (b-a>min.width)&&(f(b)
right.trimmed.integrate( dnorm, 0, 20000 )  # test

0.5 with absolute error<9.2e-05

This can be adapted to left trim or (left and right) trim, abs(f(x)-c)>epsilon,
etc.  Setting the tolerances epsilon and min.wid

Re: [Rd] Suggested change to integrate.Rd

2010-12-08 Thread Spencer Graves
That sounds like a great idea to me:  This should give the Core R team 
more time to worry about the code by delegating maintenance of the help 
files to a larger group.



Spencer


On 12/8/2010 2:22 PM, John Nolan wrote:

Well, you can't idiot-proof things, but you can give clear descriptions and
warnings.
To take things to the extreme, one can eliminate all help files.  If a user
really wants
to understand things, they can read the source code, right?

This is a general question for r-dev: who are the help files aimed at? If
the
answer is experts only, then don't put any more effort into help files.
But if you
want more users to be able to do more things, then more explanation will
help.

Perhaps there should be a "documentation team" (r-doc?) that intersects
r-dev, but
focuses on documentation?

John,  American U




From:   "Ravi Varadhan"
To: "'John Nolan'",
 
Cc: 
Date:   12/08/2010 10:43 AM
Subject:RE: [Rd] Suggested change to integrate.Rd (was: Re: 0.5 !=
 integrate(dnorm, 0, 20000) = 0)



Hi,

My honest and (not so) humble opinion is that no amount of clear and
explicit warning can totally prevent the inappropriate use of any tool.
Users will continue to use the tools without doing the necessary background
work to figure out whether that tool is the appropriate one for their
particular problem.  If things can go so horribly wrong in such a simple
case, imagine all the snares and traps present in complex, high-dimensional
integration.  Even the best cubature rules or the MCMC methods can give
wrong results.  Even worse, how in heaven's name can we be sure that the
answer is any good?  The simple and best solution is to understand your
integrand as best as you can.  I realize that this may be viewed as being
too pedantic, but unfortunately, it is also the best advice.

Best,
Ravi.
---
Ravi Varadhan, Ph.D.
Assistant Professor,
Division of Geriatric Medicine and Gerontology School of Medicine Johns
Hopkins University

Ph. (410) 502-2619
email: rvarad...@jhmi.edu


-Original Message-
From: r-devel-boun...@r-project.org [mailto:r-devel-boun...@r-project.org]
On Behalf Of John Nolan
Sent: Tuesday, December 07, 2010 11:09 PM
To: spencer.gra...@structuremonitoring.com
Cc: r-devel@r-project.org
Subject: Re: [Rd] Suggested change to integrate.Rd (was: Re: 0.5 !=
integrate(dnorm, 0, 20000) = 0)

R developers understand intimately how things work, and terse
descriptions are sufficient.  However, most typical R users
would benefit from clearer documentation.  In multiple places
I've found the R documentation to be correct and understandable
AFTER I've figured a function out.

And to be fair, this problem with integrate( ) isn't really R's
fault: the QUADPACK routines that R uses are very good algorithms,
but neither they nor any other package can handle all cases.

I would support reasonable changes in the documentation for
integrate( ).   Just saying it "gives wrong answer without
warning on many systems" seems misleading (it works fine in
many cases) and it doesn't help a user understand how to use
integrate( ) correctly/carefully.  IMO a simple example like
this one w/ dnorm would catch people's attention and a couple
lines of explanation/warning would then make more sense.

John Nolan, American U


-Spencer Graves  wrote: -
To: John Nolan
From: Spencer Graves
Date: 12/07/2010 07:58PM
Cc: pchau...@uwaterloo.ca, r-devel@r-project.org
Subject: Suggested change to integrate.Rd (was: Re: [Rd] 0.5 !=
integrate(dnorm,0,20000) = 0)

What do you think about changing the verbiage with that example
in "integrate.Rd" from "fails on many systems" to something like
"gives wrong answer without warning on many systems"?


If I had write access to the core R code, I'd change this
myself:  I'm probably not the only user who might think that saying
something "fails" suggests it gives an error message.  Many contributions
on this thread make it clear that it will never be possible to write an
integrate function that won't give a "wrong answer without warning" in
some cases.


Thanks,
Spencer


#
On 12/7/2010 7:02 AM, John Nolan wrote:

Putting in Inf for the upper bound does not work in general:
all 3 of the following should give 0.5


integrate( dnorm, 0, Inf )

0.5 with absolute error<   4.7e-05


integrate( dnorm, 0, Inf, sd=1e5 )

Error in integrate(dnorm, 0, Inf, sd = 1e+05) :
the integral is probably divergent


integrate( dnorm, 0, Inf, sd=1000 )

5.570087e-05 with absolute error<   0.00010

Numerical quadrature methods look at a finite number of
points, and you can find examples that will confuse any
algorithm.  Rather than hope a general method will solve
all problems, users sho

Re: [Rd] R vs. C

2011-01-17 Thread Spencer Graves
  Another point I have not yet seen mentioned:  If your code is 
painfully slow, that can often be fixed without leaving R by 
experimenting with different ways of doing the same thing -- often after 
profiling your code to find the slowest parts, as described in 
chapter 3 of "Writing R Extensions".
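
A minimal profiling sketch along those lines (the file name and the profiled code are arbitrary placeholders):

```r
Rprof("prof.out")                           # start the sampling profiler
x <- replicate(200, sum(sort(runif(1e4))))  # code being profiled
Rprof(NULL)                                 # stop profiling
summaryRprof("prof.out")$by.self            # time per function, slowest first
```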



  If I'm given code already written in C (or some other language), 
unless it's really simple, I may link to it rather than recode it in R.  
However, the problems with portability, maintainability, transparency to 
others who may not be very facile with C, etc., all suggest that it's 
well worth some effort experimenting with alternate ways of doing the 
same thing in R before jumping to C or something else.



  Hope this helps.
  Spencer


On 1/17/2011 10:57 AM, David Henderson wrote:

I think we're also forgetting something, namely testing.  If you write your
routine in C, you have placed additional burden upon yourself to test your C
code through unit tests, etc.  If you write your code in R, you still need the
unit tests, but you can rely on the well tested nature of R to allow you to
reduce the number of tests of your algorithm.  I routinely tell people at Sage
Bionetworks where I am working now that your new C code needs to experience at
least one order of magnitude increase in performance to warrant the effort of
moving from R to C.

But, then again, I am working with scientists who are not primarily, or even
secondarily, coders...

Dave H



- Original Message 
From: Dirk Eddelbuettel
To: Patrick Leyshock
Cc: r-devel@r-project.org
Sent: Mon, January 17, 2011 10:13:36 AM
Subject: Re: [Rd] R vs. C


On 17 January 2011 at 09:13, Patrick Leyshock wrote:
| A question, please about development of R packages:
|
| Are there any guidelines or best practices for deciding when and why to
| implement an operation in R, vs. implementing it in C?  The "Writing R
| Extensions" recommends "working in interpreted R code . . . this is normally
| the best option."  But we do write C-functions and access them in R - the
| question is, when/why is this justified, and when/why is it NOT justified?
|
| While I have identified helpful documents on R coding standards, I have not
| seen notes/discussions on when/why to implement in R, vs. when to implement
| in C.

The (still fairly recent) book 'Software for Data Analysis: Programming with
R' by John Chambers (Springer, 2008) has a lot to say about this.  John also
gave a talk in November which stressed 'multilanguage' approaches; see e.g.
http://blog.revolutionanalytics.com/2010/11/john-chambers-on-r-and-multilingualism.html


In short, it all depends, and it is unlikely that you will get a coherent
answer that is valid for all circumstances.  We all love R for how expressive
and powerful it is, yet there are times when something else is called for.
Exactly when that time is depends on a great many things and you have not
mentioned a single metric in your question.  So I'd start with John's book.

Hope this helps, Dirk


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] R vs. C

2011-01-17 Thread Spencer Graves
  For me, a major strength of R is the package development 
process.  I've found this so valuable that I created a Wikipedia entry 
by that name and made additions to a Wikipedia entry on "software 
repository", noting that this process encourages good software 
development practices that I have not seen standardized for other 
languages.  I encourage people to review this material and make 
additions or corrections as they like (or send me suggestions for me to 
make appropriate changes).



  While R has other capabilities for unit and regression testing, I 
often include unit tests in the "examples" section of documentation 
files.  To keep from cluttering the examples with unnecessary material, 
I often include something like the following:



A1 <- myfunc() # to test myfunc

A0 <- ("manual generation of the correct  answer for A1")

\dontshow{stopifnot(} # so the user doesn't see "stopifnot("
all.equal(A1, A0) # compare myfunc output with the correct answer
\dontshow{)} # close paren on "stopifnot(".


  This may not be as good in some ways as a full suite of unit 
tests, which could be provided separately.  However, this has the 
distinct advantage of including unit tests with the documentation in a 
way that should help users understand "myfunc".  (Unit tests too 
detailed to show users could be completely enclosed in "\dontshow".)



  Spencer


On 1/17/2011 11:38 AM, Dominick Samperi wrote:

On Mon, Jan 17, 2011 at 2:08 PM, Spencer Graves<
spencer.gra...@structuremonitoring.com>  wrote:


  Another point I have not yet seen mentioned:  If your code is
painfully slow, that can often be fixed without leaving R by experimenting
with different ways of doing the same thing -- often after using profiling
your code to find the slowest part as described in chapter 3 of "Writing R
Extensions".


  If I'm given code already written in C (or some other language),
unless it's really simple, I may link to it rather than recode it in R.
  However, the problems with portability, maintainability, transparency to
others who may not be very facile with C, etc., all suggest that it's well
worth some effort experimenting with alternate ways of doing the same thing
in R before jumping to C or something else.

  Hope this helps.
  Spencer



On 1/17/2011 10:57 AM, David Henderson wrote:


I think we're also forgetting something, namely testing.  If you write
your
routine in C, you have placed additional burden upon yourself to test your
C
code through unit tests, etc.  If you write your code in R, you still need
the
unit tests, but you can rely on the well tested nature of R to allow you
to
reduce the number of tests of your algorithm.  I routinely tell people at
Sage
Bionetworks where I am working now that your new C code needs to
experience at
least one order of magnitude increase in performance to warrant the effort
of
moving from R to C.

But, then again, I am working with scientists who are not primarily, or
even
secondarily, coders...

Dave H



This makes sense, but I have seen some very transparent algorithms turned
into vectorized R code
that is difficult to read (and thus to maintain or to change). These chunks
of optimized R code are like
embedded assembly, in the sense that nobody is likely to want to mess with
it. This could be addressed
by including pseudo code for the original (more transparent) algorithm as a
comment, but I have never
seen this done in practice (perhaps it could be enforced by R CMD check?!).

On the other hand, in principle a well-documented piece of C/C++ code could
be much easier to understand,
without paying a performance penalty...but "coders" are not likely to place
this high on their
list of priorities.

The bottom line is that R is an adaptor ("glue") language like Lisp that
makes it easy to mix and
match functions (using classes and generic functions), many of which are
written in C (or C++
or Fortran) for performance reasons. Like any object-based system there can
be a lot of
object copying, and like any functional programming system, there can be a
lot of function
calls, resulting in poor performance for some applications.

If you can vectorize your R code then you have effectively found a way to
benefit from
somebody else's C code, thus saving yourself some time. For operations other
than pure
vector calculations you will have to do the C/C++ programming yourself (or
call a library
that somebody else has written).

Dominick
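The point above, that vectorizing R code effectively borrows somebody else's C code, can be illustrated with a small invented comparison (the function name 'loopSum' is a placeholder, not from the thread):

```r
## Hedged sketch: the same computation as an interpreted loop and as a
## single vectorized call that dispatches into R's compiled C code.
loopSum <- function(x) {
  s <- 0
  for (xi in x) s <- s + xi   # one interpreted iteration per element
  s
}
x <- runif(1e6)
system.time(loopSum(x))       # interpreted loop
system.time(sum(x))           # one call into C: typically far faster
all.equal(loopSum(x), sum(x)) # same answer, up to numeric tolerance
```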




- Original Message 
From: Dirk Eddelbuettel
To: Patrick Leyshock
Cc: r-devel@r-project.org
Sent: Mon, January 17, 2011 10:13:36 AM
Subject: Re: [Rd] R vs. C


On 17 January 2011 at 09:13, Patrick Leyshock wrote:
| A question, please about development of R packages:
|
| Are there any guidelines or best practices for deciding when and why to
| implement an operati

Re: [Rd] R vs. C

2011-01-17 Thread Spencer Graves

Hi, Paul:


  The "Writing R Extensions" manual says that *.R code in a "tests" 
directory is run during "R CMD check".  I suspect that many R 
programmers do this routinely.  I probably should do that also.  
However, for me, it's simpler to have everything in the "examples" 
section of *.Rd files.  I think the examples with independently 
developed answers provide useful documentation.
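A self-contained sketch of the kind of small tests/ file Paul suggests; 'myfunc' is a stand-in defined inline so the sketch runs on its own (in a real package it would come from the package under check):

```r
## tests/test-myfunc.R -- "R CMD check" runs every *.R file in tests/;
## any uncaught error fails the check.  'myfunc' is hypothetical.
myfunc <- function() sqrt(2)^2   # stand-in for the function under test
A1 <- myfunc()
A0 <- 2                          # independently derived correct answer
stopifnot(isTRUE(all.equal(A1, A0)))  # tolerant numeric comparison
```

Note `all.equal` rather than `==`: `sqrt(2)^2` need not be bitwise equal to 2, which is exactly why the examples in the thread wrap comparisons this way.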



  Spencer


On 1/17/2011 1:52 PM, Paul Gilbert wrote:

Spencer

Would it not be easier to include this kind of test in a small file in the 
tests/ directory?

Paul

-Original Message-
From: r-devel-boun...@r-project.org [mailto:r-devel-boun...@r-project.org] On 
Behalf Of Spencer Graves
Sent: January 17, 2011 3:58 PM
To: Dominick Samperi
Cc: Patrick Leyshock; r-devel@r-project.org; Dirk Eddelbuettel
Subject: Re: [Rd] R vs. C


For me, a major strength of R is the package development
process.  I've found this so valuable that I created a Wikipedia entry
by that name and made additions to a Wikipedia entry on "software
repository", noting that this process encourages good software
development practices that I have not seen standardized for other
languages.  I encourage people to review this material and make
additions or corrections as they like (or sent me suggestions for me to
make appropriate changes).


While R has other capabilities for unit and regression testing, I
often include unit tests in the "examples" section of documentation
files.  To keep from cluttering the examples with unnecessary material,
I often include something like the following:


A1<- myfunc() # to test myfunc

A0<- ("manual generation of the correct  answer for A1")

\dontshow{stopifnot(} # so the user doesn't see "stopifnot("
all.equal(A1, A0) # compare myfunc output with the correct answer
\dontshow{)} # close paren on "stopifnot(".


This may not be as good in some ways as a full suite of unit
tests, which could be provided separately.  However, this has the
distinct advantage of including unit tests with the documentation in a
way that should help users understand "myfunc".  (Unit tests too
detailed to show users could be completely enclosed in "\dontshow".


Spencer


On 1/17/2011 11:38 AM, Dominick Samperi wrote:

On Mon, Jan 17, 2011 at 2:08 PM, Spencer Graves<
spencer.gra...@structuremonitoring.com>   wrote:


   Another point I have not yet seen mentioned:  If your code is
painfully slow, that can often be fixed without leaving R by experimenting
with different ways of doing the same thing -- often after using profiling
your code to find the slowest part as described in chapter 3 of "Writing R
Extensions".


   If I'm given code already written in C (or some other language),
unless it's really simple, I may link to it rather than recode it in R.
   However, the problems with portability, maintainability, transparency to
others who may not be very facile with C, etc., all suggest that it's well
worth some effort experimenting with alternate ways of doing the same thing
in R before jumping to C or something else.

   Hope this helps.
   Spencer



On 1/17/2011 10:57 AM, David Henderson wrote:


I think we're also forgetting something, namely testing.  If you write
your
routine in C, you have placed additional burden upon yourself to test your
C
code through unit tests, etc.  If you write your code in R, you still need
the
unit tests, but you can rely on the well tested nature of R to allow you
to
reduce the number of tests of your algorithm.  I routinely tell people at
Sage
Bionetworks where I am working now that your new C code needs to
experience at
least one order of magnitude increase in performance to warrant the effort
of
moving from R to C.

But, then again, I am working with scientists who are not primarily, or
even
secondarily, coders...

Dave H



This makes sense, but I have seem some very transparent algorithms turned
into vectorized R code
that is difficult to read (and thus to maintain or to change). These chunks
of optimized R code are like
embedded assembly, in the sense that nobody is likely to want to mess with
it. This could be addressed
by including pseudo code for the original (more transparent) algorithm as a
comment, but I have never
seen this done in practice (perhaps it could be enforced by R CMD check?!).

On the other hand, in principle a well-documented piece of C/C++ code could
be much easier to understand,
without paying a performance penalty...but "coders" are not likely to place
this high on their
list of priorities.

The bottom like is that R is an adaptor ("glue") language like Lisp that
makes it easy to mix and
match functions (using classes and generic functions), many of which are
written in C (or C++
or Fortran) for performance reasons. Like any obje

Re: [Rd] R vs. C

2011-01-17 Thread Spencer Graves

Hi, Dominick, et al.:


  Demanding complete unit test suites with all software contributed 
to CRAN would likely cut contributions by a factor of 10 or 100.  For 
me, the R package creation process is close to perfection in providing a 
standard process for documentation with places for examples and test 
suites of various kinds.  I mention "perfection", because it makes 
developing "trustworthy software" (Chambers's "prime directive") 
relatively easy without forcing people to do things they don't feel 
comfortable doing.



  If you need more confidence in the software you use, you can 
build your own test suites -- maybe in packages you write yourself -- or 
pay someone else to develop test suites to your specifications.  For 
example, Revolution Analytics offers "Package validation, development 
and support".



   Spencer


On 1/17/2011 3:27 PM, Dominick Samperi wrote:

On Mon, Jan 17, 2011 at 5:15 PM, Spencer Graves<
spencer.gra...@structuremonitoring.com>  wrote:


Hi, Paul:


  The "Writing R Extensions" manual says that *.R code in a "tests"
directory is run during "R CMD check".  I suspect that many R programmers do
this routinely.  I probably should do that also.  However, for me, it's
simpler to have everything in the "examples" section of *.Rd files.  I think
the examples with independently developed answers provides useful
documentation.


This is a unit test function, and I think it would be better if there was a
way to unit test packages *before* they
are released to CRAN. Otherwise, this is not really a "release," it is a test
or "beta" version. This is currently
possible under Windows using http://win-builder.r-project.org/, for example.

My earlier remark about the release process was more about documentation
than about unit testing, more
about the gentle "nudging" that the R release process does to help ensure
consistent documentation and
organization, and about how this nudging might be extended to the C/C++ part
of a package.

Dominick



  Spencer



On 1/17/2011 1:52 PM, Paul Gilbert wrote:


Spencer

Would it not be easier to include this kind of test in a small file in the
tests/ directory?

Paul

-Original Message-
From: r-devel-boun...@r-project.org [mailto:r-devel-boun...@r-project.org]
On Behalf Of Spencer Graves
Sent: January 17, 2011 3:58 PM
To: Dominick Samperi
Cc: Patrick Leyshock; r-devel@r-project.org; Dirk Eddelbuettel
Subject: Re: [Rd] R vs. C


For me, a major strength of R is the package development
process.  I've found this so valuable that I created a Wikipedia entry
by that name and made additions to a Wikipedia entry on "software
repository", noting that this process encourages good software
development practices that I have not seen standardized for other
languages.  I encourage people to review this material and make
additions or corrections as they like (or sent me suggestions for me to
make appropriate changes).


While R has other capabilities for unit and regression testing, I
often include unit tests in the "examples" section of documentation
files.  To keep from cluttering the examples with unnecessary material,
I often include something like the following:


A1<- myfunc() # to test myfunc

A0<- ("manual generation of the correct  answer for A1")

\dontshow{stopifnot(} # so the user doesn't see "stopifnot("
all.equal(A1, A0) # compare myfunc output with the correct answer
\dontshow{)} # close paren on "stopifnot(".


This may not be as good in some ways as a full suite of unit
tests, which could be provided separately.  However, this has the
distinct advantage of including unit tests with the documentation in a
way that should help users understand "myfunc".  (Unit tests too
detailed to show users could be completely enclosed in "\dontshow".


Spencer


On 1/17/2011 11:38 AM, Dominick Samperi wrote:


On Mon, Jan 17, 2011 at 2:08 PM, Spencer Graves<
spencer.gra...@structuremonitoring.com>wrote:

Another point I have not yet seen mentioned:  If your code is

painfully slow, that can often be fixed without leaving R by
experimenting
with different ways of doing the same thing -- often after using
profiling
your code to find the slowest part as described in chapter 3 of "Writing
R
Extensions".


   If I'm given code already written in C (or some other language),
unless it's really simple, I may link to it rather than recode it in R.
   However, the problems with portability, maintainability, transparency
to
others who may not be very facile with C, etc., all suggest that it's
well
worth some effort experimenting with alternate ways of doing the same
thing
in R before jumping to C or something else.

   Hope this helps.
   Spencer


Re: [Rd] R vs. C now rather: how to ease package checking

2011-01-18 Thread Spencer Graves

On 1/18/2011 8:44 AM, Dominick Samperi wrote:

On Tue, Jan 18, 2011 at 4:48 AM, Claudia Beleites wrote:


On 01/18/2011 01:13 AM, Dominick Samperi wrote:


On Mon, Jan 17, 2011 at 7:00 PM, Spencer Graves<
spencer.gra...@structuremonitoring.com>   wrote:

  Hi, Dominick, et al.:


  Demanding complete unit test suites with all software contributed to
CRAN would likely cut contributions by a factor of 10 or 100.  For me,
the R
package creation process is close to perfection in providing a standard
process for documentation with places for examples and test suites of
various kinds.  I mention "perfection", because it makes developing
"trustworthy software" (Chamber's "prime directive") relatively easy
without
forcing people to do things they don't feel comfortable doing.



I don't think I made myself clear, sorry. I was not suggesting that
package
developers include a complete unit
test suite. I was suggesting that unit testing should be done outside of
the
CRAN release process. Packages
should be submitted for "release" to CRAN after they have been tested (the
responsibility of the package
developers). I understand that the main problem here is that package
developers do not have access to
all supported platforms, so the current process is not likely to change.


Regarding access to all platforms: But there's r-forge where building and
checks are done nightly for Linux, Win, and Mac (though for some months now
the check protocols are not available for 32 bit Linux and Windows - but I
hope they'll be back soon).
I found it extremely easy to get an account & project space and start building.
Many thanks to r-forge!


Good point Claudia,

There are packages released to CRAN that
do not build on some platforms because the unit tests fail. It seems to me
that this kind of issue could be ironed out with the help of r-forge before
release, in which case there is no need to run the unit tests for released
packages.

Dominick


CRAN also runs "R CMD check" on its contributed packages.  I've found 
(and fixed) problems that I couldn't replicate, by reviewing the repeated 
checks on both R-Forge and CRAN.



Spencer







--
Spencer Graves, PE, PhD
President and Chief Operating Officer
Structure Inspection and Monitoring, Inc.
751 Emerson Ct.
San José, CA 95126
ph:  408-655-4567



Re: [Rd] "+" operator on characters revisited

2011-01-23 Thread Spencer Graves

Consider the following from Python 2.6.5:


>>> 'abc'+ 2

Traceback (most recent call last):
  File "", line 1, in 
'abc'+ 2
TypeError: cannot concatenate 'str' and 'int' objects
>>> 'abc'+'2'
'abc2'
>>>


  Spencer


On 1/23/2011 8:09 AM, Hadley Wickham wrote:

Yet another useful suggestion of introducing cat0() and paste0(), for
the common use of cat and paste with sep="" was not absorbed by the
core R either.

stringr has str_c which is a replacement for paste with sep = "" and
automatic removal of length 0 inputs.

Hadley





--
Spencer Graves, PE, PhD
President and Chief Operating Officer
Structure Inspection and Monitoring, Inc.
751 Emerson Ct.
San José, CA 95126
ph:  408-655-4567



Re: [Rd] "+" operator on characters revisited

2011-01-23 Thread Spencer Graves

On 1/23/2011 8:50 AM, peter dalgaard wrote:

On Jan 22, 2011, at 21:08 , Vitalie S. wrote:


The only definite argument occurred in the thread against "+" operator
was the lack of commutativity (as if one have to prove algebraic
theorems in R).

I think the real killer was associativity, combined with coercion rules:

Is "x"+1+2 supposed to be equal to "x12" or "x3"?

  Excellent:  This seems like a good reason to follow Python:  
Allow "a+b" with a character vector "a" only if "b" is also a character 
vector (or factor?).



  This example raises another question:  If we allow "a+b" for "a" 
and "b" both character vectors (and give an error if one is numeric), 
what do we do with factors?  If "a" is a factor, return a factor?



  Spencer



Re: [Rd] "+" operator on characters revisited

2011-01-23 Thread Spencer Graves



On 1/23/2011 12:15 PM, Vitalie S. wrote:

Spencer Graves  writes:


On 1/23/2011 8:50 AM, peter dalgaard wrote:

On Jan 22, 2011, at 21:08 , Vitalie S. wrote:


The only definite argument occurred in the thread against "+" operator
was the lack of commutativity (as if one have to prove algebraic
theorems in R).

I think the real killer was associativity, combined with coercion rules:

Is "x"+1+2 supposed to be equal to "x12" or "x3"?


   Excellent:  This seems like a good reason to follow Python:  Allow "a+b" with a 
character vector "a" only if
"b" is also a character vector (or factor?).

   This example raises another question:  If we allow "a+b" for "a" and "b" 
both character vectors (and give an
error if one is numeric), what do we do with factors?  If "a" is a factor,
return a factor?

If we define custom %+% as:

 `%+%`<- function(a, b){
 if(is.character(a) || is.character(b))
 paste(as.character(a), as.character(b), sep="")
 else
 a + b
 }

because of higher precedence of %any% operators over binary + we have:

 "a" %+% 1 %+% 2
 ## [1] "a12"

and

str("a" %+% factor(1:2))
## chr [1:2] "a1" "a2"

so if + on characters would behave "as if" having slightly higher priority than
other + operators that might solve reasonably the problem.

Vitalie.


No:  'a' %+% (1 %+% 2)  != ('a' %+% 1) %+% 2, as Peter Dalgaard noted:  
'a3' != 'a12'.
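The non-associativity is easy to check by running Vitalie's definition directly:

```r
## Runnable check of the associativity problem with the custom %+%
## operator defined earlier in the thread.
`%+%` <- function(a, b) {
  if (is.character(a) || is.character(b))
    paste(as.character(a), as.character(b), sep = "")
  else
    a + b
}
("a" %+% 1) %+% 2   # "a12": left-grouped, so both numbers are coerced
"a" %+% (1 %+% 2)   # "a3" : 1 + 2 happens numerically first
```

Because the two groupings give different answers, no precedence tweak can make a string-concatenating "+" behave consistently with numeric "+".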






[Rd] Reading a specific file during "R CMD check"?

2011-02-04 Thread Spencer Graves

Hello, All:


  How can I obtain the location of an example data file in a 
package during "R CMD check"?



  I want to include sample raw data files in a package and have 
them read by a function in the package.  It occurs to me to put such a 
file in "\inst\rawdata" and have examples find the data using something 
like "system.file('rawdata', package='MyPackage')". However, this will 
only work if the desired data are already in a version of 'MyPackage' 
that is already installed.  If I change the data, this will return the 
old data, not the modified.  I've looked at packages RUnit and svUnit, 
but have not spent enough time with either to know if they include a 
solution to this problem.
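A hedged sketch of the intended pattern; 'MyPackage' and 'sample.csv' are placeholders, and (as later replies confirm) during "R CMD check" system.file() resolves against the freshly installed copy under the .Rcheck library, so it sees the new data:

```r
## Intended usage for a file shipped in inst/rawdata/ (placeholders):
##   path <- system.file("rawdata", "sample.csv", package = "MyPackage")
##   dat  <- read.csv(path)
## A self-contained check that system.file() points into an installed
## package tree, using a file every installed package has:
p <- system.file("DESCRIPTION", package = "stats")
file.exists(p)   # TRUE: resolves against the installed copy
```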



  Thanks for your help.
  Spencer

--
Spencer Graves, PE, PhD
President and Chief Operating Officer
Structure Inspection and Monitoring, Inc.
751 Emerson Ct.
San José, CA 95126
ph:  408-655-4567



Re: [Rd] Reading a specific file during "R CMD check"?

2011-02-05 Thread Spencer Graves
  Thanks very much to Hadley and Prof. Ripley for their replies -- 
and my apologies to the list for failing to check my facts before I posted:



  Prof. Ripley's reply was correct:  system.file('rawdata', 
package='tstPackage') in a help page in package 'tstPackage' with 
subdirectory 'inst/rawdata' returned 
"~/Rpkgs/tstPackage.Rcheck/tstPackage/rawdata".



  I was once again misled by something I knew that wasn't so.


  Best Wishes,
  Spencer


On 2/5/2011 12:04 AM, Prof Brian Ripley wrote:

On Fri, 4 Feb 2011, Hadley Wickham wrote:


Hi Spencer,

I think one of the early phases of R CMD check is R CMD install - it
installs the package into a special location so that it doesn't
override existing installed packages, but still allows function to
work exactly as if they were in an installed package.


The first part may or may not be true (it is not for some of the 
ways 'check' is used on the check farm: consider the --install 
argument, which is not described in --help), but in every case it will 
be true that the copy of the package under check is installed in the 
library at .libPaths()[1], and that installed package is used by 
'check' unless the user's code does really odd things (like manipulate 
.libPaths()).




Hadley

On Fri, Feb 4, 2011 at 7:57 PM, Spencer Graves
 wrote:

Hello, All:


 How can I obtain the location of an example data file in a package
during "R CMD check"?


 I want to include sample raw data files in a package and have 
them read

by a function in the package.  It occurs to me to put such a file in
"\inst\rawdata" and have examples find the data using something like
"system.file('rawdata', package='MyPackage')". However, this will 
only work
if the desired data are already in a version of 'MyPackage' that is 
already
installed.  If I change the data, this will return the old data, not 
the

modified.  I've looked at packages RUnit and svUnit, but have not spent
enough time with either to know if they include a solution to this 
problem.


I doubt that the 'problem' is in 'R CMD check', but without a 
reproducible example, we have no evidence either way.  Certainly 
plenty of packages use system.file() in their examples, and 'check' 
works for them even if they are not previously installed.




 Thanks for your help.
 Spencer

--
Spencer Graves, PE, PhD
President and Chief Operating Officer
Structure Inspection and Monitoring, Inc.
751 Emerson Ct.
San José, CA 95126
ph:  408-655-4567






--
Assistant Professor / Dobelman Family Junior Chair
Department of Statistics / Rice University
http://had.co.nz/



[Rd] par(ask=TRUE) in R CMD check?

2011-03-12 Thread Spencer Graves

Hello:


  What happens in the auto-checks on R-Forge and CRAN with code 
using par(ask=TRUE)?



  Is this routine, or can it create problems?


  The fda package uses ask=TRUE to provide the user with a way to 
examine a group of plots.  In the past, I've marked those tests in 
\examples with \dontrun.  However, I wonder if that is necessary.  I 
tried it on Windows using R 2.12.0 and R Tools from that version, and R 
Tools seemed to supply all the mouse clicks required.  However, before I 
"SVN Commit" to R-Forge, I felt a need to ask.



  Thanks,
  Spencer



[Rd] Rtools questions

2011-04-05 Thread Spencer Graves

Hello:


  1.  How can I tell when the development version of Rtools has 
changed?  For the past few years, I've installed the development version 
of R tools with each new release of R.  I encountered problems with this 
a few days ago, so I rolled back to Rtools212.exe.  Unfortunately, I 
seem to have more problems with that version.  My latest install was 
under Windows 7 Home Edition.  My previous problems were on Vista, but I 
also have access to Fedora 13 Linux.



  2.  "R CMD check" ends with the following:


* checking examples ... OK
* checking PDF version of manual ... WARNING
LaTeX errors when creating PDF version.
This typically indicates Rd problems.
* checking PDF version of manual without hyperrefs or index ... ERROR
Re-running with no redirection of stdout/stderr.
Hmm ... looks like a package
Error in texi2dvi("Rd2.tex", pdf = (out_ext == "pdf"), quiet = FALSE,  :
  unable to run 'pdflatex' on 'Rd2.tex'
Error in running tools::texi2dvi
You may want to clean up by 'rm -rf 
C:/Users/sgraves/AppData/Local/Temp/Rtmpr6z3

r6/Rd2pdf55b96c9a'


  This is using Rtools213, downloaded April 4 from 
"www.murdoch-sutherland.com/Rtools" with R installed as follows:




sessionInfo()

R version 2.12.2 (2011-02-25)
Platform: x86_64-pc-mingw32/x64 (64-bit)

locale:
[1] LC_COLLATE=English_United States.1252
[2] LC_CTYPE=English_United States.1252
[3] LC_MONETARY=English_United States.1252
[4] LC_NUMERIC=C
[5] LC_TIME=English_United States.1252

attached base packages:
[1] stats graphics  grDevices utils datasets  methods   base



  Thanks,
  Spencer


--
Spencer Graves, PE, PhD
President and Chief Operating Officer
Structure Inspection and Monitoring, Inc.
751 Emerson Ct.
San José, CA 95126
ph:  408-655-4567



Re: [Rd] Rtools questions

2011-04-05 Thread Spencer Graves

On 4/5/2011 5:01 PM, Gabor Grothendieck wrote:

On Tue, Apr 5, 2011 at 7:51 PM, Henrik Bengtsson  wrote:

On Tue, Apr 5, 2011 at 3:44 PM, Duncan Murdoch  wrote:

On 11-04-05 6:22 PM, Spencer Graves wrote:

Hello:


1.  How can I tell when the development version of Rtools has
changed?

I don't make announcements of the changes, you just need to check the web
site.  There are online tools that can do this for you automatically, but I
don't know which one to recommend.  Google suggests lots of them.

I also asked myself this before and I must admit it took me a while to
interpret the contents of the webpage.  There are multiple sections,
e.g. 'Changes since R 2.12.2', 'Changes since R 2.11.1', 'Changes
since R 2.11.0', and so on.  Then within each section there are some
dates mentioned.  Given my current R version (say R 2.13.0 beta) and
Rtools (Rtools213.exe), it is not fully clear to me which section to look
at, e.g. 'Changes since R 2.12.2'?  It might be more clear if there
instead the sections would be 'Changes in Rtools213', 'Changes in
Rtools212' and so on, and within each maybe list updates by
dates/version.  More like a NEWS file.  Then it would be easier to see
if there is an updated available or not.  Even a NEWS file only
available as part of the installation will help decide whether the
version you have installed differ from the one available online.
Something like the following:

== Changes in Rtools213 ==

[...]


== Changes in Rtools212 ==

2011-03-25:
- Rtools 2.12 has been frozen.
- We have updated all of the tools to current Cygwin versions as of
March 25, 2011. We added the "du" utility from Cygwin. We have dropped
Vanilla Perl. The libjpeg version has been updated to 8c, and libpng
has been updated to 1.5.1.

2010-10-18: [v2.12.0.1892]<== Is this an Rtools version?!?
- Prior to October 18, 2010, builds of Rtools212.exe did not correctly
install the "extras" required to build R. Version 2.12.0.1892 or later
should fix this.
- We have now updated all of the tools to current Cygwin versions, and
have updated the compilers, and included the 64 bit compilers into
Rtools. See Prof. Ripley's page for the details.
- Perl is rarely needed in R since R 2.12.0, so it is by default not installed.

2010-??-??:
- The 32 bit version of R-devel (to become R 2.12.0 in fall, 2010)
will be built with gcc 4.5.x, so Rtools212 contains a completely new
MinGW toolchain based on gcc 4.5.0.

== Changes in Rtools211 ==

[...]


Just a suggestion ...and thanks for providing Rtools!

/Henrik

If a NEWS file were included in the Rtools distribution itself (and
not just on the web site) it would be helpful since it's not always
clear which version you have on your system in the first place.


  However, adding a NEWS file increases the labor, and I'd be happy 
letting Duncan and others continue doing what they do without asking 
them to take the time to tell the rest of us what they did.



  Something simpler would suffice for my needs, e.g., a revision 
number in the name of the download file, like Rtools213.5107.exe for SVN 
revision number 5107.  Windows 7 gives me the date my copy was 
downloaded, not the date of the last patch.  On March 31, I downloaded 
and installed basic-miktex-2.9.3972.exe from 
"http://miktex.org/2.9/setup".  Today, I downloaded
basic-miktex-2.9.4106.exe and basic-miktex-2.9.4106-x64.exe.  From 
comparing names, I inferred (a) the first was a newer version of what I 
had previously installed, and (b) that was 32 bit and the other is 64 
bit.  I installed the latter, and the problem with pdflatex disappeared.



  Spencer

--
Spencer Graves, PE, PhD
President and Chief Operating Officer
Structure Inspection and Monitoring, Inc.
751 Emerson Ct.
San José, CA 95126
ph:  408-655-4567

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Rtools questions

2011-04-05 Thread Spencer Graves

On 4/5/2011 6:03 PM, Gabor Grothendieck wrote:

On Tue, Apr 5, 2011 at 8:58 PM, Spencer Graves
  wrote:

On 4/5/2011 5:01 PM, Gabor Grothendieck wrote:

On Tue, Apr 5, 2011 at 7:51 PM, Henrik Bengtsson
  wrote:

On Tue, Apr 5, 2011 at 3:44 PM, Duncan Murdoch
  wrote:

On 11-04-05 6:22 PM, Spencer Graves wrote:

Hello:


1.  How can I tell when the development version of Rtools has
changed?

I don't make announcements of the changes, you just need to check the
web
site.  There are online tools that can do this for you automatically,
but I
don't know which one to recommend.  Google suggests lots of them.

I also asked myself this before and I must admit it took me a while to
interpret the contents of the webpage.  There are multiple sections,
e.g. 'Changes since R 2.12.2', 'Changes since R 2.11.1', 'Changes
since R 2.11.0', and so on.  Then within each section there are some
dates mentioned.  Given my current R version (say R 2.13.0 beta) and
Rtools (Rtools213.exe), it's not fully clear to me which section to look
at, e.g. 'Changes since R 2.12.2'?  It might be clearer if instead the
sections were 'Changes in Rtools213', 'Changes in Rtools212' and so on,
and within each maybe list updates by dates/version.  More like a NEWS
file.  Then it would be easier to see whether there is an update
available or not.  Even a NEWS file only available as part of the
installation would help decide whether the version you have installed
differs from the one available online.
Something like the following:

== Changes in Rtools213 ==

[...]


== Changes in Rtools212 ==

2011-03-25:
- Rtools 2.12 has been frozen.
- We have updated all of the tools to current Cygwin versions as of
March 25, 2011. We added the "du" utility from Cygwin. We have dropped
Vanilla Perl. The libjpeg version has been updated to 8c, and libpng
has been updated to 1.5.1.

2010-10-18: [v2.12.0.1892]<== Is this an Rtools version?!?
- Prior to October 18, 2010, builds of Rtools212.exe did not correctly
install the "extras" required to build R. Version 2.12.0.1892 or later
should fix this.
- We have now updated all of the tools to current Cygwin versions, and
have updated the compilers, and included the 64 bit compilers in
Rtools. See Prof. Ripley's page for the details.
- Perl is rarely needed in R since R 2.12.0, so it is not installed by
default.

2010-??-??:
- The 32 bit version of R-devel (to become R 2.12.0 in fall, 2010)
will be built with gcc 4.5.x, so Rtools212 contains a completely new
MinGW toolchain based on gcc 4.5.0.

== Changes in Rtools211 ==

[...]


Just a suggestion ...and thanks for providing Rtools!

/Henrik

If a NEWS file were included in the Rtools distribution itself (and
not just on the web site) it would be helpful, since it's not always
clear which version you have on your system in the first place.

  However, adding a NEWS file increases the labor, and I'd be happy
letting Duncan and others continue doing what they do without asking them to
take the time to tell the rest of us what they did.


  Something simpler would suffice for my needs, e.g., a revision number

That wouldn't let you know what version has been installed after installation.


It worked for me today with MiKTeX, because I kept the previous 
installer in a place where I could identify it with what I installed.  
That's not as user friendly as NEWS, but it doesn't ask Duncan or 
anyone else who works on Rtools to rearrange their priorities to 
document something that I rarely read.  (Even if I RTFM 24/7, I can't 
read fast enough to keep up with the changes and additions to TFM.)



  Spencer

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Rtools questions

2011-04-06 Thread Spencer Graves

Hello:


  1.  How can I tell when the development version of Rtools has 
changed?  For the past few years, I've installed the development version 
of R tools with each new release of R.  I encountered problems with this 
a few days ago, so I rolled back to Rtools212.exe.  Unfortunately, I 
seem to have more problems with that version.  My latest install was 
under Windows 7 Home Edition.  My previous problems were on Vista, but I 
also have access to Fedora 13 Linux.



  2.  "R CMD check" ends with the following:


* checking examples ... OK
* checking PDF version of manual ... WARNING
LaTeX errors when creating PDF version.
This typically indicates Rd problems.
* checking PDF version of manual without hyperrefs or index ... ERROR
Re-running with no redirection of stdout/stderr.
Hmm ... looks like a package
Error in texi2dvi("Rd2.tex", pdf = (out_ext == "pdf"), quiet = FALSE,  :
  unable to run 'pdflatex' on 'Rd2.tex'
Error in running tools::texi2dvi
You may want to clean up by 'rm -rf 
C:/Users/sgraves/AppData/Local/Temp/Rtmpr6z3r6/Rd2pdf55b96c9a'
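As an aside, one quick way to diagnose this class of failure is to ask R whether it can see the pdflatex binary at all; the following is a minimal diagnostic sketch (the messages printed are illustrative, not from the original thread):

```r
## Ask R to locate pdflatex on the PATH it actually uses.
## Sys.which() returns "" when the binary cannot be found, which is
## exactly the situation that makes tools::texi2dvi() fail as above.
pdflatex_path <- Sys.which("pdflatex")
if (nzchar(pdflatex_path)) {
  cat("pdflatex found at:", pdflatex_path, "\n")
} else {
  cat("pdflatex not found; check the TeX installation and PATH\n")
}
```

An empty result typically means the TeX distribution's bin directory is missing from the PATH seen by the R process, even if a terminal can find pdflatex.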


  This is using Rtools213, downloaded April 4 from 
"www.murdoch-sutherland.com/Rtools" with R installed as follows:



> sessionInfo()
R version 2.12.2 (2011-02-25)
Platform: x86_64-pc-mingw32/x64 (64-bit)

locale:
[1] LC_COLLATE=English_United States.1252
[2] LC_CTYPE=English_United States.1252
[3] LC_MONETARY=English_United States.1252
[4] LC_NUMERIC=C
[5] LC_TIME=English_United States.1252

attached base packages:
[1] stats graphics  grDevices utils datasets  methods   base



  Thanks,
  Spencer

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Rtools questions

2011-04-06 Thread Spencer Graves

On 4/5/2011 5:01 PM, Gabor Grothendieck wrote:

On Tue, Apr 5, 2011 at 7:51 PM, Henrik Bengtsson  wrote:

On Tue, Apr 5, 2011 at 3:44 PM, Duncan Murdoch  wrote:

On 11-04-05 6:22 PM, Spencer Graves wrote:

Hello:


1.  How can I tell when the development version of Rtools has
changed?

I don't make announcements of the changes, you just need to check the web
site.  There are online tools that can do this for you automatically, but I
don't know which one to recommend.  Google suggests lots of them.

I also asked myself this before and I must admit it took me a while to
interpret the contents of the webpage.  There are multiple sections,
e.g. 'Changes since R 2.12.2', 'Changes since R 2.11.1', 'Changes
since R 2.11.0', and so on.  Then within each section there are some
dates mentioned.  Given my current R version (say R 2.13.0 beta) and
Rtools (Rtools213.exe), it's not fully clear to me which section to look
at, e.g. 'Changes since R 2.12.2'?  It might be clearer if instead the
sections were 'Changes in Rtools213', 'Changes in Rtools212' and so on,
and within each maybe list updates by dates/version.  More like a NEWS
file.  Then it would be easier to see whether there is an update
available or not.  Even a NEWS file only available as part of the
installation would help decide whether the version you have installed
differs from the one available online.
Something like the following:

== Changes in Rtools213 ==

[...]


== Changes in Rtools212 ==

2011-03-25:
- Rtools 2.12 has been frozen.
- We have updated all of the tools to current Cygwin versions as of
March 25, 2011. We added the "du" utility from Cygwin. We have dropped
Vanilla Perl. The libjpeg version has been updated to 8c, and libpng
has been updated to 1.5.1.

2010-10-18: [v2.12.0.1892]<== Is this an Rtools version?!?
- Prior to October 18, 2010, builds of Rtools212.exe did not correctly
install the "extras" required to build R. Version 2.12.0.1892 or later
should fix this.
- We have now updated all of the tools to current Cygwin versions, and
have updated the compilers, and included the 64 bit compilers in
Rtools. See Prof. Ripley's page for the details.
- Perl is rarely needed in R since R 2.12.0, so it is not installed by default.

2010-??-??:
- The 32 bit version of R-devel (to become R 2.12.0 in fall, 2010)
will be built with gcc 4.5.x, so Rtools212 contains a completely new
MinGW toolchain based on gcc 4.5.0.

== Changes in Rtools211 ==

[...]


Just a suggestion ...and thanks for providing Rtools!

/Henrik

If a NEWS file were included in the Rtools distribution itself (and
not just on the web site) it would be helpful, since it's not always
clear which version you have on your system in the first place.


  However, adding a NEWS file increases the labor, and I'd be happy 
letting Duncan and others continue doing what they do without asking 
them to take the time to tell the rest of us what they did.



  Something simpler would suffice for my needs, e.g., a revision 
number in the name of the download file, like Rtools213.5107.exe for SVN 
revision number 5107.  Windows 7 gives me the date my copy was 
downloaded, not the date of the last patch.  On March 31, I downloaded 
and installed basic-miktex-2.9.3972.exe from 
"http://miktex.org/2.9/setup".  Today, I downloaded
basic-miktex-2.9.4106.exe and basic-miktex-2.9.4106-x64.exe.  From 
comparing names, I inferred (a) the first was a newer version of what I 
had previously installed, and (b) that was 32 bit and the other is 64 
bit.  I installed the latter, and the problem with pdflatex disappeared.



  Spencer

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Wish there were a "strict mode" for R interpreter. What about You?

2011-04-09 Thread Spencer Graves

On 4/9/2011 2:31 PM, Hadley Wickham wrote:

On Sat, Apr 9, 2011 at 2:51 PM, Paul Johnson  wrote:

Years ago, I did lots of Perl programming. Perl will let you be lazy
and write functions that refer to undefined variables (like R does),
but there is also a strict mode so the interpreter will block anything
when a variable is mentioned that has not been defined. I wish there
were a strict mode for checking R functions.

Here's why. We have a lot of students writing R functions around here
and they run into trouble because they use the same name for things
inside and outside of functions. When they call functions that have
mistaken or undefined references to names that they use elsewhere,
then variables that are in the environment are accidentally used. Know
what I mean?

dat <- whatever

someNewFunction <- function(z, w){
   # do something with z and w and create a new "dat"
   # but forget to name it "dat"
   lm(y, x, data=dat)
   # lm just used wrong data
}

I wish R had a strict mode to return an error in that case. Users
don't realize they are getting nonsense because R finds things to fill
in for their mistakes.

Is this possible?  Does anybody agree it would be good?



library(codetools)
checkUsage(someNewFunction)

<anonymous>: no visible binding for global variable ‘y’
<anonymous>: no visible binding for global variable ‘x’
<anonymous>: no visible binding for global variable ‘dat’

Which also picks up another bug in your function ;)


  Is this run by "R CMD check"?  I've seen this message.


  "R CMD check" will give this message sometimes when I don't feel 
it's appropriate.  For example, I define a data object ETB in a package, 
then give that as the default in a function definition like 
f <- function(data. = ETB) {if (missing(data.)) data(ETB); data.}.  When 
I run "R CMD check", I get "no visible binding for global variable 
'ETB'", even though the function is tested and works during R CMD check.
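The pattern described above can be spelled out as a small self-contained sketch; here ETB is a stand-in data frame created locally, since the real object would ship as package data (where data(ETB) would lazy-load it):

```r
## Stand-in for a data object that would normally ship with a package.
ETB <- data.frame(x = 1:3, y = c(2, 4, 6))

## Default-argument pattern from the message above: if the caller
## supplies nothing, fall back to the packaged dataset.  Inside a
## package this is where data(ETB) would load the object; here it
## already exists in the workspace, so we just use it.
f <- function(data. = ETB) {
  if (missing(data.)) data. <- ETB
  data.
}

f()                    # returns the default ETB data frame
f(data.frame(x = 0))   # caller-supplied data is used instead
```

The static checker cannot see that ETB will exist at run time, which is why code like this draws the "no visible binding for global variable" NOTE even though it works.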



  Spencer


Hadley




--
Spencer Graves, PE, PhD
President and Chief Operating Officer
Structure Inspection and Monitoring, Inc.
751 Emerson Ct.
San José, CA 95126
ph:  408-655-4567

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel

