Re: [Rd] Chunk of text won't show up when compiling Rd file
On Mon, 9 Mar 2009, Ben Bryant wrote:

Greetings - Thanks for the response and apologies for the delay. I was actually unable to get even the example script for Rd2HTML to work in 2.9.0dev, which may be due to my lack of general programming savvy, or possibly my working on a windows machine? So, no information on what went wrong.

Your example works for me in R-devel, even on Windows.

In the meantime I found a workaround to address my immediate needs (just including another section in the Rd file). The most helpful I can be with my current level of knowledge is to include a full Rd text that reproduces the error, if someone would like to give it a shot (below).

\value sections are somewhat special. In versions < 2 of the Rd format, \value starts out as a standard text section, but the first \item macro turns it into a \describe list. And this *is* how they are documented:

'If a list with multiple values is returned, you can use entries of the form \item{comp_i}{Description of comp_i.} for each component of the list returned. Optional text may precede this list (see the introductory example for rle).'

I take that to mean that text can only *precede* a list. You are expecting something to work that is AFAICS undocumented and not intended to work.

In the current Rd2 conversion, additional text after \item sections terminates the list, so this behaves more as you want (as a set of lists). However, Rd2 conversion is not released as yet, and almost certainly will not be in 2.9.0. (I don't think the change in behaviour has yet been agreed by R-core.)

Thanks, -Ben

%FAKE FUNCTION DOCUMENTATION TO ILLUSTRATE PROBLEM
\name{fake}
\alias{fake}
\title{Fake function documentation}
\description{This is a sample to show a possible bug in the Rd compiler, which may actually be generally desirable behavior, but behavior that is encoded in a somewhat opaque way. See the Value section for what is going on.
}
\usage{ sdprim(x, y = NULL) }
\arguments{
  \item{x}{The usual inputs.}
  \item{y}{The usual outputs.}
}
\details{ A good bit of text on the details. }
\value{ Here I have a paragraph giving the general description of the output form. Then I have an itemized list describing the elements.
  \item{listobject1}{Description of list object 1}
  \item{listobject2}{Description of list object 2}
  %... etc
  \item{lastlistobject}{Description of the last list object}
THEN, here I have a general description of the attributes, and the text represented by this sentence is what doesn't show up, because it's in between more of an itemized list.
  \item{attribute1}{details of attribute 1}
  \item{attribute2}{details of attribute 2}
Then I have text here, and this text does show up.
}
\author{anonymous}
\examples{ # are not too relevant. }
\keyword{robust}

On Thu, Mar 5, 2009 at 5:49 PM, Duncan Murdoch wrote:

On 05/03/2009 12:29 PM, Ben Bryant wrote: Greetings - I am trying to document the "value" section of a function. The function returns a list, but the list itself also has attributes. I would like to itemize the list entries, and itemize the attributes, but in between I would like to have a sentence or two about the attributes in general. However, for some reason this intermediate sentence won't show up in the compiled version, so that it appears the attributes are all just elements in the returned list. Something is making the assumption that the itemized list must be uninterrupted, and I don't know the code to tell it not to do that. I presume it is a very easy fix, but I haven't been able to get at it. I pasted some example explanatory Rd code below. Thanks! -Ben Bryant

Could you give your example a try in R-devel, with one of the new conversion functions, e.g. tools::Rd2HTML? I don't think these new functions are used by default even in R-devel, but if they solve your problem, there will be less motivation to fix the legacy functions.
Duncan Murdoch

% Just the Value Section:
\value{ Here I have a paragraph giving the general description of the output form. Then I have an itemized list describing the elements.
  \item{listobject1}{Description of list object 1}
  \item{listobject2}{Description of list object 2}
  %... etc
  \item{lastlistobject}{Description of the last list object}
THEN, here I have a general description of the attributes, and the text represented by this sentence is what doesn't show up, because it's in between more of an itemized list.
  \item{attribute1}{details of attribute 1}
  \item{attribute 2}{details of attribute 2}
Then I have text here, and this text does show up.
}

[[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
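[For readers hitting the same limitation: since bare \item entries in \value are documented to form a single list, one plausible workaround is to wrap each group of items in an explicit \describe block, with free text between the blocks. This is only a sketch using the poster's section names; whether every Rd converter of that era renders it identically would need checking:]

```
\value{
  Here a paragraph giving the general description of the output form.
  \describe{
    \item{listobject1}{Description of list object 1}
    \item{lastlistobject}{Description of the last list object}
  }
  A general description of the attributes, which now does show up.
  \describe{
    \item{attribute1}{details of attribute 1}
    \item{attribute2}{details of attribute 2}
  }
}
```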
[Rd] r-devel tarball build failure on windows
Hi,

On my windows (xp) machine with Rtools29 (excluding cygwin dlls as I have cygwin on my path), -make all recommended- for the latest R-devel tarball (svn revision: 48093) fails when trying to build the recommended packages:

--- Making recommended packages
- installing recommended package KernSmooth
Warning: invalid package 'KernSmooth.tgz'
Error: ERROR: no packages specified
make[1]: *** [KernSmooth.ts] Error 1
make: *** [recommended] Error 2

Looking at R_HOME/src/library/Recommended shows that none of the timestamp files .ts are generated(?) in that directory. If I manually create the empty timestamp files, the build completes but -make check- fails. I suspect something in my build tools is not right, but what could that be?

h.
--
+---
| Hiroyuki Kawakatsu
| Business School, Dublin City University
| Dublin 9, Ireland. Tel +353 (0)1 700 7496

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] r-devel tarball build failure on windows
Hiroyuki Kawakatsu wrote:
> Hi,
>
> On my windows (xp) machine with Rtools29 (excluding cygwin dlls as I
> have cygwin on my path) -make all recommended- for the latest R-devel
> tarball (svn revision: 48093) fails when trying to build the
> recommended packages:

1. Have you run make rsync-recommended before (i.e. are the packages actually there)?

2. If so, please install the cygwin dlls and try to remove cygwin from your path. There may very well be some version conflicts; I cannot build R / R packages if a full cygwin installation is around.

Uwe Ligges

> --- Making recommended packages
> - installing recommended package KernSmooth
> Warning: invalid package 'KernSmooth.tgz'
> Error: ERROR: no packages specified
> make[1]: *** [KernSmooth.ts] Error 1
> make: *** [recommended] Error 2
>
> Looking at R_HOME/src/library/Recommended shows that none of the
> timestamp files .ts are generated(?) in that directory. If I manually
> create the empty timestamp files, build completes but -make check-
> failed. I suspect something with my build tools is not right but what
> could that be?
>
> h.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] S4 generic masking S3 generic when using namespace
Hi, I have two example packages, test1 and test2, where the only code in them is: setGeneric("predict", function(object, ...) standardGeneric("predict")) (get them from http://www.cs.mu.oz.au/~gabraham/test1.tar and http://www.cs.mu.oz.au/~gabraham/test2.tar) The difference between them is that first does not have a namespace, and loads fine. The second has a namespace but generates a warning: > library(test2) Attaching package: 'test2' The following object(s) are masked from package:stats : predict Is this an intended behaviour? If I ignore this masking warning, do I risk unintended consequences later down the track? Thanks, Gad > sessionInfo() R version 2.8.1 (2008-12-22) i386-apple-darwin8.11.1 locale: en_AU.UTF-8/en_AU.UTF-8/C/C/en_AU.UTF-8/en_AU.UTF-8 attached base packages: [1] stats graphics grDevices utils datasets methods base other attached packages: [1] test2_0.1 -- Gad Abraham MEng Student, Dept. CSSE and NICTA The University of Melbourne Parkville 3010, Victoria, Australia email: gabra...@csse.unimelb.edu.au web: http://www.csse.unimelb.edu.au/~gabraham __ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] [boot] bootstrap issue when at least one strata has only one (PR#13586)
Full_Name: Dominique Soudant
Version: 2.4.1
OS: Windows
Submission from: (NULL) (134.246.54.61)

R 2.4.1, boot 1.2-27

Let us consider the following example with 8 strata, one observation for each:

> library(boot)
> df <- data.frame(Values=runif(8), month=1:8)
> df
      Values month
1 0.02721540     1
2 0.13618392     2
3 0.99979095     3
4 0.80441083     4
5 0.42374115     5
6 0.04928531     6
7 0.34387447     7
8 0.13277458     8
> P90 <- function(DataIn, i){
+   DataIn <- DataIn[i, ]
+   P90 <- round(quantile(as.numeric(DataIn$Values),
+                         probs = 0.9,
+                         names = FALSE,
+                         type = 4),
+                digits = 1)
+   return(P90)
+ }
> bootP90 <- boot(df, P90, 10, strata = df$month)
> bootP90$t
      [,1]
 [1,]  0.2
 [2,]  0.5
 [3,]  0.8
 [4,]  0.2
 [5,]  0.2
 [6,]  1.0
 [7,]  0.5
 [8,]  1.0
 [9,]  0.8
[10,]  1.0

Results are different; they should all be equal. I guess that the issue comes from the function ordinary.array() in the boot package:

ordinary.array <- function (n, R, strata)
{
    output <- matrix(0, R, n)
    inds <- as.integer(names(table(strata)))
    for (is in inds) {
        gp <- c(1:n)[strata == is]
        output[, gp] <- matrix(sample(gp, R * length(gp), replace = TRUE),
                               nrow = R)
    }
    output
}

> ordinary.array(8, 10, 1:8)
      [,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8]
 [1,]    1    1    2    2    1    1    5    2
 [2,]    1    1    2    2    3    3    2    5
 [3,]    1    2    1    4    2    1    1    7
 [4,]    1    1    3    2    5    5    3    6
 [5,]    1    1    2    4    5    3    7    1
 [6,]    1    2    2    4    4    6    1    8
 [7,]    1    1    2    2    4    2    7    5
 [8,]    1    2    3    2    3    4    1    7
 [9,]    1    2    2    1    1    3    1    2
[10,]    1    1    2    4    5    1    1    5

In ordinary.array(), the issue comes from sample():

[quote]If x has length 1 and x >= 1, sampling takes place from 1:x[/quote]

Note that when only one stratum has only one observation, the problem is identical:

> ordinary.array(7, 10, c(rep(1:3, each = 2), 4))
      [,1] [,2] [,3] [,4] [,5] [,6] [,7]
 [1,]    1    2    3    3    6    5    6
 [2,]    2    2    3    4    5    5    7
 [3,]    2    2    4    4    6    5    5
 [4,]    1    1    4    4    6    5    1
 [5,]    2    2    3    3    6    5    7
 [6,]    1    2    3    4    5    5    1
 [7,]    1    2    4    3    6    6    5
 [8,]    2    1    3    4    5    6    2
 [9,]    2    2    3    4    6    6    4
[10,]    2    2    4    3    6    5    2

but less detectable in the results.

The same results are obtained with R 2.8.1, boot 1.2-35.

Best regards.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
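[The usual defence against sample()'s length-1 surprise is to sample *indices* into the group rather than the group values themselves. The sketch below is only an illustration of that idiom applied to the reported code (safe.array is an invented name, not the boot authors' fix):]

```r
# Sketch only: resample within strata without the sample() length-1 pitfall.
# sample(gp, ...) samples from 1:gp when length(gp) == 1, so index into gp
# explicitly via sample.int() instead.
safe.array <- function(n, R, strata) {
    output <- matrix(0L, R, n)
    for (is in unique(strata)) {
        gp <- seq_len(n)[strata == is]
        # gp[sample.int(length(gp), ...)] is safe even when length(gp) == 1
        output[, gp] <- matrix(gp[sample.int(length(gp), R * length(gp),
                                             replace = TRUE)],
                               nrow = R)
    }
    output
}

safe.array(8, 5, 1:8)  # each column j is now constant, equal to j
```

With one observation per stratum, every resample must reproduce the original data, so each column of the index array should be constant - which the sketch delivers and ordinary.array() above does not.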
Re: [Rd] S4 generic masking S3 generic when using namespace
Try using setGeneric("predict") without further arguments, this should work as it will take the existing 'predict' definition and convert it into S4 generic. This works nicely for me for all plot, print etc methods * R *** R 2.9.0 (svn -r 47821) [/share/research/R-devel/20090203/lib64/R] *** > setGeneric("predict") [1] "predict" > predict standardGeneric for "predict" defined from package "stats" function (object, ...) standardGeneric("predict") Methods may be defined for arguments: object Use showMethods("predict") for currently available ones. Dr Oleg Sklyar Research Technologist AHL / Man Investments Ltd +44 (0)20 7144 3107 oskl...@maninvestments.com > -Original Message- > From: r-devel-boun...@r-project.org > [mailto:r-devel-boun...@r-project.org] On Behalf Of Gad Abraham > Sent: 10 March 2009 10:02 > To: R-devel@r-project.org > Subject: [Rd] S4 generic masking S3 generic when using namespace > > Hi, > > I have two example packages, test1 and test2, where the only code in > them is: > > setGeneric("predict", function(object, ...) > standardGeneric("predict")) > > (get them from http://www.cs.mu.oz.au/~gabraham/test1.tar and > http://www.cs.mu.oz.au/~gabraham/test2.tar) > > The difference between them is that first does not have a > namespace, and > loads fine. The second has a namespace but generates a warning: > > library(test2) > > Attaching package: 'test2' > > > The following object(s) are masked from package:stats : > >predict > > > > Is this an intended behaviour? If I ignore this masking warning, do I > risk unintended consequences later down the track? > > Thanks, > Gad > > > > > sessionInfo() > R version 2.8.1 (2008-12-22) > i386-apple-darwin8.11.1 > > locale: > en_AU.UTF-8/en_AU.UTF-8/C/C/en_AU.UTF-8/en_AU.UTF-8 > > attached base packages: > [1] stats graphics grDevices utils datasets methods base > > other attached packages: > [1] test2_0.1 > > > -- > Gad Abraham > MEng Student, Dept. 
CSSE and NICTA
> The University of Melbourne
> Parkville 3010, Victoria, Australia
> email: gabra...@csse.unimelb.edu.au
> web: http://www.csse.unimelb.edu.au/~gabraham

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] r-devel tarball build failure on windows
Uwe Ligges wrote:
> Hiroyuki Kawakatsu wrote:
>> Hi,
>>
>> On my windows (xp) machine with Rtools29 (excluding cygwin dlls as I
>> have cygwin on my path) -make all recommended- for the latest R-devel
>> tarball (svn revision: 48093) fails when trying to build the
>> recommended packages:
>
> 1. Have you run make rsync-recommended before (i.e. are the packages
> actually there)?
>
> 2. If so, please install the cygwin dlls and try to remove cygwin from
> your path. There may very well be some version conflicts; I cannot build
> R / R packages if a full cygwin installation is around.
>
> Uwe Ligges

This bit apjaworski last week, but the naughty boy didn't include R-devel in the discussion.

It boils down to problems with symlink handling. You unpack the tar file and the .tgz links look like ordinary files with strange contents to other tools.

The workaround is to run

make Rpwd.exe
make link-recommended

(or, maybe, to unpack with a different tar version, but I really don't know).

>> --- Making recommended packages
>> - installing recommended package KernSmooth
>> Warning: invalid package 'KernSmooth.tgz'
>> Error: ERROR: no packages specified
>> make[1]: *** [KernSmooth.ts] Error 1
>> make: *** [recommended] Error 2
>>
>> Looking at R_HOME/src/library/Recommended shows that none of the
>> timestamp files .ts are generated(?) in that directory. If I manually
>> create the empty timestamp files, build completes but -make check-
>> failed. I suspect something with my build tools is not right but what
>> could that be?
>>
>> h.

--
O__ Peter Dalgaard                      Øster Farimagsgade 5, Entr.B
c/ /'_ --- Dept. of Biostatistics       PO Box 2099, 1014 Cph. K
(*) \(*) -- University of Copenhagen    Denmark   Ph: (+45) 35327918
~~ - (p.dalga...@biostat.ku.dk)                   FAX: (+45) 35327907

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] suggestion/request: install.packages and unnecessary file modifications
Dear R-devel,

When 'install.packages' runs, it updates all html files in all packages. Mostly there seems to be no actual change to the html file contents, but the date/time does change. This has been causing me a bit of trouble, because I keep synchronized versions of R on several different machines, and whenever I install a package, many MB of file transfers are required; my slow upload link can't cope.

The culprit appears to be 'utils:::fixup.package.URLs'. Adding the commented lines below, near the end of the function, avoids the unnecessary rewrite.

Mark Bravington
CSIRO
Hobart, Australia

    for (f in files) {
        page <- readLines(f)
        old.page <- page                                          # MVB
        page <- gsub(olddoc, doc, page, fixed = TRUE, useBytes = TRUE)
        page <- gsub(oldbase, base, page, fixed = TRUE, useBytes = TRUE)
        page <- gsub(oldutils, utils, page, fixed = TRUE, useBytes = TRUE)
        page <- gsub(oldgraphics, graphics, page, fixed = TRUE, useBytes = TRUE)
        page <- gsub(oldstats, stats, page, fixed = TRUE, useBytes = TRUE)
        page <- gsub(olddata, datasets, page, fixed = TRUE, useBytes = TRUE)
        page <- gsub(oldgrD, grD, page, fixed = TRUE, useBytes = TRUE)
        page <- gsub(oldmeth, meth, page, fixed = TRUE, useBytes = TRUE)
        if (identical(page, old.page))                            # MVB
            next                                                  # MVB
        out <- try(file(f, open = "w"), silent = TRUE)
        if (inherits(out, "try-error")) {
            warning(gettextf("cannot update '%s'", f), domain = NA)
            next
        }
        writeLines(page, out)
        close(out)
    }
    return(TRUE)
}

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] [SoC09-Idea] cranlab.
Hi everybody,

just another Google Summer of Code project idea.

Best, Manuel.

--

cranlab -- "You can't control what you can't measure" [0]

Mentor: Manuel J. A. Eugster

Summary: The aim of this project is (1) the implementation of software metrics to analyze R packages and (2) the creation of a CRAN software metrics monitor.

Required skills: Good R programming skills. Basic knowledge of software engineering and software metrics measurement is useful.

Description: Software metrics are measures of some properties of software. In software engineering they are used to monitor the improvement of projects; common metrics include 'source lines of code', 'code coverage' or 'software package metrics' (see, e.g., [1]).

The first step of this project is the implementation of an R package which calculates software metrics of R packages. The implementation must be flexible, i.e., a basic set of metrics will be implemented, but others can be added later on.

The second step is the creation of a CRAN software metrics monitor. This means a service which continuously calculates software metrics of CRAN packages and provides the (raw) data. As a first analysis step, a dashboard provides simple basic plots of the data.

Programming exercise: How many functions does the archetypes package [2] have? Write some R code which counts them.

[0] Tom DeMarco. Controlling Software Projects: Management, Measurement and Estimation
[1] http://en.wikipedia.org/wiki/Software_metrics
[2] http://cran.at.r-project.org/web/packages/archetypes/

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
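[One possible answer to the programming exercise, as a sketch: it assumes the archetypes package is installed, and it only counts objects visible on the package's search-path entry, so unexported functions are missed. count_functions is an invented helper name:]

```r
# Count the functions visible in an attached package (sketch).
count_functions <- function(pkg) {
    library(pkg, character.only = TRUE)   # attach the package
    env <- paste0("package:", pkg)        # its search-path entry
    sum(vapply(ls(env),
               function(nm) is.function(get(nm, pos = env)),
               logical(1)))
}

count_functions("archetypes")
```

A fuller metric for the project would parse the package sources instead, so that unexported and S4-registered functions are counted too.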
[Rd] [SoC09-Idea] Integrated debugger
Hello,

Here is an idea for a Google Summer of Code project I am willing to mentor.

Romain

Summary: Create an integrated debugger.

Required skills: R skills. Experience of using a debugger. Front-end skills depending on the chosen front-end(s).

Description: Debugging R code usually involves a lot of work from the command line with the use of functions such as browser, debug, trace, recover. The debug package provides additional debugging functionality and concepts on top of R's internal debugging capabilities: code display, graceful error recovery, line-numbered conditional breakpoints, access to exit code, flow control, and full keyboard input.

The current front-end used by the debug package is based on tcltk, and although tcltk offers universal portability wherever R is installed, it does not compete with current alternatives in terms of user experience. The goal of this project is to create an integrated debugger for R, based on the debug package but coupled with another front-end. Possible front-ends are listed below, ordered by current experience of the mentor:

- biocep [java, swing]: http://biocep-distrib.r-forge.r-project.org/
- sciviews-k [mozilla, javascript]: http://www.sciviews.org/SciViews-K/index.html
- statet [java, eclipse, swt]: http://www.walware.de/goto/statet

If you are interested in the project: if you are coming from an R standpoint, have a look at the debug package and make a few design suggestions about how the package could be modified to support alternative front-ends. If you come from a front-end standpoint, make a few suggestions on how you would present the information.

--
Romain Francois
Independent R Consultant
+33(0) 6 28 91 30 30
http://romainfrancois.blog.free.fr

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] A Design Error (Re: S4 objects for S3 methods)
"JC" == John Chambers on Mon, 09 Mar 2009 09:53:06 -0700:

JC> As Yohan points out, and as we found in testing CRAN packages, there
JC> are a number of examples where programmers have written S3 methods
JC> for S4 classes, such as print.aTest() below.
JC>
JC> This may well have seemed an easy and convenient mechanism, but it
JC> is a design error with potentially disastrous consequences. Note
JC> that this is fundamentally different from an S3 method defined for
JC> an S3 class and inherited by an S4 class that extends the S3 class.
JC>
JC> We haven't decided whether to re-enable this, but if so it will be
JC> with a warning at user call and/or at package check time.
JC>
JC> Please turn such methods into legitimate S4 methods. Usually the
JC> change is a simple call to setMethod(); in some cases, such as this
JC> example, you may need a bit more work, such as a show() method to
JC> call the print.* function.
JC>
JC> DETAILS:
JC>
JC> There are at least two serious flaws in this mechanism, plus some
JC> minor defects.
JC>
JC> 1. S3 dispatch does not know about S4 class inheritance. So if, as a
JC> user of Yohan's code, for example, I define a class
JC>   setClass("bTest", contains = "aTest")
JC> then that class will not inherit any of the S3 methods. In the case
JC> of print, that will be obvious. The disaster waiting to happen is
JC> when the method involves numerical results, in which case I may be
JC> getting erroneous results, with no hint of the problem.
JC>
JC> 2. Conversely, S4 dispatch does not know about the S3 method. So, if
JC> my new class was:
JC>   setClass("bTest", contains = c("aTest", "waffle7"))
JC> suppose waffle7 has some huge inheritance, in the midst of which is
JC> a method for a generic function of importance. I expect to inherit
JC> the method for aTest because it's for a direct superclass, but I
JC> won't, no matter how far up the tree the other method occurs. (Even
JC> if point 1 were fixed.)
JC>
JC> Again, this would be obvious for a print method, but potentially
JC> disastrous elsewhere.
JC>
JC> There are other minor issues, such as efficiency: the S3 method
JC> requires two dispatches and perhaps may do some extra copying. But
JC> 1 and 2 are the killers.
JC>
JC> Just to anticipate a suggestion: Yes, we could perhaps fix 1 by
JC> adding code to the S3 dispatch, but this would ambiguate the
JC> legitimate attempt to handle inherited valid S3 methods correctly.
JC>
JC> John

Dear John,

Thank you for the detailed explanation. I completely understand that it is a design error and that it should be fixed. As you said, it is a matter of using 'setMethod'. My main concern is that such a change happens one month before a new release. This is very little time for developers to make their packages consistent with the new release.

As a side note, I have noticed that it is still possible to define S3 methods for S4 classes which do not contain a super-class like matrix. But the disastrous consequences that you explained would still be possible in my opinion.

setClass("aTest", representation(.Data = "matrix", comment = "character"))
a <- new("aTest", .Data = matrix(1:4, ncol = 2), comment = "aTest")
as.matrix.aTest <- function(x, ...) getDataPart(x)
as.matrix(a)  # returns same S4 object

# but

setClass("bTest", representation(Data = "matrix", comment = "character"))
b <- new("bTest", Data = matrix(1:4, ncol = 2), comment = "hello")
as.matrix.bTest <- function(x, ...) b...@data
as.matrix(b)  # does work

Are you planning to turn this off too?

Again, I understand that developers should conform with the S4 style, but a one month notice is very short in my opinion. Moreover, adding a warning in R CMD check would be the same, because developers won't be able to submit packages since CRAN only accepts packages which pass R CMD check without warnings.
Best regards,
Yohan

--
PhD student
Swiss Federal Institute of Technology Zurich
www.ethz.ch

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
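[For package authors facing this migration, the change John describes is usually mechanical: keep the old print.* workhorse as a plain function and register it through setMethod(). A hedged sketch using the thread's aTest name - the slot contents here are invented for illustration, not taken from Yohan's package:]

```r
setClass("aTest", representation(values = "numeric"))  # illustrative slot

# Old S3 workhorse, kept as an ordinary (non-method) function
print.aTest <- function(x, ...) {
    cat("aTest with", length(x@values), "values\n")
    invisible(x)
}

# Legitimate S4 methods: dispatch now respects S4 inheritance,
# so a later setClass("bTest", contains = "aTest") inherits them.
setMethod("show", "aTest", function(object) print.aTest(object))

new("aTest", values = 1:3)   # auto-printing at top level goes through show()
```

Because the registration is via setMethod(), both of John's flaws go away: S4 subclasses inherit the method, and S4 dispatch can see it.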
Re: [Rd] r-devel tarball build failure on windows
Peter Dalgaard wrote:
> Uwe Ligges wrote:
>> Hiroyuki Kawakatsu wrote:
>>> Hi,
>>>
>>> On my windows (xp) machine with Rtools29 (excluding cygwin dlls as I
>>> have cygwin on my path) -make all recommended- for the latest R-devel
>>> tarball (svn revision: 48093) fails when trying to build the
>>> recommended packages:
>>
>> 1. Have you run make rsync-recommended before (i.e. are the packages
>> actually there)?
>>
>> 2. If so, please install the cygwin dlls and try to remove cygwin from
>> your path. There may very well be some version conflicts; I cannot
>> build R / R packages if a full cygwin installation is around.
>>
>> Uwe Ligges
>
> This bit apjaworski last week, but the naughty boy didn't include
> R-devel in the discussion.
>
> It boils down to problems with symlink handling. You unpack the tar
> file and the .tgz links look like ordinary files with strange contents
> to other tools.
>
> The workaround is to run
>
> make Rpwd.exe
> make link-recommended
>
> (or, maybe, to unpack with a different tar version, but I really don't
> know).

Ah, sure, thanks. I always build from svn sources and hence say

make rsync-recommended
make recommended

If you omit
make rsync-recommended
you will need at least
make link-recommended
which is in fact the same but without the rsync step.

Uwe

>>> --- Making recommended packages
>>> - installing recommended package KernSmooth
>>> Warning: invalid package 'KernSmooth.tgz'
>>> Error: ERROR: no packages specified
>>> make[1]: *** [KernSmooth.ts] Error 1
>>> make: *** [recommended] Error 2
>>>
>>> Looking at R_HOME/src/library/Recommended shows that none of the
>>> timestamp files .ts are generated(?) in that directory. If I manually
>>> create the empty timestamp files, build completes but -make check-
>>> failed. I suspect something with my build tools is not right but what
>>> could that be?
>>>
>>> h.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] surprising behaviour of names<-
playing with 'names<-', i observed the following:

x = 1
names(x)
# NULL
'names<-'(x, 'foo')
# c(foo=1)
names(x)
# NULL

where 'names<-' has a functional flavour (does not change x), but:

x = 1:2
names(x)
# NULL
'names<-'(x, 'foo')
# c(foo=1, 2)
names(x)
# "foo" NA

where 'names<-' seems to perform a side effect on x (destructively modifies x). furthermore:

x = c(foo=1)
names(x)
# "foo"
'names<-'(x, NULL)
names(x)
# NULL
'names<-'(x, 'bar')
names(x)
# "bar" !!!

x = c(foo=1)
names(x)
# "foo"
'names<-'(x, 'bar')
names(x)
# "bar" !!!

where 'names<-' is not only able to destructively remove names from x, but also destructively add or modify them (quite unlike in the first example above).

analogous code but using 'dimnames<-' on a matrix performs a side effect on the matrix even if it initially does not have dimnames:

x = matrix(1,1,1)
dimnames(x)
# NULL
'dimnames<-'(x, list('foo', 'bar'))
dimnames(x)
# list("foo", "bar")

this is incoherent with the first example above, in that in both cases the structure initially has no names or dimnames attribute, but the end result is different in the two examples.

is there something i misunderstand here?

there is another, minor issue with names:

'names<-'(1, c('foo', 'bar'))
# error: 'names' attribute [2] must be the same length as the vector [1]
'names<-'(1:2, 'foo')
# no error

since ?names says that "If 'value' is shorter than 'x', it is extended by character 'NA's to the length of 'x'" (where x is the vector and value is the names vector), the error message above should say that the names attribute must be *at most*, not *exactly*, of the length of the vector.

regards, vQ

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] ?as.POSIXct (PR#13587)
Full_Name: Luca Braglia
Version: 2.8
OS: Windows
Submission from: (NULL) (85.18.136.110)

From ?as.POSIXct:

## SPSS dates (R-help 2006-02-17)
z <- c(10485849600, 10477641600, 10561104000, 10562745600)
as.Date(as.POSIXct(z, origin="1582-10-14", tz="GMT"))
                                         ^^

It should be 15 (Gregorian calendar adoption day, when SPSS starts to count seconds behind dates). With 14, I used a .sav dataset imported with read.spss, and after as.Date(as.POSIXct()) I got (obviously) R.date = SPSS.date - 1

Bye (and thank you for givin' us R)

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] surprising behaviour of names<-
Wacek Kusnierczyk wrote: > playing with 'names<-', i observed the following: > > x = 1 > names(x) > # NULL > 'names<-'(x, 'foo') > # c(foo=1) > names(x) > # NULL > > where 'names<-' has a functional flavour (does not change x), but: > > x = 1:2 > names(x) > # NULL > 'names<-'(x, 'foo') > # c(foo=1, 2) > names(x) > # "foo" NA > > where 'names<-' seems to perform a side effect on x (destructively > modifies x). furthermore: > > x = c(foo=1) > names(x) > # "foo" > 'names<-'(x, NULL) > names(x) > # NULL > 'names<-'(x, 'bar') > names(x) > # "bar" !!! > > x = c(foo=1) > names(x) > # "foo" > 'names<-'(x, 'bar') > names(x) > # "bar" !!! > > where 'names<-' is not only able to destructively remove names from x, > but also destructively add or modify them (quite unlike in the first > example above). > > analogous code but using 'dimnames<-' on a matrix performs a side effect > on the matrix even if it initially does not have dimnames: > > x = matrix(1,1,1) > dimnames(x) > # NULL > 'dimnames<-'(x, list('foo', 'bar')) > dimnames(x) > # list("foo", "bar") > > this is incoherent with the first example above, in that in both cases > the structure initially has no names or dimnames attribute, but the end > result is different in the two examples. > > is there something i misunderstand here? Only the ideology/pragmatism... In principle, R has call-by-value semantics and a function does not destructively modify its arguments(*), and foo(x)<-bar behaves like x <- "foo<-"(x, bar). HOWEVER, this has obvious performance repercussions (think x <- rnorm(1e7); x[1] <- 0), so we do allow destructive modification by replacement functions, PROVIDED that the x is not used by anything else. On the least suspicion that something else is using the object, a copy of x is made before the modification. 
So (A) you should not use code like y <- "foo<-"(x, bar) because (B) you cannot (easily) predict whether or not x will be modified destructively (*) unless you mess with match.call() or substitute() and the like. But that's a different story. -- O__ Peter Dalgaard Øster Farimagsgade 5, Entr.B c/ /'_ --- Dept. of Biostatistics PO Box 2099, 1014 Cph. K (*) \(*) -- University of Copenhagen Denmark Ph: (+45) 35327918 ~~ - (p.dalga...@biostat.ku.dk) FAX: (+45) 35327907 __ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
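[Peter's "copy unless nothing else is using the object" rule can be watched directly with tracemem(), available in R builds with memory profiling enabled (the default on the usual binary distributions). A hedged sketch - the exact points at which copies happen depend on the R version:]

```r
x <- c(foo = 1)
tracemem(x)        # print a message whenever x's memory is duplicated
y <- x             # no copy yet: x and y share the same object
names(y) <- "bar"  # a copy is reported here, so x keeps its "foo" name
names(x)           # still "foo": replacement syntax protected the sharer

# Calling `names<-` directly on a possibly-shared value is exactly the
# unpredictable case Peter warns about; if the functional form is wanted,
# force a fresh copy first:
z <- `names<-`(c(x), "baz")   # c(x) duplicates, so x is never touched
```

This is why the visible behaviour in Wacek's examples flips between "functional" and "destructive": it tracks whether R believed the vector was shared at the moment of modification, not anything about names<- itself.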
Re: [Rd] r-devel tarball build failure on windows
On 3/10/09, Uwe Ligges wrote: > > Peter Dalgaard wrote: > > > Uwe Ligges wrote: > > > > > Hiroyuki Kawakatsu wrote: > > > > > > > Hi, > > > > > > > > On my windows (xp) machine with Rtools29 (excluding cygwin dlls as I > > > > have cygwin on my path) -make all recommended- for the latest R-devel > > > > tarball (svn revision: 48093) fails when trying to build the > > > > recommended packages: > > > > > > > 1. Have you asked make rsync-recommended before (i.e. are the packages > > > actually there)? No, I did not run make rsync-recommended. I never had to before when building from a tarball. And, yes, the packages were there (both .tar.gz and .tgz but not .ts). But as Peter points out below, the symlinks were probably corrupted. > > > 2. If so, please install the cygwin dlls and try to remove cygwin from > > > your path. The may very well be some version conflicts in I cannot build > > > R / R packages if a full cygwin installation is around. I did install them but they looked identical to the ones I have from cygwin. So I removed them before building. > > > Uwe Ligges > > > > This bit apjaworski last week, but the naughty boy didn't include > > R-devel in the discussion > > > > It boils down to problems with symlink handling. You unpack the tar file > > and the .tgz links look like ordinary files with strange contents to > > other tools. > > > > The workaround is to run > > > > make Rpwd.exe > > make link-recommended > > > > (or, maybe, to unpack with a different tar version, but I really don't > > know). Many thanks for this. Once I run these two, -make all recommended- and -make check- completes as expected (still with cygwin on my path). So when you say "workaround", is there something wrong with my tools or the tarball or ...? 
> Ah, sure, thanks, I always build from svn sources and hence say
>
> make rsync-recommended
> make recommended
>
> If you omit
> make rsync-recommended
> you will need at least
> make link-recommended
> which is in fact the same but without the rsync step.
[...]

When I last built R-devel from the tarball on windows (when Rtools29 was identical to Rtools28), -make all recommended- as given in the R-admin manual "just worked". So I did not think of running these additional makes. (The R-admin manual does say to run make link-recommended if you are not using the tarball.)

By the way, I noticed that -make all recommended- still builds the CHM help files for certain packages (e.g. Matrix) even though I set USE_CHM=FALSE in MkRules. Is this expected?

Thanks for all your help.

h.
--
+---
| Hiroyuki Kawakatsu
| Business School, Dublin City University
| Dublin 9, Ireland. Tel +353 (0)1 700 7496

__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] surprising behaviour of names<-
Peter Dalgaard wrote:
> Wacek Kusnierczyk wrote:
>> playing with 'names<-', i observed the following:
>>
>> x = 1
>> names(x)
>> # NULL
>> 'names<-'(x, 'foo')
>> # c(foo=1)
>> names(x)
>> # NULL
>>
>> where 'names<-' has a functional flavour (does not change x), but:
>>
>> x = 1:2
>> names(x)
>> # NULL
>> 'names<-'(x, 'foo')
>> # c(foo=1, 2)
>> names(x)
>> # "foo" NA
>>
>> where 'names<-' seems to perform a side effect on x (destructively
>> modifies x). furthermore:
>>
>> x = c(foo=1)
>> names(x)
>> # "foo"
>> 'names<-'(x, NULL)
>> names(x)
>> # NULL
>> 'names<-'(x, 'bar')
>> names(x)
>> # "bar" !!!
>>
>> x = c(foo=1)
>> names(x)
>> # "foo"
>> 'names<-'(x, 'bar')
>> names(x)
>> # "bar" !!!
>>
>> where 'names<-' is not only able to destructively remove names from x,
>> but also destructively add or modify them (quite unlike in the first
>> example above).
>>
>> analogous code but using 'dimnames<-' on a matrix performs a side effect
>> on the matrix even if it initially does not have dimnames:
>>
>> x = matrix(1,1,1)
>> dimnames(x)
>> # NULL
>> 'dimnames<-'(x, list('foo', 'bar'))
>> dimnames(x)
>> # list("foo", "bar")
>>
>> this is incoherent with the first example above, in that in both cases
>> the structure initially has no names or dimnames attribute, but the end
>> result is different in the two examples.
>>
>> is there something i misunderstand here?
>
> Only the ideology/pragmatism... In principle, R has call-by-value
> semantics and a function does not destructively modify its arguments(*),
> and foo(x)<-bar behaves like x <- "foo<-"(x, bar). HOWEVER, this has
> obvious performance repercussions (think x <- rnorm(1e7); x[1] <- 0), so
> we do allow destructive modification by replacement functions, PROVIDED
> that the x is not used by anything else. On the least suspicion that
> something else is using the object, a copy of x is made before the
> modification.
> So
>
> (A) you should not use code like y <- "foo<-"(x, bar)
>
> because
>
> (B) you cannot (easily) predict whether or not x will be modified
> destructively

that's fine, thanks, but i must be terribly stupid as i do not see how this explains the examples above. where is the x used by something else in the first example, so that 'names<-'(x, 'foo') does *not* modify x destructively, while it does in the other cases?

i just can't see how your explanation fits the examples -- it probably does, but i beg you show it explicitly.

thanks.
vQ

__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
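Peter's rules (A) and (B) can be made concrete with a short sketch (using `names<-` purely for illustration): the replacement *form* is defined to behave like an explicit rebinding, and it is only the direct, result-discarding call whose effect on x is unpredictable.

```r
## The supported replacement form: `names(x) <- value` behaves like
## rebinding x to the result of the replacement function.
x <- c(1, 2)
names(x) <- c("a", "b")                  # the form rule (A) recommends
y <- `names<-`(c(1, 2), c("a", "b"))     # the explicit rebinding it stands for
stopifnot(identical(x, y))

## Rule (B): calling `names<-`(x, value) directly and *discarding* the
## result leaves it unspecified whether x itself ends up named.
```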
Re: [Rd] ?as.POSIXct (PR#13587)
On Tue, 10 Mar 2009, lbrag...@gmail.com wrote:

Full_Name: Luca Braglia
Version: 2.8
OS: Windows
Submission from: (NULL) (85.18.136.110)

From ?as.POSIXct

## SPSS dates (R-help 2006-02-17)
z <- c(10485849600, 10477641600, 10561104000, 10562745600)
as.Date(as.POSIXct(z, origin="1582-10-14", tz="GMT"))
                                        ^^
It should be 15 (the day of Gregorian calendar adoption, from which SPSS counts the seconds underlying dates). With 14, using a .sav dataset imported with read.spss, after as.Date(as.POSIXct()) I got (obviously) R.date = SPSS.date - 1.

Hmm, from the SPSS 'Programming and Data Management' guide: 'Internally, dates and date/times are stored as the number of seconds from October 14, 1582, and times are stored as the number of seconds from midnight.' Now, they might just mean the last second of October 14, 1582, but that is not how many other people have read this (including those in the thread mentioned). Wikipedia for example describes October 15, 1582 as the first day of the Gregorian calendar, which makes 1582-10-14 day zero. Given that this is an example only, I don't think we should change it without quite strong evidence that SPSS's documentation is misleading.

Bye (and thank you for givin'us R)

__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel

--
Brian D. Ripley, rip...@stats.ox.ac.uk
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel: +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK Fax: +44 1865 272595

__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
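The practical stake in the 14-vs-15 question is exactly one day per converted value, which is easy to verify with the example's own data (a sketch reusing the first value of z from the report above):

```r
## Converting the same SPSS second count with the two candidate origins
## shifts the resulting date by exactly one day.
z1  <- 10485849600
d14 <- as.Date(as.POSIXct(z1, origin = "1582-10-14", tz = "GMT"))
d15 <- as.Date(as.POSIXct(z1, origin = "1582-10-15", tz = "GMT"))
stopifnot(d15 - d14 == 1)   # the off-by-one the report describes
```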
Re: [Rd] surprising behaviour of names<-
>> (B) you cannot (easily) predict whether or not x will be modified
>> destructively
>
> that's fine, thanks, but i must be terribly stupid as i do not see how
> this explains the examples above. where is the x used by something else
> in the first example, so that 'names<-'(x, 'foo') does *not* modify x
> destructively, while it does in the other cases?
>
> i just can't see how your explanation fits the examples -- it probably
> does, but i beg you show it explicitly.

I think the following shows what Peter was referring to:

In this case, there is only one pointer to the value of x:

> x <- c(1,2)
> "names<-"(x,"foo")
 foo <NA>
   1    2
> x
 foo <NA>
   1    2

In this case, there are two:

> x <- c(1,2)
> y <- x
> "names<-"(x,"foo")
 foo <NA>
   1    2
> x
[1] 1 2
> y
[1] 1 2

It seems as though `names<-` and the like cannot be treated as R functions (which do not modify their arguments) but as special internal routines which do sometimes modify their arguments.

-s

__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
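A practical corollary of the observation above: when a named copy is wanted and the original must stay untouched, the ordinary call-by-value route through a real function is the predictable one. A sketch (setNames() lives in the stats package):

```r
x <- c(1, 2)
y <- stats::setNames(x, c("a", "b"))   # functional: returns a named copy
stopifnot(is.null(names(x)))           # the caller's x is untouched by contract
stopifnot(identical(names(y), c("a", "b")))
```

Inside setNames(), `names(object) <- nm` runs on the function's own argument, so ordinary call-by-value semantics guarantee the caller's vector is never the one modified.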
Re: [Rd] surprising behaviour of names<-
Stavros Macrakis wrote:
>>> (B) you cannot (easily) predict whether or not x will be modified
>>> destructively
>>
>> that's fine, thanks, but i must be terribly stupid as i do not see how
>> this explains the examples above. where is the x used by something else
>> in the first example, so that 'names<-'(x, 'foo') does *not* modify x
>> destructively, while it does in the other cases?
>>
>> i just can't see how your explanation fits the examples -- it probably
>> does, but i beg you show it explicitly.
>
> I think the following shows what Peter was referring to:
>
> In this case, there is only one pointer to the value of x:
>
> x <- c(1,2)
>> "names<-"(x,"foo")
>  foo <NA>
>    1    2
>> x
>  foo <NA>
>    1    2
>
> In this case, there are two:
>
>> x <- c(1,2)
>> y <- x
>> "names<-"(x,"foo")
>  foo <NA>
>    1    2
>> x
> [1] 1 2
>> y
> [1] 1 2

that is and was clear to me, but none of my examples was of the second form, and hence i think peter's answer did not answer my question. what's the difference here:

x = 1
'names<-'(x, 'foo')
names(x)
# NULL

x = c(foo=1)
'names<-'(x, 'foo')
names(x)
# "foo"

certainly not something like what you show. what's the difference here:

x = 1
'names<-'(x, 'foo')
names(x)
# NULL

x = 1:2
'names<-'(x, c('foo', 'bar'))
names(x)
# "foo" "bar"

certainly not something like what you show.

> It seems as though `names<-` and the like cannot be treated as R
> functions (which do not modify their arguments) but as special
> internal routines which do sometimes modify their arguments.

they seem to behave somewhat like macros: 'names<-'(a, b) with the destructive 'names<-' is sort of replaced with a = 'names<-'(a, b) with a functional 'names<-'. but this still does not explain the incoherence above. my problem was and is not that 'names<-' is not a pure function, but that it sometimes is, sometimes is not, without any obvious explanation. that is, i suspect (not claim) that the behaviour is not a design feature, but an incident.
vQ __ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] surprising behaviour of names<-
Peter Dalgaard wrote:
>
> (*) unless you mess with match.call() or substitute() and the like. But
> that's a different story.

different or not, it is a story that happens quite often -- too often, perhaps -- to the degree that one may be tempted to say that the semantics of argument passing in r is a mess. which of course is not true; but since it is possible to mess with match.call & co, people (including r core) do mess with them, and the result is obviously a mess on top of the otherwise clear call-by-need semantics. on the surface, you cannot tell how the arguments of a function will be taken (by value? by reference? not at all?), which in effect looks like a messy semantics.

vQ

__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] r-devel tarball build failure on windows
This is another of those things which is not yet finished (you will see mention of the removed cross-building scripts in the relevant Makefile.win). Expect it to work from the tarball before GFF in 10 days' time. There's another intermittent problem with dependencies in the current sources that I will commit a fix for shortly. Generally, you should not necessarily expect R to build on Windows from snapshot tarballs prior to GFF: use svn+rsync (as that is what the developers use).

On Tue, 10 Mar 2009, Hiroyuki Kawakatsu wrote:

On 3/10/09, Uwe Ligges wrote: Peter Dalgaard wrote: Uwe Ligges wrote: Hiroyuki Kawakatsu wrote: Hi, On my windows (xp) machine with Rtools29 (excluding cygwin dlls as I have cygwin on my path) -make all recommended- for the latest R-devel tarball (svn revision: 48093) fails when trying to build the recommended packages: 1. Have you asked make rsync-recommended before (i.e. are the packages actually there)? No, I did not run make rsync-recommended. I never had to before when building from a tarball. And, yes, the packages were there (both .tar.gz and .tgz but not .ts). But as Peter points out below, the symlinks were probably corrupted. 2. If so, please install the cygwin dlls and try to remove cygwin from your path. There may very well be some version conflicts involved; I cannot build R / R packages if a full cygwin installation is around. I did install them but they looked identical to the ones I have from cygwin. So I removed them before building. Uwe Ligges This bit apjaworski last week, but the naughty boy didn't include R-devel in the discussion It boils down to problems with symlink handling. You unpack the tar file and the .tgz links look like ordinary files with strange contents to other tools. The workaround is to run

make Rpwd.exe
make link-recommended

(or, maybe, to unpack with a different tar version, but I really don't know). Many thanks for this.
Once I run these two, -make all recommended- and -make check- completes as expected (still with cygwin on my path). So when you say "workaround", is there something wrong with my tools or the tarball or ...? Ah, sure, thanks, I always build from svn sources and hence say make rsync-recommended make recommended If you omit make rsync-recommended you will need at least make link-recommended which is in fact the same but without the rsync step. [...] When I last built R-devel from the tarball on windows (when Rtools29 was identical to Rtools28), -make all recommended- as given in the R-admin manual "just worked". So I did not think of running these additional makes. (The R-admin manual does say to run make link-recommended if you are not using the tarball.) By the way, I noticed that -make all recommended- still builds the CHM help files for certain packages (e.g. Matrix) even though I set USE_CHM=FALSE in MkRules. Is this expected? Thanks for all your help. h. -- +--- | Hiroyuki Kawakatsu | Business School, Dublin City University | Dublin 9, Ireland. Tel +353 (0)1 700 7496 __ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel -- Brian D. Ripley, rip...@stats.ox.ac.uk Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/ University of Oxford, Tel: +44 1865 272861 (self) 1 South Parks Road, +44 1865 272866 (PA) Oxford OX1 3TG, UKFax: +44 1865 272595 __ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] logical comparison of functions (PR#13588)
Full_Name: Michael Aaron Karsh
Version: 2.8.0
OS: Windows XP
Submission from: (NULL) (164.67.71.215)

When I try to say if (method==f), where f is a function, it says that the comparison is only possible for list and atomic types. I tried saying if (method!=f), and it gave the same error message. Would it be possible to repair it so that == and != comparisons are possible for functions?

__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] logical comparison of functions (PR#13588)
On 10/03/2009 4:35 PM, michael_ka...@earthlink.net wrote: Full_Name: Michael Aaron Karsh Version: 2.8.0 OS: Windows XP Submission from: (NULL) (164.67.71.215) When I try to say if (method==f), where f is a function, it says that the comparison is only possible for list and atomic types. I tried saying if (method!=f), and it gave the same error message. Would it be possible to repair it say that == and != comparisons would be possible for functions? This is not a bug. Please don't report things as bugs when they aren't. "==" and "!=" are for atomic vectors, as documented. Use identical() for more general comparisons, as documented on the man page for ==. Duncan Murdoch __ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] logical comparison of functions (PR#13588)
Duncan Murdoch wrote: > On 10/03/2009 4:35 PM, michael_ka...@earthlink.net wrote: >> Full_Name: Michael Aaron Karsh >> Version: 2.8.0 >> OS: Windows XP >> Submission from: (NULL) (164.67.71.215) >> >> >> When I try to say if (method==f), where f is a function, it says that >> the >> comparison is only possible for list and atomic types. I tried >> saying if >> (method!=f), and it gave the same error message. Would it be >> possible to repair >> it say that == and != comparisons would be possible for functions? > > This is not a bug. Please don't report things as bugs when they > aren't. "==" and "!=" are for atomic vectors, as documented. > > Use identical() for more general comparisons, as documented on the man > page for ==. note that in most programming languages comparing function objects is either not supported or returns false unless you compare a function object to itself. r is a notable exception: identical(function(a) a, function(a) a) # TRUE which would be false in all other languages i know; however, identical(function(a) a, function(b) b) # FALSE though they are surely identical functionally. btw. it's not necessarily intuitive that == works only for atomic vectors. vQ __ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
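The point of the exchange above in runnable form (a sketch; the exact error text may vary between R versions):

```r
f <- function(a) a
g <- function(b) b

## "==" on closures is an error, not FALSE, so it must be caught:
cmp <- tryCatch(f == g, error = function(e) "not comparable")
stopifnot(identical(cmp, "not comparable"))

## identical() is the documented way to compare arbitrary objects:
stopifnot(identical(f, f))    # the very same function object
stopifnot(!identical(f, g))   # different formals/bodies, however similar in effect
```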
[Rd] dger_ in BLAS definition
I'm developing some software and am running into compile warnings:

conditionals.c:104: warning: passing argument 4 of 'dger_' discards qualifiers from pointer target type
conditionals.c:104: warning: passing argument 6 of 'dger_' discards qualifiers from pointer target type

The netlib documentation states that the arguments x and y should be unchanged on exit. That should imply the definition:

F77_NAME(dger)(const int * const m, const int * const n, const double * const alpha,
               double const * const x, const int * const incx,
               double const * const y, const int * const incy,
               double * const a, const int * const lda);

The current definition is missing the appropriate consts:

F77_NAME(dger)(const int *m, const int *n, const double *alpha,
               double *x, const int *incx, double *y, const int *incy,
               double *a, const int *lda);

I don't want my code compiling with warnings that shouldn't be there. Are there suggestions for how to work around this? Thanks

[[alternative HTML version deleted]]

__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] surprising behaviour of names<-
i got an offline response saying that my original post may not have been clear as to what the problem was, essentially, and that i may need to restate it in words, in addition to code.

the problem is: the behaviour of 'names<-' is incoherent, in that in some situations it acts in a functional manner, producing a copy of its argument with the names changed, while in others it changes the object in place (and returns it), without copying first.

your explanation below is of course valid, but does not seem to address the issue. in the examples below, there is always (or so it seems) just one reference to the object. why are the following functional:

x = 1; 'names<-'(x, 'foo'); names(x)
x = 'foo'; 'names<-'(x, 'foo'); names(x)

while these are destructive:

x = c(1); 'names<-'(x, 'foo'); names(x)
x = c('foo'); 'names<-'(x, 'foo'); names(x)

it is claimed that in r a singular value is a one-element vector, and indeed,

identical(1, c(1)) # TRUE
all.equal(is(1), is(c(1))) # TRUE

i also do not understand the difference here:

x = c(1); 'names<-'(x, 'foo'); names(x) # "foo"
x = c(1); names(x); 'names<-'(x, 'foo'); names(x) # "foo"
x = c(1); print(x); 'names<-'(x, 'foo'); names(x) # NULL
x = c(1); print(c(x)); 'names<-'(x, 'foo'); names(x) # "foo"

does print, but not names, increase the reference count for x when applied to x, but not to c(x)? if the issue is that there is, in those examples where x is left unchanged, an additional reference to x that causes the value of x to be copied, could you please explain how and when this additional reference is created?

thanks,
vQ

Peter Dalgaard wrote:
>
>> is there something i misunderstand here?
>
> Only the ideology/pragmatism... In principle, R has call-by-value
> semantics and a function does not destructively modify its arguments(*),
> and foo(x)<-bar behaves like x <- "foo<-"(x, bar).
HOWEVER, this has > obvious performance repercussions (think x <- rnorm(1e7); x[1] <- 0), so > we do allow destructive modification by replacement functions, PROVIDED > that the x is not used by anything else. On the least suspicion that > something else is using the object, a copy of x is made before the > modification. > > So > > (A) you should not use code like y <- "foo<-"(x, bar) > > because > > (B) you cannot (easily) predict whether or not x will be modified > destructively > > > (*) unless you mess with match.call() or substitute() and the like. But > that's a different story. > > > -- --- Wacek Kusnierczyk, MD PhD Email: w...@idi.ntnu.no Phone: +47 73591875, +47 72574609 Department of Computer and Information Science (IDI) Faculty of Information Technology, Mathematics and Electrical Engineering (IME) Norwegian University of Science and Technology (NTNU) Sem Saelands vei 7, 7491 Trondheim, Norway Room itv303 Bioinformatics & Gene Regulation Group Department of Cancer Research and Molecular Medicine (IKM) Faculty of Medicine (DMF) Norwegian University of Science and Technology (NTNU) Laboratory Center, Erling Skjalgsons gt. 1, 7030 Trondheim, Norway Room 231.05.060 __ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
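The copy-vs-in-place question raised in the thread above can be probed directly on builds of R compiled with memory profiling, where tracemem() reports duplications. A sketch (tracemem() errors on builds without that capability, and the copy/no-copy outcome varies across R versions and contexts, so no particular outcome is claimed):

```r
x <- c(1)
## tracemem() returns the object's address string, or errors on builds
## configured without memory profiling; tryCatch keeps the sketch portable.
addr <- tryCatch(tracemem(x), error = function(e) NULL)
if (!is.null(addr)) {
  invisible(`names<-`(x, "foo"))
  ## a "tracemem[...]" line printed above means x was duplicated before
  ## the names were attached; silence means an in-place modification
  untracemem(x)
}
## either way the value is intact; only the presence of names varies:
stopifnot(identical(unname(x), 1))
stopifnot(is.null(names(x)) || identical(names(x), "foo"))
```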
Re: [Rd] dger_ in BLAS definition
Yes, the x and y arguments are unchanged on exit; cf. http://www.mathkeisan.com/UsersGuide/man/dger.html

It is up to the R core team to update those files, but I fear there are other functions which are not well declared. Would you agree to take a look at the BLAS.h file? It would be very useful if you have time to do it. (A year ago, I checked the lapack.h file and there were some wrong declarations, such as for the zgecon function.)

Christophe

On 10 Mar 09, at 22:49, Andrew Redd wrote:

I'm developing some software and running into compiling warning: conditionals.c:104: warning: passing argument 4 of 'dger_' discards qualifiers from pointer target type conditionals.c:104: warning: passing argument 6 of 'dger_' discards qualifiers from pointer target type the netlib documentation states that the arguments x and y should be unchanged on exit. That should imply the definition: F77_NAME(dger)(const int * const m, const int * const n, const double * const alpha, double const * const x, const int *const incx, double const * const y, const int *const incy, double * const a, const int * const lda); the current definition is missing the appropriate consts: F77_NAME(dger)(const int *m, const int *n, const double *alpha, double *x, const int *incx, double *y, const int *incy, double *a, const int *lda); I don't want my code compiling with warnings that shouldn't be there. Are there suggestions of how to work around this? Thanks [[alternative HTML version deleted]]

__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel

--
Christophe Dutang
Ph. D. student at ISFA, Lyon, France
website: http://dutangc.free.fr

__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] S4 generic masking S3 generic when using namespace
Sklyar, Oleg (London) wrote:
> Try using setGeneric("predict") without further arguments; this should
> work as it will take the existing 'predict' definition and convert it
> into an S4 generic. This works nicely for me for all plot, print etc.
> methods.
>
> *** R 2.9.0 (svn -r 47821) [/share/research/R-devel/20090203/lib64/R] ***
>
> > setGeneric("predict")
> [1] "predict"

Great, thanks.

--
Gad Abraham
MEng Student, Dept. CSSE and NICTA
The University of Melbourne
Parkville 3010, Victoria, Australia
email: gabra...@csse.unimelb.edu.au
web: http://www.csse.unimelb.edu.au/~gabraham

__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] bug (PR#13570)
Many thanks Brian for tracking this down. Was it fixed by

c     next line is not in current dloess
      goto 7

in ehg136? If this needs to be in the netlib version as well, we should inform Eric Grosse.

While we're at it, there are a few more inconsistencies (not nearly as serious as PR#13570, so I hesitate to call them bugs) regarding the definition of leaf cell membership (certain .lt. should be .le.) in ehg128, ehg137, and ehg138 (not currently used); it seems I neglected to mention these to Eric. If you are interested in these I can submit a patch and will notify Eric as well.

Finally, perhaps now is as good a time as any to point out that in the documentation, the bit about cross-terms in

\item{drop.square}{for fits with more than one predictor and \code{degree=2}, should the quadratic term (and cross-terms) be dropped for particular predictors?

is incorrect -- cross terms are not dropped in this implementation of loess.

Thanks again, Ben

Prof Brian Ripley wrote: I've found the discrepancy, so the patched code from current dloess is now available in R-patched and R-devel. On Fri, 6 Mar 2009, Prof Brian Ripley wrote: On Thu, 5 Mar 2009, Benjamin Tyner wrote: Hi Nice to hear from you Ryan. I also do not have the capability to debug on windows; however, there is a chance that the behavior you are seeing is caused by the following bug noted in my thesis (available on ProQuest; email me if you don't have access): "When lambda = 0 there are no local slopes to aid the blending algorithm, yet the interpolator would still assume they were available, and thus use arbitrary values from memory. This had implications for both fit and tr[L] computation. In the updated code these are set equal to zero which seems the best automatic rule when lambda = 0."
[lambda refers to degree] I submitted a bug fix to Eric Grosse, the maintainer of the netlib routines; the fixed lines of fortran are identified in the comments at (just search for my email address): http://www.netlib.org/a/loess These fixes would be relatively simple to incorporate into R's version of loessf.f

The fixes from dloess even more simply, since R's code is based on dloess.

Thank you for the suggestion. Given how tricky this is to reproduce, I went back to my example under valgrind. If I use the latest dloess code, it crashes, but by selectively importing some of the differences I can get it to work. So it looks as if we are on the road to a solution, but something in the current version (not necessarily in these changes) is incompatible with the current R code and I need to dig further (not for a few days).

Alternatively, a quick check would be for someone to compile the source package at https://centauri.stat.purdue.edu:98/loess/loess_0.4-1.tar.gz and test it on windows. Though this package incorporates this and a few other fixes, please be aware that the routines are converted to C and thus there is a slight performance hit compared to the fortran.

Hope this helps, Ben [...]

--
Brian D. Ripley, rip...@stats.ox.ac.uk
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel: +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK Fax: +44 1865 272595

__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] libf95.a: could not read symbols?
I'm sorry for having to post this, but I've run out of ideas. I've been trying to build R-2.8.1 from source for installation on FreeBSD 6.4 (it seems to be working fine on osx) and keep getting the same results, regardless of how I set ./configure

$ ./configure --enable-R-shlib --with-x=no --with-blas FFLAGS="-fpic"

R is now configured for x86_64-unknown-freebsd6.0

  Source directory:          .
  Installation directory:    /usr/local
  C compiler:                gcc -std=gnu99 -g -O2
  Fortran 77 compiler:       f95 -fpic
  C++ compiler:              g++ -g -O2
  Fortran 90/95 compiler:    g95 -g -O2
  Obj-C compiler:            -g -O2
  Interfaces supported:      tcltk
  External libraries:        readline
  Additional capabilities:   PNG, JPEG, TIFF, iconv, MBCS, NLS
  Options enabled:           shared R library, shared BLAS, R profiling
  Recommended packages:      yes

$ make generates the following results:

$ make
creating src/scripts/R.fe
config.status: creating src/include/config.h
config.status: src/include/config.h is unchanged
Rmath.h is unchanged
gcc -std=gnu99 -shared -L/usr/local/lib -o libRblas.so blas.o cmplxblas.o -L/usr/local/lib/gcc-lib/x86_64-portbld-freebsd6.0/4.0.3 -lf95 -lm # xerbla.o
/usr/bin/ld: /usr/local/lib/gcc-lib/x86_64-portbld-freebsd6.0/4.0.3/libf95.a(ff.o): relocation R_X86_64_32S can not be used when making a shared object; recompile with -fPIC
/usr/local/lib/gcc-lib/x86_64-portbld-freebsd6.0/4.0.3/libf95.a: could not read symbols: Bad value
*** Error code 1
Stop in /usr/home/hamannj/R-2.8.1/src/extra/blas.
*** Error code 1
Stop in /usr/home/hamannj/R-2.8.1/src/extra/blas.
*** Error code 1
Stop in /usr/home/hamannj/R-2.8.1/src/extra.
*** Error code 1
Stop in /usr/home/hamannj/R-2.8.1/src.
*** Error code 1
Stop in /usr/home/hamannj/R-2.8.1.
you have mail
$

Is this a problem with g95, gcc(s), or something I'm not doing correctly? Should I try to build an earlier version instead? I've tried building the g95 freebsd port (not from source) and can't seem to get any traction... any ideas? Thanks.
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] [SoC09-Idea] RQuantLib
RQuantLib -- Bridging R and QuantLib

Mentor: Dirk Eddelbuettel

Summary: The goal of this Summer of Code project is to a) extend the coverage of QuantLib [1] code available to R by adding more wrapper functions to RQuantLib [2], and to b) provide additional functionality to QuantLib by leveraging the numerous statistical facilities in R -- this could be anything from standard to robust estimation methods, data visualization or report creation via tools like Sweave.

Required skills: Good R and C++ programming skills. At least some familiarity with basic open source tools like svn, make, ... is beneficial as well. Some understanding of financial economics may be helpful but is not required.

Description: QuantLib, the premier free/open-source library for modeling, trading, and risk management, provides a comprehensive software framework for quantitative finance. QuantLib has been developed since Nov 2000 and is now approaching an initial 1.0 release, at which point the API will be frozen. This makes it a good point in time to start building more code on top of the API. RQuantLib, first released in 2002 as a proof-of-concept, provides a subset of the available QuantLib functionality. Many more asset classes and methods are now available. This Summer of Code project provides ample scope for a student to first learn about possible extensions to RQuantLib, to learn about interfaces from R to underlying libraries and back, and to then design, architect and implement some meaningful extension.

Programming exercise: Take the current RQuantLib package and provide a new function that exposes functionality from QuantLib to R, preferably with a tests/ file and a help file.

References:
[1] http://www.quantlib.org
[2] http://cran.r-project.org/web/packages/RQuantLib/index.html as well as http://r-forge.r-project.org/projects/rquantlib/ as well as http://dirk.eddelbuettel.com/code/rquantlib.html

--
Three out of two people have difficulties with fractions.
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] bug (PR#13570)
On Tue, 10 Mar 2009, Benjamin Tyner wrote: Many thanks Brian for tracking this down. Was it fixed by c next line is not in current dloess goto 7 in ehg136? If this needs to be in the netlib version as well, we should inform Eric Grosse. The difference was in the argument list of one of the functions (ehg124?). It was 'just' a question of looking at 354 diff sections, not all of which I understood, including that commented above. While we're at it, there are a few more inconsistencies (not nearly as serious as PR#13570 so I hesitate to call them bugs) regarding the definition of leaf cell membership (certain .lt. should be .le. ) in ehg128, ehg137, and ehg138 (not currently used); it seems I neglected to mention these to Eric. If you are interested in these I can submit a patch and will notify Eric as well. Please do let me know and I'll merge in. Finally, perhaps now is as good a time as any to point out that in the documentation, the bit about cross-terms in \item{drop.square}{for fits with more than one predictor and \code{degree=2}, should the quadratic term (and cross-terms) be dropped for particular predictors? is incorrect -- cross terms are not dropped in this implementation of loess. Thanks, I will incorporate that. Thanks again, Ben Prof Brian Ripley wrote: I've found the discrepancy, so the patched code from current dloess is now available in R-patched and R-devel. On Fri, 6 Mar 2009, Prof Brian Ripley wrote: On Thu, 5 Mar 2009, Benjamin Tyner wrote: Hi Nice to hear from you Ryan. I also do not have the capability to debug on windows; however, there is a chance that the behavior you are seeing is caused by the following bug noted in my thesis (available on ProQuest; email me if you don't have access): "When lambda = 0 there are no local slopes to aid the blending algorithm, yet the interpolator would still assume they were available, and thus use arbitrary values from memory. This had implications for both fit and tr[L] computation. 
In the updated code these are set equal to zero which seems the best automatic rule when lambda = 0." [lambda refers to degree] I submitted a bug fix to Eric Grosse, the maintainer of the netlib routines; the fixed lines of fortran are identified in the comments at (just search for my email address): http://www.netlib.org/a/loess These fixes would be relatively simple to incorporate into R's version of loessf.f The fixes from dloess even more simply, since R's code is based on dloess. Thank you for the suggestion. Given how tricky this is to reproduce, I went back to my example under valgrind. If I use the latest dloess code, it crashes, but by selectively importing some of the differences I can get it to work. So it looks as if we are on the road to a solution, but something in the current version (not necessarily in these changes) is incompatible with the current R code and I need to dig further (not for a few days). Alternatively, a quick check would be for someone to compile the source package at https://centauri.stat.purdue.edu:98/loess/loess_0.4-1.tar.gz and test it on windows. Though this package incorporates this and a few other fixes, please be aware that it the routines are converted to C and thus there is a slight performance hit compared to the fortran. Hope this helps, Ben [...] -- Brian D. Ripley, rip...@stats.ox.ac.uk Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/ University of Oxford, Tel: +44 1865 272861 (self) 1 South Parks Road, +44 1865 272866 (PA) Oxford OX1 3TG, UKFax: +44 1865 272595 -- Brian D. Ripley, rip...@stats.ox.ac.uk Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/ University of Oxford, Tel: +44 1865 272861 (self) 1 South Parks Road, +44 1865 272866 (PA) Oxford OX1 3TG, UKFax: +44 1865 272595 __ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] libf95.a: could not read symbols?
Where did FFLAGS come from here (it looks like you meant FPICFLAGS)? But that will only postpone the problem: to build R as a shared library you need PIC libraries, and your Fortran library is apparently not PIC (gcc does not generate PIC code by default on x86_64, and g95, as a gcc derivative, is presumably the same). Even if you don't build R as a shared library, the problem is likely to crop up in some packages (probably including stats). gcc 4.0.3 is rather old (and rather too early in the gcc4 series). Can you get a later compiler suite (preferably gcc + gfortran)?

On Tue, 10 Mar 2009, Jeff Hamann wrote:

I'm sorry for having to post this, but I've run out of ideas. I've been trying to build R-2.8.1 from source for installation on FreeBSD 6.4 (it seems to be working fine on OS X) and keep getting the same results, regardless of how I run ./configure:

$ ./configure --enable-R-shlib --with-x=no --with-blas FFLAGS="-fpic"

R is now configured for x86_64-unknown-freebsd6.0

  Source directory:          .
  Installation directory:    /usr/local
  C compiler:                gcc -std=gnu99 -g -O2
  Fortran 77 compiler:       f95 -fpic
  C++ compiler:              g++ -g -O2
  Fortran 90/95 compiler:    g95 -g -O2
  Obj-C compiler:            -g -O2
  Interfaces supported:      tcltk
  External libraries:        readline
  Additional capabilities:   PNG, JPEG, TIFF, iconv, MBCS, NLS
  Options enabled:           shared R library, shared BLAS, R profiling
  Recommended packages:      yes

$ make generates the following results:

$ make
creating src/scripts/R.fe
config.status: creating src/include/config.h
config.status: src/include/config.h is unchanged
Rmath.h is unchanged
gcc -std=gnu99 -shared -L/usr/local/lib -o libRblas.so blas.o cmplxblas.o -L/usr/local/lib/gcc-lib/x86_64-portbld-freebsd6.0/4.0.3 -lf95 -lm # xerbla.o
/usr/bin/ld: /usr/local/lib/gcc-lib/x86_64-portbld-freebsd6.0/4.0.3/libf95.a(ff.o): relocation R_X86_64_32S can not be used when making a shared object; recompile with -fPIC
/usr/local/lib/gcc-lib/x86_64-portbld-freebsd6.0/4.0.3/libf95.a: could not read symbols: Bad value
*** Error
code 1
Stop in /usr/home/hamannj/R-2.8.1/src/extra/blas.
*** Error code 1
Stop in /usr/home/hamannj/R-2.8.1/src/extra/blas.
*** Error code 1
Stop in /usr/home/hamannj/R-2.8.1/src/extra.
*** Error code 1
Stop in /usr/home/hamannj/R-2.8.1/src.
*** Error code 1
Stop in /usr/home/hamannj/R-2.8.1.
you have mail
$

Is this a problem with g95, gcc(s), or something I'm not doing correctly? Should I try to build an earlier version instead? I've tried building the g95 FreeBSD port (not from source) and can't seem to get any traction... any ideas? Thanks.

--
Brian D. Ripley, rip...@stats.ox.ac.uk
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel: +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK  Fax: +44 1865 272595