[Rd] transpose of complex matrices in R
When one is working with complex matrices, "transpose" very nearly always means *Hermitian* transpose, that is, A[i,j] <- Conj(A[j,i]). One often writes A^* for the Hermitian transpose. I believe that I have actually (many years ago) used a true complex transpose, but I agree that one more often needs the conjugate transpose. I would not be in favour of changing t() because I feel transpose means transpose -- certainly where there are non-square matrices. But I'd be in favour of adding a conjt() (or some similar function) that does the conjugate transpose efficiently. JN __ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
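For concreteness, a minimal sketch of what such a helper could look like (the name conjt() is only the suggestion above, not an existing base R function, and the test matrix is my own illustration):

conjt <- function(A) Conj(t(A))
# check against the elementwise definition: entry (i,j) of the
# conjugate transpose is Conj(A[j,i])
A <- matrix(complex(real = 1:4, imaginary = 4:1), nrow = 2)
identical(conjt(A)[1, 2], Conj(A[2, 1]))   # TRUE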
Re: [Rd] iteration count in optim()
Date: Fri, 6 Aug 2010 11:14:37 +0200
From: Christophe Dutang
To: r-devel@r-project.org
Subject: [Rd] on the optim function
Message-ID: <7e004a07-03e1-4ded-a506-6c564edb6...@gmail.com>
Content-Type: text/plain; charset=us-ascii

Dear useRs,

I have just discovered that the R optim function does not return the number of iterations. I still wonder why, at lines 632-634 of the optim C code, the iter variable is not returned (for the BFGS method, for example). Is there any trick to compute the iteration number from the function call count?

Kind regards
Christophe
--
Christophe Dutang
Ph.D. student at ISFA, Lyon, France
website: http://dutangc.free.fr

For BFGS the number of iterations is the number of gradient evaluations, i.e., counts[2]. For most of the optimization tools available for R (not just in optim), these counts are a nightmare for developers who want some consistent naming and meaning, as Ravi Varadhan and I can attest from our efforts to build the optimx() wrapper. It can also be debated for some methods what constitutes an "iteration".

Best, JN
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
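A small illustration of reading those counts from an optim() result (the quadratic test function here is my own, purely for demonstration):

f <- function(x) sum((x - 1:3)^2)            # toy objective, minimum at c(1, 2, 3)
res <- optim(rep(0, 3), f, method = "BFGS")
res$counts                                    # named vector: function and gradient evaluations
res$counts[["gradient"]]                      # for BFGS, the usual proxy for the iteration count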
Re: [Rd] BLAS benchmarks on R 2.12.0 and related performance issues
This thread pointed out that the "plain vanilla" library for linear algebra outperformed the fancy ones for the original poster -- and he had mentioned this, but still got "you ought to ..." advice that was inappropriate and ignored his stated experience. I've been surprised sometimes myself by performance results, and I'm now getting wary of making pronouncements.

I always thought statistics was about using data sensibly to uncover truths. This is a case where we can actually get the data pretty easily. Perhaps the R posting guide needs an addition to say "When discussing performance, you really should present some data and not just speculation."

Ultimately we need good performance benchmarks. They are difficult to set up properly and tedious to run. Maybe a good subject for a Google Summer of Code project for next year, or some undergraduate projects.

JN
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
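Even a crude timing like the sketch below is more informative than speculation (the matrix size and repetition count are arbitrary choices for illustration; the numbers will of course depend on the BLAS R is linked to):

set.seed(1)
n <- 500
A <- matrix(rnorm(n * n), n, n)
system.time(for (i in 1:20) B <- crossprod(A))  # elapsed time under the BLAS in use
R.version.string                                 # worth reporting alongside which BLAS was used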
Re: [Rd] DBLEPR?
I normally see the digest once per day, but got a msg from Doug Bates so am responding with context.

UCMINF is a package on CRAN that implements a variable metric minimizer. It does quite well on unconstrained problems. Stig Mortensen packaged the Fortran version for R, but is not at the moment responding to emails. There's also a Matlab version. We have it in optimx and on some occasions it just stops if we set trace>0. Other times we can't get it to fail. My guess is something like an undefined variable or a bad declaration of a pointer, but C and C++ are not languages I've much experience with.

JN
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] DBLEPR?
We've tried to contact Stig since July. Possibly he changed emails.

My thought was to use Rprintf as suggested and was looking into doing that to see if our optimx problems would go away. Will keep it as an open issue while we give a bit more time for response, and welcome input from others.

JN

On 11/16/2010 04:52 PM, Douglas Bates wrote:
> On Tue, Nov 16, 2010 at 2:35 PM, Prof. John C Nash wrote:
>> I normally see digest once per day, but got msg from Doug Bates so
>> responding with context.
>
>> UCMINF is a package on CRAN that implements a variable metric minimizer.
>
> A pedant might point out that the package is called "ucminf".
>
>> It does quite well on unconstrained problems. Stig Mortensen packaged the
>> Fortran version for R, but is not at moment responding to emails. There's
>> also a Matlab version. We have it in optimx and get some occasions where
>> it just stops if we set trace>0. Other times we can't get it to fail. My
>> guess is something like an undefined variable or a bad declaration of a
>> pointer, but C and C++ are not languages I've much experience with.
>
> Well, it doesn't work well for me because my version of gfortran (GNU
> Fortran (Ubuntu/Linaro 4.4.4-14ubuntu5) 4.4.5) objects to the format
> strings in some of the Fortran WRITE statements.
>
> The recommended approach is to avoid all Fortran I/O including writing
> to a Fortran character array. As there are only 3 such WRITE statements
> in the Fortran code it would be very simple to replace them with calls
> to C functions that in turn call Rprintf. However, it would be best if
> Stig could take ownership of the modifications.

__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] DBLEPR?
My reaction is leaning heavily towards "Virtuoso!" as opposed to "Show Off!". Thanks very much.

JN

On 11/16/2010 05:39 PM, Douglas Bates wrote:
> Try this.
>
> [...]

__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] [R] DBLEPR?
Thanks for comments from several folk, a fix from Doug Bates, and a start toward finding a new email address for the ucminf maintainer.

Summary of responses:

- DBLEPR and its relations are briefly mentioned, but with no or minimal examples, in Writing R Extensions as well as in the Venables and Ripley book. I should have emphasized that I was especially seeking examples, and Brian Ripley has pointed out that cluster, mda and tweedie, among others, use these routines.
- Fortran WRITE or PRINT statements, even if unused by the code, may interfere with C output, especially on Windows. (See the R-devel post from Brian Ripley.)
- Doug Bates provided a different approach and code that appears to fix ucminf.
- We have a lead on contacting the maintainer, who has moved, so hopefully the package can be updated.

My conclusion from this: the xxxPR routines are still usable, but likely not the best approach for output from Fortran routines.

Once again, thanks to all.

JN
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] Competing with one's own work
No, this is not about Rcpp, but a comment in that overly long discussion raised a question that has been in my mind for a while: one may have work that is used in R's base functionality and there are improvements that should be incorporated. For me, this concerns the BFGS, Nelder-Mead and CG options of optim(), which are based on the 1990 edition (Pascal codes) of my 1979 book "Compact numerical methods...", which were themselves derived from other people's work. By the time Brian Ripley took that work (with permission, even though it was not strictly required; thanks!) there were already some improvements to these same algorithms (mainly bounds and masks) in the BASIC codes of the 1987 book by Mary Walker-Smith and me. However, BASIC to R is not something I'd wish on anyone.

Now there are some R packages, including some I've been working on, that do offer improvements on the optim() offerings. I would not say mine are yet fully ready for incorporation into the base, but they are pretty close. Equally, I think some of the tools in the base should be deprecated and users encouraged to try other routines. It is also getting more and more important that novice users be provided with sensible guidance and robust default settings and choices. In many areas, users are faced with more choice than is efficient for the majority of problems.

My question is: how should such changes be suggested / assisted? It seems to me that this is beyond a simple feature request. Some discussion of pros and cons would be appropriate, and those like myself who are familiar with particular tools can and should offer help. Alternatively, is there a document available in the style of "Writing R Extensions" with a title like "How the R Base Packages are Updated"? A brief search was negative.

I'm happy to compete with my own prior work to provide improvements. It would be nice to see some of those improvements become the benchmark for further progress.

Best, John Nash
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] Google Summer of Code 2011
The 2011 Google Summer of Code will soon be open for organizations to submit potential projects for which students may apply (with detailed plans) for funding. We have some proposals in process at http://rwiki.sciviews.org/doku.php?id=developers:projects:gsoc2011 Note that projects do need to have mentors. Google has so far had a 1 mentor per project policy, but informally there have been support teams. Claudia Beleites and I are also administering the gsoc-r discussion on google groups mentioned at the bottom of the page linked above. Thanks to Dirk E. who has run this up to now and passed the torch to us. John Nash __ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] Google Summer of Code 2011 - credit where it is due
In my reminder that GSoC project proposals are requested (on the R wiki developers' projects page for GSoC 2011), I mentioned that Dirk Eddelbuettel had acted as leader for the R Foundation's GSoC activity prior to handing the torch to Claudia Beleites and me for this year. I should have mentioned that for 2009 Manuel Eugster wore the hat, and in 2008 Friedrich Leisch. GSoC has been running since 2005, but we were not involved for the first three years.

JN
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] return(); was Suggestions for "good teaching" packages
I tend to code with return(), at least in development, because I once stepped in the cowpat of building ans <- list(), then setting attr(ans, ...), and forgetting a final ans, so I got back only part of what I wanted. Perhaps it's just my thinking style, but I agree with others who suggest that it's not such a bad idea to be explicit about what one is doing. I prefer pedestrian code that I can understand easily and quickly fix or modify rather than highly optimized and uncommented brilliance that I cannot reuse.

Given the overhead of return(), I'll likely switch to ending functions with ans # return(ans) to make my programs clear, especially to non-R folk migrating in.

I have also been writing optimization functions. Modularizing might be a nice student exercise, as well as avoiding early return()s, but Canada isn't wide enough for all the indents of the else clauses when methods crash at different stages and we want to return a very simple structure with partial data etc. Reminds me of the great "GOTO" debate some 30+ years ago.

JN
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
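To make the two styles concrete, a small sketch (the toy function and its names are mine, not from the discussion):

f_explicit <- function(x) {
  ans <- list(value = sum(x))
  attr(ans, "n") <- length(x)
  return(ans)              # explicit return
}
f_implicit <- function(x) {
  ans <- list(value = sum(x))
  attr(ans, "n") <- length(x)
  ans                      # return(ans): the last evaluated value is returned
}
identical(f_explicit(1:5), f_implicit(1:5))   # TRUE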
[Rd] Default values in control list of function
In building a function for a package, I'd like to set the defaults in a control list, e.g.,

makeg <- function(parameters, eps = 1.0e-7,
                  control = list(showwork = TRUE, rubbish = 1.0e+101)) {
   # etc.
}

This does not provide showwork or rubbish within the function if control is supplied but not fully specified: the caller's list replaces the whole default list. Copying others, I've previously set the control defaults within the function and then matched names, and that certainly works. I'm happy to keep using that approach, but would like a pointer to good practice for doing this (I'm prepared to write or help write a short tutorial if none exists). I can understand that trying to do the default replacement at a second or greater level of nesting could give problems. A search through the language definition and some googling didn't turn up any obvious warning about this, but I'm sure there are some comments somewhere. Can anyone suggest some URLs?

One gets used to the nicety of being able to specify defaults and forgets that even R has some limits.

JN
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
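One common idiom for merging user values into the defaults is sketched below, using modifyList() from the utils package (the names showwork and rubbish come from the example above; this is one approach, not the only one):

makeg <- function(parameters, eps = 1.0e-7, control = list()) {
  defaults <- list(showwork = TRUE, rubbish = 1.0e+101)
  control <- modifyList(defaults, control)   # user-supplied entries override defaults
  if (control$showwork) cat("working with eps =", eps, "\n")
  control                                    # returned here just to show the merged settings
}
makeg(1:3, control = list(showwork = FALSE))$rubbish   # the default 1e+101 is retained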
Re: [Rd] identical(0, -0)
I'll save space and not include previous messages. My 2 cents: At the very least the documentation needs a fix. If it is easy to do, then Ted Harding's suggestion of a switch (default OFF) to check for sign difference would be sensible. I would urge inclusion in the documentation of the +0, -0 example(s) if there is NOT a way in R to distinguish these. There are occasions where it is useful to be able to detect things like this (and NaN and Inf and -Inf etc.). They are usually not of interest to users, but sometimes are needed for developers to check edge effects. For those cases it may be time to consider a package FPIEEE754 or some similar name to allow testing and possibly setting of flags for some of the fancier features. Likely used by just a few of us in extreme situations. Unfortunately, some of the nice exception handling that was suggested in the standard is apparently rarely implemented in compilers. For info, I was actually a voting member of IEEE 754 because I found a nice "feature" in the IBM Fortran G arithmetic, though I never sat in on any of the meetings. JN __ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
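For anyone trying this at the console, a small illustration of the behaviour under discussion (the helper at the end is hypothetical, just to show the 1/x trick):

identical(0, -0)   # TRUE: identical() does not report the sign of zero
0 == -0            # TRUE as well
c(1/0, 1/-0)       # Inf, -Inf: division exposes the sign bit
is.negative.zero <- function(x) x == 0 && identical(1/x, -Inf)   # hypothetical helper
is.negative.zero(-0)   # TRUE
is.negative.zero(0)    # FALSE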
[Rd] Non-GPL packages for R
Packages that are not licensed in a way that permits redistribution on CRAN are frequently a source of comment and concern on R-help and other lists. A good example of this problem is the Rdonlp2 package, which has caused a lot of annoyance for a number of optimization users in R. Such packages are also an issue for efforts like Dirk Eddelbuettel's cran2deb.

There are, however, a number of circumstances where non-GPL-equivalent packages may be important to users. This can mean that users need to install both an R package and one or more dependencies that must be separately obtained and licensed. One such situation is where a new program is still under development and the license is not yet clear, as in the recent work we pursued with respect to Mike Powell's BOBYQA. We wanted to verify whether it was useful before we considered distribution, and Powell had been offering copies of his code on request. Thus we could experiment, but not redistribute. Recently Powell's approval to redistribute has been obtained.

We believe that it is important that non-redistributable codes be excluded from CRAN, but that they could be available on a repository such as r-forge. However, we would like to see a clearer indication of license status on r-forge. One possibility is the inclusion of a statement and/or icon indicating such status, e.g., green for GPL or equivalent, amber for uncertain, red for restricted. Another may be a division of directories, so that GPL-equivalent packages are kept separate from those with uncertain or restricted licenses.

We welcome comments and suggestions on both the concept and the technicalities.

John Nash & Ravi Varadhan
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] Non-GPL packages for R
The responses to my posting yesterday seem to indicate more consensus than I expected:

1) CRAN should be restricted to GPL-equivalent licensed packages
2) r-forge could be left "buyer beware", using DESCRIPTION information
3) We may want a specific repository for restricted packages (RANC?)

How to proceed? A short search on Rseek did not turn up a chain of command for CRAN. I'm prepared to help out with documentation etc. to move changes forward. They are not, in my opinion, likely to cause a lot of trouble for most users, and should simplify things over time.

JN
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] Why license filtering is important (was non-GPL ...)
Using the acronym "GPL" in comments on the management of package repositories led the discussion well away from the issues I wanted to shed light upon, so I have changed the subject tag. Examples of my concerns: 1) I use a package with a non-free component and learn how to work with it efficiently. I'm a retired academic, but still try to recover some costs of the work I do for R and other open projects by doing some contract and training work. A client has a problem and wants a solution. If there is a "non-commercial" restriction, I'm put in the awkward position of having to get a permission -- possibly from a defunct organization or one that has long forgotten how to grant such a permission. 2) There may be new tools available that ARE free software. Dirk's blacklist (perhaps we could have a less loaded name?) may suggest opportunities for gradually moving to packages that can be used without need to check license details. I have used such tasks in the past as student projects where they are relatively straightforward. 3) Many workers are not aware of the consequences of license status of their codes (I was not for some years). The development of CRAN and similar repositories has been and can be a positive force allowing for improvement and understanding of methods. 4) We definitely should retain the ability to access non-free codes -- somehow folk have misread my comments otherwise. I'll use them for learning and when there is no alternative, but if at all possible, I'll choose the free ones for production work so I don't get caught out as above. A comment: In looking at SparseM, I first used the pdf -- it simply says GPL(>=2) as the license. (I'm sure I've got similar bugs in my own docs.) I dug deeper and found the LICENSE file, and also looked at cholesky.f, which is unfortunately one of the bigger files. I was hoping it was somewhat related to work I've done over the years on Cholesky decompositions in the hope I could offer a substitute as per concern (2) above, but it is not of the same nature as the algorithms I worked on as far as I can determine. Maybe someone else is able to step in on that. And a big tip of the hat to Dirk E. and Charles B. for the cran2deb work -- I already admired the work. Now I've started looking at some of the package files for license info, I'm amazed they've got as far as they have. JN __ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] Buglet in optim() SANN
I think the SANN method in optim() is failing to report that it has not converged. Here is an example:

genrose.f <- function(x, gs = NULL) { # objective function
   ## One generalization of the Rosenbrock banana valley function (n parameters)
   n <- length(x)
   if (is.null(gs)) { gs = 100.0 }
   fval <- 1.0 + sum(gs * (x[1:(n-1)]^2 - x[2:n])^2 + (x[2:n] - 1)^2)
   return(fval)
}

xx <- rep(pi, 10)
test <- optim(xx, genrose.f, method = "SANN", control = list(maxit = 1000, trace = 1))
print(test)

Output is:

> source("tsann.R")
sann objective function values
initial value 40781.805639
iter 999 value 29.969529
final value 29.969529
sann stopped after 999 iterations
$par
 [1] 1.0135254 0.9886862 1.1348609 1.0798927 1.0327997 1.1087146 1.1642130
 [8] 1.3038754 1.8628391 3.7569285

$value
[1] 29.96953

$counts
function gradient
    1000       NA

$convergence
[1] 0      <-- THIS SHOULD BE 1 ACCORDING TO THE DOCS

$message
NULL

Not terribly important, but maybe fixable.

Cheers,

John Nash
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] Advice on how to arrange fix of buglet
Recently I reported a small bug in optim's SANN method: it fails to report, in the convergence code, that it had exceeded the maximum function evaluation limit. This is a small enough matter that I was reluctant to create a full-blown bug report. Indeed, in the optimx package that Ravi Varadhan and I have been developing on r-forge (under the OptimizeR project), it was a minimal workaround to fix the matter in our wrapper, which incorporates optim() and a number of other tools. While I don't normally do C code, I could likely figure out a fix for optim too.

My query is about how best to get this done without causing a lot of work for others, i.e., where do I send patches etc. I expect there are a number of similar issues for different areas of R and its documentation, and a clarification from someone in the core team could streamline things. Maybe the bug system is still the right place?

Cheers, JN
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] Buglet in optim() SANN
Indeed Brian is correct about the functioning of SANN and the R documentation. I'd misread the "maxit" warning. Things can stay as they are for now. The rest of this msg is for information and an invitation to off-list discussion.

I realize my posting opens up the can of worms about what "convergence" means. As someone who has occasionally published discussions on convergence versus termination, I'd certainly prefer to set the 'convergence' flag to 1 for SANN, since we have only a termination at the maximum number of function evaluations and not necessarily a result that can be presumed to be "optim"al. Or perhaps put a note in the description of the 'convergence' flag to indicate the potential misinterpretation with SANN, where results need the user to check externally whether they are likely to be usable as an optimum.

It may be better to call the non-zero results for "convergence" a "termination indicator" rather than an "error code". Some related packages like ucminf give more than one non-zero indicator for results that are generally usable as optima. They are informational rather than errors. Writing our optimx wrapper for a number of methods has forced us to think about how such information is returned and reported through a flag like "convergence". There are several choices and plenty of room for confusion.

Right now a few of us are working on improvements for optimization, but the first goal is to get things working OK for smooth, precisely defined functions. However, we have been looking at methods like SANN for multimodal and noisy (i.e., imprecisely defined) functions. For those problems, knowing when you have an acceptable or usable result is never easy.

Comments and exchanges welcome -- off-list of course.

Cheers, JN

Prof Brian Ripley wrote:
> As the posting guide says, please read the help carefully before
> posting. It does say:
>
>     ‘maxit’ The maximum number of iterations. Defaults to ‘100’ for
>         the derivative-based methods, and ‘500’ for ‘"Nelder-Mead"’.
>         For ‘"SANN"’ ‘maxit’ gives the total number of function
>         evaluations. There is no other stopping criterion.
>                               ^^^^^^^^
>         Defaults to ‘10000’.
>
> so this is indicating 'successful convergence' as documented.
>
> On Tue, 20 Oct 2009, Prof. John C Nash wrote:
>
>> I think SANN method in optim() is failing to report that it has not
>> converged. Here is an example
>>
>> genrose.f <- function(x, gs = NULL) { # objective function
>>   ## One generalization of the Rosenbrock banana valley function (n parameters)
>>   n <- length(x)
>>   if (is.null(gs)) { gs = 100.0 }
>>   fval <- 1.0 + sum(gs * (x[1:(n-1)]^2 - x[2:n])^2 + (x[2:n] - 1)^2)
>>   return(fval)
>> }
>>
>> xx <- rep(pi, 10)
>> test <- optim(xx, genrose.f, method = "SANN", control = list(maxit = 1000, trace = 1))
>> print(test)
>>
>> Output is:
>>
>>> source("tsann.R")
>> sann objective function values
>> initial value 40781.805639
>> iter 999 value 29.969529
>> final value 29.969529
>> sann stopped after 999 iterations
>> $par
>>  [1] 1.0135254 0.9886862 1.1348609 1.0798927 1.0327997 1.1087146 1.1642130
>>  [8] 1.3038754 1.8628391 3.7569285
>>
>> $value
>> [1] 29.96953
>>
>> $counts
>> function gradient
>>     1000       NA
>>
>> $convergence
>> [1] 0      <-- THIS SHOULD BE 1 ACCORDING TO THE DOCS
>
> It _should_ be 0 according to the help page.
>
>> $message
>> NULL
>>
>> Not terribly important, but maybe fixable.
>>
>> Cheers,
>>
>> John Nash
>>
>> __
>> R-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-devel
>>
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
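One simple external check of a SANN result, of the kind discussed above, is to polish it with a local method. This is only a sketch, reusing genrose.f and the test object from the SANN example earlier in the thread:

## 'test' and genrose.f come from the earlier SANN bug report
polish <- optim(test$par, genrose.f, method = "BFGS")
c(sann = test$value, polished = polish$value)   # a large drop suggests SANN had not converged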
[Rd] [R] R CMD check when package directory is symlinked
I've done a couple of searches and not found mention of the issue below, though some older scripts mention getting absolute paths for R CMD check. If the issue is "new" or unfamiliar I'll be happy to follow up and document it, but suspect it is in some sense already known and I've missed the right search strategy. The workaround is pretty simple -- copy the files to a new absolute directory. However, if I can save some grief for others, I will.

I run Ubuntu Jaunty 9.04, but until recently was running Hardy 8.04. I symlinked my R-optimtest directory from /home/john/ to my "old" home directory's version. When I changed a package and tried R CMD check I got the following:

* checking whether the package can be loaded ... ERROR
Error in dyn.load(file, DLLpath = DLLpath, ...) :
  unable to load shared library
  '/media/lnx804/home/john/R-optimtest/work/minqa.Rcheck/minqa/libs/minqa.so':
  /media/lnx804/home/john/R-optimtest/work/minqa.Rcheck/minqa/libs/minqa.so:
  failed to map segment from shared object: Operation not permitted
Error : .onLoad failed in 'loadNamespace' for 'minqa'
Error: package/namespace load failed for 'minqa'
Execution halted

I copied the entire minqa tree to a new "localR" directory and things work fine. So it looks like R is unhappy with the expanded directory name, i.e., from /home/john/R-optimtest/work/... to /media/lnx804/home/john/R-optimtest/work/...

Cheers, JN
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] Idea for install.packages on (certain) linux distributions
Some time ago, I had some email discussion with Dirk E. about putting a front end on install.packages to look first at the Debian repositories for R and use them before trying to install from source. The code for this would not be very large. As Uwe points out in another posting, the issue then becomes one of repository maintenance. And as more types of installers get included, the code and the chance of package mismatches get more risky. However, where we have repositories, it may be useful to have example code to try such an approach. In pseudo-code this could be implemented, without damaging install.packages, as:

# [start of my.install.packages]
if (exists("local.install")) {
   local.install(package)
} else {
   install.packages(package)
}

If anyone gets enthused about this, I'd suggest posting on R-wiki. Note that local.install will have to be pretty aggressive at checking the OS version etc.

JN

Date: Mon, 01 Mar 2010 16:33:08 +0100
From: "Carlos J. Gil Bellosta"
To: r-devel@r-project.org
Subject: [Rd] Idea for install.packages on (certain) linux distributions
Message-ID: <4b8bde34.6070...@datanalytics.com>
Content-Type: text/plain; charset=ISO-8859-1; format=flowed

Hello,

I do not know whether this idea would be considered useful or not. Or easy to implement. But it is the case that on certain Linux distributions there are OS packages with precompiled R packages. However, install.packages (and related functions) download the source ones.

Would it be possible to add an extra "repository" (or option) to install.packages that would direct R to use the OS-level package manager (apt-get, yum or the like) so as to install the precompiled packages from the distribution mirrors instead of the CRAN ones?

Best regards,

Carlos J. Gil Bellosta
http://www.datanalytics.com
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
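A rough sketch of what a local.install() might look like on a Debian-style system (the r-cran-<name> package naming is the Debian convention; the function itself, its use of apt-get, and the fallback to CRAN are my own illustration, not existing code):

local.install <- function(pkg) {
  deb <- paste("r-cran-", tolower(pkg), sep = "")   # Debian naming convention for CRAN packages
  # simulate the install first to see whether an OS package exists
  found <- system(paste("apt-get install -s", deb)) == 0
  if (found) {
    system(paste("sudo apt-get install -y", deb))
  } else {
    install.packages(pkg)   # fall back to a source install from CRAN
  }
}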
[Rd] R CMD check issue with soft-linked directory
I've been having some strange problems with R CMD check in the last couple of days, but now believe I have localized the issue. I had been running Ubuntu Hardy on one drive and then upgraded to Jaunty, but put Jaunty on a different drive. I continue to be able to boot Hardy when I wish. I soft-linked my R working area, i.e., /home/john/Rstuff -> /media/lnx804/home/john/Rstuff

I can still build packages fine, but get

j...@nsrv-jaunty:~/jtest$ R CMD check minqa
* checking for working pdflatex ... OK
* using log directory '/media/store2/jn/test/minqa.Rcheck'
* using R version 2.10.1 (2009-12-14)
* using session charset: UTF-8
* checking for file 'minqa/DESCRIPTION' ... OK
* checking extension type ... Package
* this is package 'minqa' version '1.02'
* checking package name space information ... OK
* checking package dependencies ... OK
* checking if this is a source package ... OK
* checking for executable files ... OK
* checking whether package 'minqa' can be installed ... OK
* checking package directory ... OK
* checking for portable file names ... OK
* checking for sufficient/correct file permissions ... OK
* checking DESCRIPTION meta-information ... OK
* checking top-level files ... OK
* checking index information ... OK
* checking package subdirectories ... OK
* checking R files for non-ASCII characters ... OK
* checking R files for syntax errors ... OK
* checking whether the package can be loaded ... ERROR
Error in dyn.load(file, DLLpath = DLLpath, ...) :
  unable to load shared library
  '/media/store2/jn/test/minqa.Rcheck/minqa/libs/minqa.so':
  /media/store2/jn/test/minqa.Rcheck/minqa/libs/minqa.so:
  failed to map segment from shared object: Operation not permitted
Error : .onLoad failed in 'loadNamespace' for 'minqa'
Error: package/namespace load failed for 'minqa'
Execution halted

It looks like this package has a loading problem: see the messages for details.

The above test was run with a newly created jtest/ directory on yet another drive (partition) to see if there was possibly corruption. However, the issue seems to be one of using a softlink.

I would not be upset if R CMD check simply told me that this isn't the right way to do things, i.e., don't use the softlinked directory. Does anyone know if this is a recognized issue, and is it possible to put some sort of reasonable error message into R CMD check?

JN
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] R CMD check issue with soft-linked directory
Good catch Simon. Changed fstab to force exec on mount of the drive in question and things worked.

Thanks, JN

Simon Urbanek wrote:
> On Mar 10, 2010, at 12:43 , Prof. John C Nash wrote:
>
>> I've been having some strange problems with R CMD check in the last couple
>> of days, but now believe I have localized the issue. [...]
>>
>> The above test was run with a newly created jtest/ directory on yet another
>> drive (partition) to see if there was possibly corruption. However, the
>> issue seems to be one of using a softlink.
>
> I would be very surprised if this was softlink's fault per se. It looks more
> like a permission issue in your system to me -- the first thing I would check
> is that your /media/... is not mounted with noexec...
>
> Cheers,
> Simon
>
>> I would not be upset if R CMD check simply told me that this isn't the right
>> way to do things i.e., don't use the softlinked directory.
>>
>> Does anyone know if this is a recognized issue and is it possible to put
>> some sort of reasonable error message into R CMD check?
>>
>> JN

__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] Location of source code for readline()
A few days ago on R-help I asked about a cross-platform timeout version of readline(). Some suggestions, but only partial joy so far. I can get the GNU bash 'read -t ...' to work in Windows by using the 'bash -c ' construct, but then R's system() function does not seem to allow this to pass through. The same happened with a Perl routine and a Free Pascal routine that I tried, the latter being a single executable that did the prompt and the timeout. (I can send code offline if anyone is interested -- not fully protected against bad inputs, however.)

Now I'm wondering where the code for readline is located in the R source. I've tracked it as far as 'do_readln' in names.c, but now want to find the actual code to see if I can patch it, though I am a real novice in C. Suggestions welcome.

My application, FYI, is to have a script that will display something and wait for a keypress (readline seems to need the Enter key), but time out after a preset number of seconds. setTimeLimit "almost" works -- but not for readline. I'm thinking of a modified readline like readline(prompt='Do you want to continue?', timeout=6). Note that the issue seems to be Windows. I haven't a Mac to try, but Linux can be made to function by the various methods mentioned at the top. Sigh.

JN
__ R-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-devel
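For reference, the setTimeLimit attempt described above might look like the sketch below (the helper name is hypothetical; as noted in the post, the limit is not checked while readline() is blocked waiting for input, which is why it only "almost" works):

ask_with_timeout <- function(prompt, seconds) {        # hypothetical helper
  setTimeLimit(elapsed = seconds, transient = TRUE)    # with transient = TRUE the limit applies only to the current computation
  on.exit(setTimeLimit(elapsed = Inf), add = TRUE)     # clear the limit afterwards
  tryCatch(readline(prompt), error = function(e) NA_character_)
}
ask_with_timeout("Do you want to continue? ", 6)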