Re: [Rd] Google Summer of Code 2009
On 2/19/09, Dirk Eddelbuettel wrote:

[...]

> On 19 February 2009 at 09:33, Simon Urbanek wrote:
> | If primitive 3d scatterplot interactivity is all you want, go with
> | rggobi. It's GTK and has all this already and much more. However,
> | ggobi also shows why GTK is not a good choice for a general interactive
> | graphics toolkit - it [GTK] is slow and lacks reasonable graphics
> | support. OpenGL is IMHO a better way to go since IG don't really
> | leverage any of the widgets (you get them for free via R widgets
> | packages anyway) and OpenGL gives you excellent speed, alpha-support
> | and anti-aliasing etc.
>
> I don't want to turn this into an all-out 'vi versus emacs' slugfest but:
>
> -- GTK is not the only choice, and I have been very happy with Qt (and Qwt
>    for a simple yet nice plot widget) on both Linux and Windows; I don't
>    have access to a Mac so I didn't test there.
>
> -- Qt supports OpenGL natively. The demos are very impressive (for OpenGL
>    as well as the other widgets).
>
> -- Deepayan has been working on Qt-based code to enhance R; as that appears
>    to be 'unannounced' I won't post the SVN repo, but allow me to state that
>    the code already ran all (or almost all) examples from the lattice book.

Just to expand on that: yes, I have been working on a Qt-based infrastructure, and Michael Lawrence is also involved now and has been working on refining and optimizing it for more general uses. The details are still in flux, but we hope to have something to show at DSC. Which is not to say that other alternatives wouldn't be good, of course.

-Deepayan

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] Google Summer of Code 2009
On Fri, 20 Feb 2009, Friedrich Leisch wrote:

On Thu, 19 Feb 2009 10:52:19 -0600, Dirk Eddelbuettel (DE) wrote:

> [ Cool how nobody cared about Fritz' request not to post ideas yet :) ]

Well, I kind of expected that ;-) See also below.

> [ I broadly share Oleg's "wouldn't it be nice to have better plot devices"
>   wish. But I don't think it is a three-month summer target, ]

Yes, that's exactly what came to my mind first.

The principle applies to some extent to all "wouldn't it be nice if R did..." comments. If something would obviously be a widely appreciated addition to R (such as good interactive graphics), there is probably some good reason that it is hard. It's relatively unlikely that no-one had thought of it or had realized it would be worth having. For ideas like that we are likely to need some way to make the implementation easier (money, code, new approaches to the programming, ...).

-thomas

Thomas Lumley             Assoc. Professor, Biostatistics
tlum...@u.washington.edu  University of Washington, Seattle
Re: [Rd] plot.lm: "Cook's distance" label can overplot point labels
On Thu, 19 Feb 2009, John Fox wrote:

> Dear John and Brian,
>
> My point about colour-blindness was partly tongue-in-cheek, but I think
> that it's a bad choice to have the second and third colours in the
> default palette as red and green.

Looking at the standard palette with dichromat::dichromat(), it seems that it depends on which flavour of red-green anomaly you have. For deuteranopia the red and green are quite close. For protanopia they are pretty distinct, and the confusion is between colours 3 and 7 (yellow vs green) and between 4 and 6 (blue and magenta).

I agree that the standard palette isn't ideal, though.

-thomas

Thomas Lumley             Assoc. Professor, Biostatistics
tlum...@u.washington.edu  University of Washington, Seattle
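[The check Thomas describes can be reproduced in a couple of lines; this sketch assumes the CRAN package dichromat is installed, and the commented palette values are those of the R 2.x default palette:]

```r
# Simulate how the default palette appears under the two common
# red-green deficiencies; requires the CRAN package 'dichromat'.
library(dichromat)
pal <- palette()   # in R 2.x: black, red, green3, blue, cyan, magenta, yellow, gray
dichromat(pal, type = "deutan")   # deuteranopia: colours 2 and 3 come out close
dichromat(pal, type = "protan")   # protanopia: compare colours 3 vs 7 and 4 vs 6
```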
Re: [Rd] unloadNamespace (Was: How to unload a dll loaded via library.dynam()?)
G'day all,

On Fri, 20 Feb 2009 04:01:07 +0000 (GMT) Prof Brian Ripley wrote:

> library.dynam.unload() does work if the OS is cooperative. And if
> you have your package set up correctly and unload the namespace (and
> not just detach the package if there is a namespace) then the shared
> object/DLL will be unloaded. [...]

I guess I have a similar code-install-test development cycle as Alex, and I seem to work on a cooperative OS (Kubuntu 8.04). My set-up is that I install packages on which I work into a separate library. To test changes to such packages, I start R in a directory containing a .Rprofile file which, via .libPaths(), adds that library to the library path. In this R session I then test the changes.

I also used to quit and restart R whenever I re-installed a package with a namespace to test the changes made. Somehow I got the impression that this was the way to proceed when namespaces were introduced, and I did not realise until recently that better ways (unloading the namespace) exist.

However, I noticed the following behaviour under R 2.8.1 and "R version 2.9.0 Under development (unstable) (2009-02-19 r47958)", which I found surprising:

1) In the running R session, issue the command "unloadNamespace(XXX)".
2) Make changes to the code of the package, e.g. add a print("hello world") statement to one of the R functions.
3) Install the new package.
4) In the running R session, issue the command "library(XXX)" and call the R function that was changed.

Result: "hello world" is not printed; somehow the old R function is still used. If I issue the commands "unloadNamespace(XXX)" and "library(XXX)" once more, then a call to the changed R function will print "hello world", i.e. the new code is used. If the sequence is changed to 2), 3) and then 1), then 4) behaves as expected and the new R code is used immediately.

As far as I can tell, if the shared object is unloaded via library.dynam.unload() from the .onUnload() hook, changes in the C code take effect no matter whether I perform the above steps in the sequence 1-2-3-4 or 2-3-1-4. My preference is the sequence 1-2-3-4, since it seems the "more logical and cleaner" one, and I have vague memories of managing to crash R in the past after using 2-3 and then trying to quit R.

I am wondering why the order of these steps makes a difference for R code but not for compiled code. Well, I guess I understand why the order does not matter for compiled code, but I do not understand why it matters for R code. I could not find anything in the documentation that would explain this behaviour, or indicate that it is intended. Enlightening comments and/or pointers to where this behaviour is documented would be welcome.

Cheers,

        Berwin

=========================== Full address =============================
Berwin A Turlach                            Tel.: +65 6516 4416 (secr)
Dept of Statistics and Applied Probability        +65 6516 6650 (self)
Faculty of Science                          FAX : +65 6872 3919
National University of Singapore
6 Science Drive 2, Blk S16, Level 7         e-mail: sta...@nus.edu.sg
Singapore 117546                            http://www.stat.nus.edu.sg/~statba
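[For concreteness, the 1-2-3-4 cycle above looks like this in an R session. The package name XXX is a placeholder and step 3 happens in a shell, so this is an illustrative transcript rather than a self-contained script:]

```r
## Step 1: in the running session, unload the package's namespace.
unloadNamespace("XXX")

## Step 2: edit the package source, e.g. add print("hello world") to one
## R function.
## Step 3: from a shell, reinstall into the private library, e.g.
##   R CMD INSTALL -l ~/Rlib XXX
## (~/Rlib stands for the separate library added via .libPaths()).

## Step 4: reattach and call the changed function.
library(XXX)
## With LazyLoad: yes, the old code may still run here; a second round of
##   unloadNamespace("XXX"); library(XXX)
## then picks up the new code, as described above.
```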
[Rd] X11 fails to open (PR#13543)
Full_Name: Merlise Clyde
Version: 2.8.1
OS: Mac OS X 10.4.11
Submission from: (NULL) (24.199.155.61)

I am running R under X11 on Mac OS X 10.4.11 and have been having problems with X11 graphics since upgrading to 2.8.+:

> plot(1:10)
Error in X11(d$display, d$width, d$height, d$pointsize, d$gamma, d$colortype, :
  unable to start device X11cairo
In addition: Warning messages:
1: In function (display = "", width, height, pointsize, gamma, bg, :
  X11 protocol error: BadValue (integer parameter out of range for operation)
2: In function (display = "", width, height, pointsize, gamma, bg, :
  cairo error 'out of memory'

I run R under emacs, so the display is being set correctly, and the device would open under previous versions. I am also encountering a related problem when running R 2.8.0/2.8.1 on a remote unix box and displaying back to the Mac, where any attempt to open an X11() device causes R to hang.

Thanks,
Merlise Clyde

--please do not edit the information below--

Version:
 platform = i386-apple-darwin8.11.1
 arch = i386
 os = darwin8.11.1
 system = i386, darwin8.11.1
 status =
 major = 2
 minor = 8.1
 year = 2008
 month = 12
 day = 22
 svn rev = 47281
 language = R
 version.string = R version 2.8.1 (2008-12-22)

Locale: C

Search Path: .GlobalEnv, package:stats, package:graphics, package:grDevices, package:utils, package:datasets, package:methods, Autoloads, package:base
[Rd] "source" fails to handle non-ascii variable names (PR#13541)
If there is a variable name in the source file which contains non-ASCII characters, "source" gives an "unexpected $end" error after the first such character (even if the proper file encoding is provided). This also happens with parse() when "file" is a textConnection, but not if the same code is provided via the "text" argument. The problem seems to occur only on Windows.

Examples:

> con <- textConnection("ąęćź <- 12345")
> parse(con)
Error in parse(con) : unexpected $end at
1: ą

# backtick-quoted names are parsed correctly:
> con <- textConnection("`ąęćź` <- 12345")
> parse(con)
expression(ąęćź <- 12345)

# also, parsing as text works:
> con <- textConnection("ąęćź <- 12345")
> parse(text = readLines(con))
expression(ąęćź <- 12345)
attr(,"srcfile")

Version:
 platform = i386-pc-mingw32
 arch = i386
 os = mingw32
 system = i386, mingw32
 status =
 major = 2
 minor = 8.1
 year = 2008
 month = 12
 day = 22
 svn rev = 47281
 language = R
 version.string = R version 2.8.1 (2008-12-22)

Windows XP (build 2600) Service Pack 3

Locale:
LC_COLLATE=Polish_Poland.1250;LC_CTYPE=Polish_Poland.1250;LC_MONETARY=Polish_Poland.1250;LC_NUMERIC=C;LC_TIME=Polish_Poland.1250
[Rd] Faster Blas Library
Hi everyone,

I have made a faster BLAS library thanks to the Nvidia CUBLAS library. I was wondering how I could upload this new Rblas.dll. I've included a PowerPoint presentation I made on the project. Highlights include up to a 2000% improvement in matrix-multiplication timings. Unfortunately, the link included in the presentation is accessible by CSIRO employees only; I will gladly include the source code as well. However, for the moment could someone simply tell me where and how to upload these files?

Thanks,
Sachin

http://www.nabble.com/file/p22114095/R3.ppt
Re: [Rd] Faster Blas Library
I presume this is on Windows (you did not actually say). The section http://cran.r-project.org/bin/windows/contrib/ is managed by Uwe Ligges, and you could send him the Rblas.dll, the sources, a description file and a license (the last being rather important if this is to be hosted on CRAN).

On Thu, 19 Feb 2009, sachin1234 wrote:

> I have made a faster BLAS library thanks to the Nvidia CUBLAS library. I
> was wondering how I could upload this new Rblas.dll. [...]

--
Brian D. Ripley,                  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595
Re: [Rd] Faster Blas Library
Prof Brian Ripley wrote:

> I presume this is on Windows (you did not actually say). The section
> http://cran.r-project.org/bin/windows/contrib/ is managed by Uwe Ligges,
> and you could send him the Rblas.dll, the sources, a description file and
> a license (the last being rather important if this is to be hosted on
> CRAN).

Yes, please do so. We are really interested.

Note: I doubt that double-precision floating-point ops are faster than single-precision ones (as your slides suggest). Moreover, I'd like to see speed comparisons as a function of matrix size on some logarithmic scale, in order to assess the quite relevant gains for small-sized vectors.

Best wishes,
Uwe
Re: [Rd] unloadNamespace (Was: How to unload a dll loaded via library.dynam()?)
This was rather a large shift of subject, so I've pruned the recipients list.

Is lazy loading involved? If so, I have an idea that may or may not be relevant. We do cache in memory the lazy-loading database, for speed on slow (network-mounted or USB drive) file systems. Now, the cache is flushed at least if you do detach(foo, unload = TRUE), but I can envisage a set of circumstances in which it might not be. So perhaps try detach(foo, unload = TRUE), or don't use lazy loading when developing the package?

On Fri, 20 Feb 2009, Berwin A Turlach wrote:

> I guess I have a similar code-install-test development cycle as Alex; and
> I seem to work on a cooperative OS (Kubuntu 8.04). [...] I am wondering
> why it makes a difference for R code in which order these steps are done,
> but not for compiled code. [...]

--
Brian D. Ripley,                  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595
Re: [Rd] cpu bound cases
On Thu, 19 Feb 2009, Rubin, Norman wrote:

> I'm considering some things AMD could do to accelerate R using GPU
> processors. In an internal discussion I was asked "Are there interesting
> R computations which are currently cpu bound?" I'm sure there are lots,
> but I'd like to be able to name some real-world cases.

It depends a bit on what you mean by cpu-bound. Some of the arithmetic and mathematical functions are fairly clearly cpu-bound, since Luke Tierney's multithreaded math library speeds them up. The matrix operations in regression can easily push the CPU usage to 100%, but the success of ATLAS suggests that they may really be limited more by memory bandwidth; I don't know if this counts.

Other people may have different suggestions.

-thomas

Thomas Lumley             Assoc. Professor, Biostatistics
tlum...@u.washington.edu  University of Washington, Seattle
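[One rough first-pass test of "cpu bound" in an R session, using only base R: if user+system time roughly equals elapsed time, the CPU was busy for the whole computation rather than waiting on I/O. The matrix size here is arbitrary:]

```r
# A rough check of whether a computation is cpu-bound: compare
# user+system time with elapsed (wall-clock) time; near-equality
# suggests the CPU, not I/O, is the bottleneck.
n <- 500
a <- matrix(rnorm(n * n), n)
tm <- system.time(a %*% a)
unname(tm["user.self"] + tm["sys.self"])  # close to tm["elapsed"] when cpu-bound
```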
Re: [Rd] plot.lm: "Cook's distance" label can overplot point labels
Dear Thomas,

Though far from an expert on the matter, it's my understanding that red-green confusion is the most common form of colour-blindness. I guess that the best way to put it is that it would be desirable to choose colours for the standard palette that minimize the probability of perceptual problems.

Regards,
John

> -----Original Message-----
> From: Thomas Lumley [mailto:tlum...@u.washington.edu]
> Sent: February-20-09 4:32 AM
> To: John Fox
> Cc: 'John Maindonald'; 'Prof Brian Ripley'; r-devel@r-project.org; 'Martin Maechler'
> Subject: Re: [Rd] plot.lm: "Cook's distance" label can overplot point labels
>
> Looking at the standard palette with dichromat::dichromat(), it seems that
> it depends on which flavour of red-green anomaly you have. [...]
[Rd] [SoC09-Info] Idea submission.
Hi everybody,

as Fritz mentioned in his introductory "Google Summer of Code 2009" email, I will manage the organizational part of the R Project's application and (hopefully) participation. Google's timeline schedules March 9-13 as the period for organizations to apply as mentoring organizations. The idea is now to collect as many project ideas as possible in a brainstorming phase and submit these by March 10.

A project proposal consists of: (1) a short description, (2) a detailed description, (3) required skills and (4) the mentor's name. I propose that for each idea a (5) short programming exercise is defined, which students have to solve before they can apply. Many other projects do this to reduce "noise". It also allows an evaluation and ranking if more than one student applies to the same project. But this is up to the mentor and hence is optional to include.

I thus encourage you to send such project proposals to me with a CC to the r-devel list, with [SoC09-Idea] at the start of the subject line. I will collect the ideas on a tentative list, see http://www.r-project.org/soc09. The r-devel CC allows an active discussion (as we already saw :-)). Just to clarify: in this brainstorming phase we collect ideas, and there is no need to rank or evaluate them!

I will monitor the various Google SoC information sources and mail important information with [SoC09-Info] at the start of the subject line to this list to keep you updated.

So, kick-off for idea proposals until March 10!

Best,
Manuel.
Re: [Rd] plot.lm: "Cook's distance" label can overplot point labels
At 13:05 20/02/2009, John Fox wrote:

> Though far from an expert on the matter, it's my understanding that
> red-green confusion is the most common form of colour-blindness. I guess
> that the best way to put it is that it would be desirable to choose
> colours for the standard palette that minimize the probability of
> perceptual problems.

I wonder whether there are two separate issues here: what the best standard palette is, and whether mainstream plots should use colour to carry essential information at all. For instance, there seems little problem in biplot having red arrows and black points, because the colour is redundant. I am not an expert on colour vision either, but there certainly are people who report difficulty with interpreting the slides at scientific meetings.

Michael Dewey
http://www.aghmed.fsnet.co.uk
Re: [Rd] unloadNamespace (Was: How to unload a dll loaded via library.dynam()?)
G'day Brian,

On Fri, 20 Feb 2009 11:37:18 +0000 (GMT) Prof Brian Ripley wrote:

> This was rather a large shift of subject, [...]

Well, yes, from the clean unloading of compiled code to the clean unloading of R code. :-) Though I also confirmed that the former is possible on a cooperative OS when library.dynam.unload() is correctly used via an .onUnload() hook.

> Is lazy loading involved?

Yes, the DESCRIPTION file has the default "LazyLoad: yes" entry. If I set "LazyLoad: no", then both sequences use the new version of the R code immediately.

> If so I have an idea that may or may not be relevant. We do cache in
> memory the lazy-loading database for speed on slow (network-mounted
> or USB drive) file systems. Now the cache is flushed at least if you
> do detach(foo, unload = TRUE), but I can envisage a set of
> circumstances in which it might not be.

As far as I can tell, "detach(foo, unload=TRUE)" and "unloadNamespace(foo)" behave identically on my machines (while the DESCRIPTION file has "LazyLoad: yes"): the modified R code is only used if either command is given (followed by "library(foo)") after the new version of the package was installed.

> So perhaps try detach(foo, unload = TRUE) or not using lazy-loading
> when developing the package?

Unfortunately, the former does not work, and although the latter works I am hesitant to use it, since:

a) as I understand it, most packages that use S4 methods need lazy loading (though the particular package with which I noticed the behaviour does not have S4 methods); and

b) it seems that these days the DESCRIPTION file is the only way of switching lazy loading on and off, and there is no way of overriding that value. Knowing myself, I would forget to change the DESCRIPTION file back to "LazyLoad: yes" before building the .tar.gz file for distribution (once the package is ready).

As it is, I already have to remember to take "-Wall -pedantic" out of the Makevars file in the src/ directory; but I am reminded of that by R CMD check. Well, thinking a bit more about b), I could probably complicate my Makefile a bit more so that a "make install" first modifies the DESCRIPTION file to "LazyLoad: no" before installing the package to the local library, and a "make build" first modifies the DESCRIPTION in the opposite way. But this would still leave concern a).

Cheers,

        Berwin
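[The DESCRIPTION toggle contemplated in b) is a one-line text substitution; here is a minimal sketch of it in R (a Makefile would more likely use sed), with a throwaway file in tempdir() standing in for a real package's DESCRIPTION:]

```r
# Flip the LazyLoad field of a DESCRIPTION file. The file created here
# in tempdir() is a stand-in for a real package's DESCRIPTION.
desc <- file.path(tempdir(), "DESCRIPTION")
writeLines(c("Package: foo", "Version: 0.1", "LazyLoad: yes"), desc)

set_lazyload <- function(path, on = TRUE) {
  txt <- readLines(path)
  txt <- sub("^LazyLoad:.*$", paste("LazyLoad:", if (on) "yes" else "no"), txt)
  writeLines(txt, path)
}

set_lazyload(desc, on = FALSE)  # what a 'make install' target would do
readLines(desc)[3]              # "LazyLoad: no"
set_lazyload(desc, on = TRUE)   # what a 'make build' target would do
readLines(desc)[3]              # "LazyLoad: yes"
```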
Re: [Rd] vignette compilation times
G'day Robin,

On Thu, 19 Feb 2009 11:10:45 +0000 Robin Hankin wrote:

> I am preparing a number of vignettes that require a very long time to
> process with Sweave. The longest one takes 10 hours.

Is the sum of all chunks taking this time? Or is it mostly the code in only a few chunks? And if so, are there chunks following that depend on the results of these time-intensive chunks?

I wonder if it is feasible to construct your vignette along the following lines:

1) Have a file, say, vignette1.Rnw.in that contains:

#ifdef BUILDVERSION
you may want to try the commands
\begin{Sinput}
> command1
> command2
\end{Sinput}
but be aware that this might take a long time.
#else
Now we run the commands
<<>>=
command1
command2
@
#endif

2) Now construct a Makefile that, using a preprocessor like cpp, produces vignette1.Rnw from vignette1.Rnw.in: the first version before an "R CMD build", but otherwise (for your own testing) the second version. Using .Rbuildignore, you can ensure that vignette1.Rnw.in is not distributed.

> I love the weaver package!

Thanks for pointing this package out. I was aware of cacheSweave, but that package seems to require that each chunk has a label, which I find kind of inconvenient; weaver does not seem to have such a requirement.

Cheers,

        Berwin
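[To make the #ifdef idea above concrete: a real setup would run cpp from the Makefile, but the selection logic it performs can be sketched in a few lines of R (function name and file contents here are purely illustrative):]

```r
# Toy stand-in for the cpp preprocessing step: keep either the
# #ifdef BUILDVERSION branch or the #else branch of a .Rnw.in file.
preprocess <- function(lines, build = TRUE) {
  keep <- TRUE
  out <- character(0)
  for (ln in lines) {
    if (grepl("^#ifdef BUILDVERSION", ln)) { keep <- build;  next }
    if (grepl("^#else",  ln))              { keep <- !build; next }
    if (grepl("^#endif", ln))              { keep <- TRUE;   next }
    if (keep) out <- c(out, ln)
  }
  out
}

src <- c("#ifdef BUILDVERSION", "fast text", "#else", "slow chunk", "#endif")
preprocess(src, build = TRUE)   # "fast text"  (what goes into R CMD build)
preprocess(src, build = FALSE)  # "slow chunk" (what you test locally)
```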
Re: [Rd] vignette compilation times
G'day Gabor,

On Thu, 19 Feb 2009 17:47:53 -0500 Gabor Grothendieck wrote:

> [...]
> Unless this has changed recently, I've tried including a PDF but it
> does not appear in library(help = myPackage) nor on the CRAN site on
> http://cran.r-project.org/package=myPackage
> while Sweave'd PDFs do.

If you want a PDF file to appear in library(help=myPackage), then you can write a vignette that just includes that PDF file via \includepdf from the LaTeX package pdfpages. You will, of course, end up with two PDF files that are practically identical, so you might want to exclude the original PDF file from the built package via .Rbuildignore.

If you do so, the next problem is that since R 2.6.0 "R CMD check" tries to latex the vignette and does not just check the code in it. And in current TeX systems, latex will hang if \includepdf does not find the specified PDF file; it does not stop with an error, it hangs. So the vignette has to be written smartly enough to try to include the PDF file via \includepdf only if the file really exists; but that can easily be done. See the package lasso2 for an example.

If you follow this set-up, your PDF file will show up in library(help=myPackage) and your package will pass "R CMD check" on CRAN.

HTH.

Cheers,

        Berwin
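[A minimal sketch of the "include only if the file exists" vignette described above, using the LaTeX kernel's \IfFileExists; the file name manual.pdf is hypothetical, and lasso2 (mentioned above) shows a real-world version:]

```latex
% Wrapper vignette that embeds a prebuilt PDF only when it is present,
% so latex does not hang when the PDF was excluded via .Rbuildignore.
\documentclass{article}
\usepackage{pdfpages}
%\VignetteIndexEntry{Package manual}
\begin{document}
\IfFileExists{manual.pdf}%
  {\includepdf[pages=-]{manual.pdf}}%
  {\typeout{manual.pdf not found -- skipping}}
\end{document}
```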
Re: [Rd] vignette compilation times
G'day Fritz,

On Fri, 20 Feb 2009 12:46:49 +1100 Friedrich Leisch wrote:

[...]

> It is also unclear to me whether including a PDF without sources in a
> GPLed package isn't a violation of the GPL (I know people who very
> strongly think so). And source according to the GPL means "the
> preferred form of the work for making modifications to it." So for a
> PDF showing R output that would mean the text plus R code plus
> data ... which boils down to XXXweave anyway.

Well, GPL-2 says "This License applies to any program or other work which contains a notice placed by the copyright holder saying it may be distributed under the terms of this General Public License". I am somehow unable to locate the equivalent statement in GPL-3. Thus, under GPL-2, if the source that produces the PDF file does not contain a statement that it may be distributed under the terms of the GPL, then, in my understanding, you do not have to distribute the source.

On occasion I have wondered whether stating in the DESCRIPTION file that your package is GPL-2 extends this licence to all other files, and to those in inst/doc in particular; or whether one should better slap a GPL-2 notice (or a GNU Free Documentation License) explicitly on the documentation. Actually, the fact that the GNU Free Documentation License exists makes me wonder whether it is tenable to apply the GPL to documentation such as PDF files. But the phrase "or other work" in the above-cited part of GPL-2, and the explicit '"The Program" refers to any copyrightable work' in GPL-3, seem to indicate that it is possible. Though I guess you would still have to state *within* the (source of the) vignette that it is under the GPL. But then, IANAL.

Cheers,

        Berwin
Re: [Rd] interactive graphics for R: was Google Summer of Code 2009
On Thu, Feb 19, 2009 at 9:27 AM, Sklyar, Oleg (London) < oskl...@maninvestments.com> wrote: > Dear Simon, > > thanks for comments. > > I better give a bit of a background first. We are analysing time series of > financial data, often multivariate and with say 200K samples. It is quite a > frequent situation that one needs to display multivariate time series of say > 200K rows and 10 columns over the whole time range and be able to zoom in to > look for effects of interest. The obvious choice of plots is a multiplot > window with a shared x-axis, in this case time, zooming should be done > simultaneously in all time series displayed. > > I do understand this is a very specific example, but I am sure similar > problems arise in other discilines: think of a genomic browser, sequencing > or any other non-financial time series data etc. > > Essentially, no matter what the graphying or rendering technology used > beneath (GTK, QT or anything else), my requirements, and yes they are in a > way subjective, but on the other hand quite generic, would be a possibliity > to produce multiplot windows (similar to say setting mfrow in par) with two > simple features: zooming and panning simultaneously on all plots or > independently. The support for Axis/pretty method callbacks is required > because those are the methods that provide correct axis labeling > independently on the class of the data. This is essentially the only thing > that is not supported by the gtkdatabox widget as the rulers can only > display numbers. > > On the other issues of interactivity, I agree it is quite a broad term, but > the functionality I describe above is pretty much basic. > > As for Java objections: this is not because Java is slow on its own, but > the interface is not native, requires a huge JVM for a fairly simple task > and the interface is relatively slow and cumbersome. As soon as I see a > package demonstrating good performance via rJava, I will be happy to say I > was wrong. 
> But essentially the same problem applies to the 'playwith' package mentioned earlier: it uses RGtk, gWidgets and therefore it is slow. It is not that GTK is slow, but the complex binding from R via RGtk to GTK. If used natively, it is very fast.

I am not sure what you mean by complex, but RGtk2 is a pretty thin layer over GTK+. The inefficiency is in the design of the GTK+ API, which RGtk2 just wraps verbatim. The same inefficiency exists at the C level, but it's less pronounced. Gdk is simply not designed for high-performance graphics. It has been alright with GGobi, but on Windows Gdk is very slow and GGobi is almost unusable even on relatively modest datasets (microarray scale). As Simon said, OpenGL is the way to go for pushing pixels to the screen. It gives you all the low-level control, hardware acceleration, and little overhead. More at the DSC.

> > As for iPlots, development shifted a while ago from the 'old' iPlots to the new ones, which are in the development stage (as I said, they are announced for the useR! conference). My point was not about telling you to use a specific software; it was rather about making you aware of the fact that what you describe already exists (ggobi definitely is IG in GTK) and/or is being worked on (iPlots 3.0), with a possibly better approach.
>
> Where can I find it to have a look? No matter that it is in development, if it fits the needs, I will only be happy to contribute what I can.
>
> > > 3) I have a prototype using gtkdatabox for very fast interactive plots in R using GTK, but it is limited by the capabilities of the gtkdatabox widget, not that of R or GTK as such.
> >
> > I don't know about your prototype, so I cannot really comment on that, but gtkdatabox is not IG, either.
> I cannot send you an example of an R package using gtkdatabox from the office, but I will create a small demo pack at home and send it to you separately to indicate what I am looking into. Possibly it is not IG, but this is essentially what I described above, although quite primitive (but it was a one-day project for me, not three months).
>
> > > I do think there is a need for an interactive graphics package for R.
> >
> > I do completely agree with that, but interactive means it satisfies basic requirements on IG, such as the availability of selection, highlighting, queries, interactive change of parameters, etc. This is not about 2d/3d clouds at all; those we have had for decades already. Also, this is not about "hacks" to glue interactivity onto existing graphics systems with chewing gum. We need a versatile (possibly extensible) set of interactive statistical plots; at least that's what our experience shows.
>
> Agree completely.
>
> > Cheers,
> > Simon
>
> >> -----Original Message-----
> >> From: Simon Urbanek [mailto:simon.urba...@r-project.org]
> >> Sent: 19 February 2009 14:34
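The linked zoom/pan behaviour described in this thread can be sketched in a few lines of base R. This is only an illustrative sketch with hypothetical names (`plot_linked` is not part of any package mentioned here): each "zoom" simply redraws every panel with a common `xlim`, which is the coordination the poster asks for, minus the event handling a real interactive device would add.

```r
# Sketch (hypothetical helper) of a shared-x-axis multipanel display
# where zooming redraws all panels with one common xlim, in the
# spirit of par(mfrow = ...) plus coordinated axes.
plot_linked <- function(mat, t = seq_len(nrow(mat)), xlim = range(t)) {
  op <- par(mfrow = c(ncol(mat), 1), mar = c(2, 4, 1, 1))
  on.exit(par(op))
  for (j in seq_len(ncol(mat))) {
    keep <- t >= xlim[1] & t <= xlim[2]
    plot(t[keep], mat[keep, j], type = "l",
         xlim = xlim, ylab = colnames(mat)[j])
  }
  invisible(xlim)   # return the window actually shown
}

# Usage: draw the full range, then "zoom" all panels at once.
set.seed(1)
m <- matrix(cumsum(rnorm(3000)), ncol = 3,
            dimnames = list(NULL, c("A", "B", "C")))
pdf(tempfile(fileext = ".pdf"))      # any device; interactive in practice
plot_linked(m)                       # full range
plot_linked(m, xlim = c(200, 400))   # simultaneous zoom on all panels
dev.off()
```

In a real interactive toolkit the `xlim` would come from a mouse drag rather than a function argument; the point is only that panel coordination reduces to sharing one x-window.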
Re: [Rd] vignette compilation times
Thanks for the inventive workaround.

On Fri, Feb 20, 2009 at 1:37 PM, Berwin A Turlach wrote:

> G'day Gabor,
>
> On Thu, 19 Feb 2009 17:47:53 -0500, Gabor Grothendieck wrote:
>
>> [...]
>> Unless this has changed recently, I've tried including a PDF but it does not appear in library(help = myPackage) nor on the CRAN site on http://cran.r-project.org/package=myPackage while Sweave'd PDFs do.
>
> If you want a PDF file to appear in library(help = myPackage), then you can write a vignette that just includes that PDF file via \includepdf from the LaTeX package pdfpages.
>
> You will, of course, end up with two PDF files that are practically identical, so you might want to exclude the original PDF file from the built package via .Rbuildignore.
>
> If you do so, the next problem is that since R 2.6.0 "R CMD check" tries to latex the vignette rather than just checking the code in the vignette. And in current TeX systems, latex will hang if \includepdf does not find the specified PDF file; latex does not stop with an error, it hangs.
>
> So the vignette has to be written smartly enough to include the PDF file via \includepdf only if the file really exists, but that can easily be done. See the package lasso2 for an example.
>
> If you follow this setup, your PDF file will show up in library(help = myPackage) and your package will pass "R CMD check" on CRAN.
>
> HTH.
>
> Cheers,
>
> Berwin
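The conditional-inclusion trick Berwin describes could look roughly like the following wrapper vignette. This is a hedged sketch, not the actual lasso2 code; the file name manual.pdf is illustrative. \IfFileExists is a standard LaTeX kernel command, so latex takes the fallback branch instead of hanging when the PDF was excluded via .Rbuildignore.

```latex
% Wrapper vignette that includes a pre-built PDF only if present,
% so "R CMD check" does not hang when the file is not shipped.
\documentclass{article}
%\VignetteIndexEntry{My pre-built manual}
\usepackage{pdfpages}
\begin{document}
\IfFileExists{manual.pdf}{%
  \includepdf[pages=-]{manual.pdf}%
}{%
  The file manual.pdf is not included in this source package.%
}
\end{document}
```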
[Rd] R-devel/Linux x64/Sun Studio 12: Problem with Matrix
Dear Developers,

motivated by the new Sun Studio checks, I compiled R-devel and several of our packages with Sun Studio 12 on Fedora x64. Everything worked fine and R-devel runs, with the exception of package Matrix, where compilation crashes with the following message. The error occurs during building of the recommended packages and also if Matrix is compiled separately:

[...]
CC -G -lCstd -L/opt/sun/sunstudio12/lib/amd64 -o Matrix.so CHMfactor.o Csparse.o TMatrix_as.o Tsparse.o init.o Mutils.o chm_common.o cs.o cs_utils.o dense.o dgCMatrix.o dgTMatrix.o dgeMatrix.o dpoMatrix.o dppMatrix.o dsCMatrix.o dsyMatrix.o dspMatrix.o dtCMatrix.o dtTMatrix.o dtrMatrix.o dtpMatrix.o factorizations.o ldense.o lgCMatrix.o sparseQR.o CHOLMOD.a COLAMD.a AMD.a -L/home/user/R/R-devel/lib -lRlapack -L/home/user/R/R-devel/lib -lRblas -R/opt/sun/sunstudio12/lib/amd64:/opt/sun/sunstudio12/lib/amd64:/opt/sun/lib/rtlibs/amd64:/opt/sun/lib/rtlibs/amd64 -L/opt/sun/sunstudio12/rtlibs/amd64 -L/opt/sun/sunstudio12/prod/lib/amd64 -lfui -lfai -lfsu -lmtsk -lpthread -lm /opt/sun/sunstudio12/prod/lib/amd64/libc_supp.a
/lib64/libpthread.so.0: file not recognized: File format not recognized
make: *** [Matrix.so] Error 1
ERROR: compilation failed for package ‘Matrix’
* Removing ‘/home/user/R/R-devel/library/Matrix’

Can someone help me or give me a pointer to what I'm doing wrong? How can I get/include the missing shared library?
Many thanks in advance,
Thomas Petzoldt

# file: config.site
CC=cc
CFLAGS="-xO5 -xc99 -xlibmil -nofstore"
CPICFLAGS=-Kpic
F77=f95
FFLAGS="-O5 -libmil -nofstore"
FPICFLAGS=-Kpic
CXX=CC
CXXFLAGS="-xO5 -xlibmil -nofstore"
CXXPICFLAGS=-Kpic
FC=f95
FCFLAGS=$FFLAGS
FCPICFLAGS=-Kpic
LDFLAGS=-L/opt/sun/sunstudio12/lib/amd64
SHLIB_LDFLAGS=-shared
SHLIB_CXXLDFLAGS="-G -lCstd"
SHLIB_FCLDFLAGS=-G
SAFE_FFLAGS="-O5 -libmil"

platform       x86_64-unknown-linux-gnu
arch           x86_64
os             linux-gnu
system         x86_64, linux-gnu
status         Under development (unstable)
major          2
minor          9.0
year           2009
month          02
day            20
svn rev        47964
language       R
version.string R version 2.9.0 Under development (unstable) (2009-02-20 r47964)
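One plausible (hedged) reading of "file not recognized: File format not recognized" is that the path handed to the Sun linker is not a real ELF shared object, for example a GNU ld linker script, which Sun Studio's linker cannot parse. A quick way to check, from R itself, is to look at the file's first four bytes; `is_elf` below is a hypothetical helper, not an R API.

```r
# Hypothetical diagnostic: a genuine shared object begins with the
# ELF magic (byte 0x7f followed by the characters "ELF"); a GNU ld
# linker script is plain ASCII text and would fail this check.
is_elf <- function(path) {
  b <- readBin(path, what = "raw", n = 4L)
  length(b) == 4L && b[1] == as.raw(0x7f) && rawToChar(b[2:4]) == "ELF"
}

# e.g. on the affected machine one would inspect:
#   is_elf("/lib64/libpthread.so.0")
```

If the check returns FALSE for the file named in the error, the linker is being pointed at a text stub rather than the actual library, which would explain the message.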