Re: [Rd] Including a binary Python Interpreter into a binary R-package for MS Windows

2009-09-04 Thread Uwe Ligges



Guido van Steen wrote:

Sorry: I sent this email to r-de...@r-project.com, so it got bounced.

Hi Uwe, 

Thanks a lot for this answer. 


Don't know Python on Windows so well, but why can't they
install it? You can also install R with limited user
privileges.


The last time I worked with Python on MS Windows was about 10 years ago. From that time I remember that you needed administrative privileges to install Python (1.5). For some reason that idea got stuck in my mind. I have just tried an ordinary user install of Python 2.6.2 within a virtual machine (XP) and experienced no problems at all. So you are right: the user-rights argument is invalid. 


Well, but much of that space is useless/wasted for
non-Windows users then.


This gave me the idea to create a package with a slightly different name. It would be an MS Windows-only package.


I see.



Most R users under Windows won't have Rtools installed,
just the developers will have.


I did not know that. So I guess many R users are also missing the Perl-dependent scripts. 


Right.



Can't you add some configure script / Makefile that allows
to build the binary from sources that you provide in your
package?


Well, that would make the package really bloated. I would first have to build the MinGW compiler, and then I would have to use MinGW to build Python. 


MinGW doesn't need to be in the package, because you do not want to ship 
the binaries.





Otherwise, what you could do is install the binary on 
demand from another site you are hosting. E.g. 
library("write2xls") could check whether the required binary is 
available and install it on demand if it is not. 


This is the kind of idea I am looking for! Thanks very much! Indeed, this would circumvent the need to include the Python binaries in an R package at all. 
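For illustration only, the install-on-demand idea could look roughly like the sketch below. The package name, URL, and file layout here are placeholders, not part of any real package:

```r
## Sketch of the install-on-demand approach: check for the binary at
## load time and fetch it only if missing (all names are illustrative).
.onLoad <- function(libname, pkgname) {
  bin <- file.path(libname, pkgname, "bin", "python.exe")
  if (.Platform$OS.type == "windows" && !file.exists(bin)) {
    tmp <- tempfile(fileext = ".zip")
    download.file("http://example.org/python-win32.zip",  # placeholder URL
                  tmp, mode = "wb")
    unzip(tmp, exdir = file.path(libname, pkgname, "bin"))
    unlink(tmp)
  }
}
```

The download happens into the package's own library directory, so it only needs to succeed once per installation.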


... and perhaps you can even stay with one package rather than two for 
different platforms.


Best wishes,
Uwe


I also like Gabor's idea to run Python scripts from Jython. As he explained, this makes Python available to anyone with access to Java. This might also be acceptable for those users who abhor binary downloads in general. On the other hand it might make the package slow because of longer loading times. 


If the MIT license allows shipping things the way you plan to, then it's
fine, but no binaries in source packages on CRAN. We did
quite some work to get rid of the packages that shipped them (even my
own package!) and won't make exceptions. We won't revert our
decision.


Very good! I fully support the decision! 

Best wishes, 


Guido

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel




Re: [Rd] Documentation for is.atomic and is.recursive

2009-09-04 Thread Martin Maechler
> "hw" == hadley wickham 
> on Wed, 2 Sep 2009 14:02:06 -0500 writes:

hw> On Wed, Sep 2, 2009 at 1:54 PM, Stavros
hw> Macrakis wrote:
>> On Wed, Sep 2, 2009 at 2:39 PM, Stavros
>> Macrakis wrote:
>> 
>>>  Most types of language objects are regarded as
>>> recursive: those  which are not are the atomic
>>> vector types, 'NULL' and symbols (as  given by
>>> 'as.name').
>>> 
>>> But is.recursive(as.name('foo')) ==
>>> is.recursive(quote(foo)) == FALSE.
>> 
>> Sorry, this *is* consistent with the behavior.  But if we
>> read "the atomic vector types, 'NULL' and symbols" as a
>> list of mutually exclusive categories, then
>> is.atomic(NULL)==FALSE is inconsistent.

hw> And the sentence could be more clearly written as:

hw> Most types of language objects are regarded as
hw> recursive, except for atomic vector types, 'NULL' and
hw> symbols (as given by 'as.name').

yes, that's shorter and more elegant.
But before amending that, why
"language objects" instead of just "R objects" or "objects"?
In the context of S and R when I'd hear  "language objects",
I'd think of the results of
expression() , formula(), substitute(), quote()
i.e., objects for which  is.language() was true.

So, I'm proposing

  Most types of objects are regarded as recursive, except for
  atomic vector types, \code{NULL} and symbols (as given by
  \code{\link{as.name}}).
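The cases under discussion can be checked directly in a session (outputs per the behaviour described above):

```r
# Symbols (as given by quote()/as.name()) are neither recursive nor atomic:
is.recursive(quote(foo))   # FALSE
is.atomic(quote(foo))      # FALSE
# NULL is not recursive either:
is.recursive(NULL)         # FALSE
# Lists are recursive; atomic vectors are atomic:
is.recursive(list(1))      # TRUE
is.atomic(1:3)             # TRUE
```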

--
Martin Maechler

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Apparent bug in summaryBy (PR#13941)

2009-09-04 Thread paterno
Full_Name: Marc Paterno
Version: 2.9.2
OS: Mac OS X 10.5.8
Submission from: (NULL) (99.53.212.55)


summaryBy() produces incorrect results when given some data frames. Below is a
transcript of a session showing the result, in a data frame with 2 observations
of 2 variables.
---
thomas:999 paterno$ R --vanilla

R version 2.9.2 (2009-08-24)
Copyright (C) 2009 The R Foundation for Statistical Computing
ISBN 3-900051-07-0

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

  Natural language support but running in an English locale

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

> library(doBy)
> tmp = read.table("moduledata_999_1.txt",header=FALSE)
> str(tmp)
'data.frame':   2 obs. of  2 variables:
 $ V1: Factor w/ 2 levels "b","c": 2 1
 $ V2: num  1 2
> tmp
  V1 V2
1  c  1
2  b  2
> summaryBy(V2~V1,tmp)
  V1 V2.mean
1  b   1
2  c   2
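For comparison, base R keeps the grouping levels and values paired correctly on the same data (a small sketch; `tapply()` is used here just as a cross-check, it is not part of the bug report):

```r
# Same two observations as in the transcript above
tmp <- data.frame(V1 = c("c", "b"), V2 = c(1, 2))
# Mean of V2 within each level of V1: "b" should map to 2, "c" to 1
tapply(tmp$V2, tmp$V1, mean)
# b c
# 2 1
```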

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] building r packages for windows on a mac/linux

2009-09-04 Thread Hin-Tak Leung
--- On Thu, 3/9/09, Vinh Nguyen  wrote:

> hi hin-tak,
> 
> i'm trying to build r packages for windows on a
> mac/linux.  i guess
> this used to possible and supported, but is no longer
> supported.  i
> ran into this post of yours,
> https://stat.ethz.ch/pipermail/r-devel/2009-July/053971.html,
> and hope
> u don't mind me emailing you.
> 
> how did you set up your system to do this sort of
> thing?  i guess the
> only thing i don't get from your post is
> <wrt>.  what does
> this refer to?  i do have mingw for macs, taken from
> http://crossgcc.rts-software.org/doku.php
> .  i tried compiling a
> package using your method but it didn't work, couldn't find
> things
> such as R.h.  i'm pretty sure it is the
> <wrt> because i don't
> know what you are referring to with this.
> 
> i tried building R using mingw, but i got to the following
> error:
> sh: ../../../bin/Rterm.exe: cannot execute binary file
> make[2]: *** [all] Error 126
> make[1]: *** [R] Error 1
> make: *** [all] Error 2
> 
> can you guide me in the right direction?  thanks.

<wrt> stands for 'windows R top directory' - you need both native R and 
win32 R to cross-compile R packages. (native R for executing R code, win32 R 
for its R.dll for the cross-compiler's linker to resolve symbols) Go back to R 
2.8.x and study the cross-compile instructions and make sure that works, before 
attempting cross-compile with R 2.9.x .

FWIW, I just built win32 snpMatrix against R 2.9.2 last week and released that, 
and also managed to build the chm windows help file for the first time. These 
days, R packages for different platforms only differ by the dll/so (and, to a 
lesser extent, things like the chm file), so you just need to build the dll/so 
to go from one platform to another. chm file building is documented by others. 





__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] building r packages for windows on a mac/linux

2009-09-04 Thread Hin-Tak Leung
--- On Thu, 3/9/09, Vinh Nguyen  wrote:

> hmmm... tried building R-2.8.0 on my
> mac, didn't work.  i think it got to
> the very end before failing:
> i386-mingw32-windres --preprocessor="i386-mingw32-gcc -E
> -xc
> -DRC_INVOKED" -I
> /Users/vinh/Downloads/Rwin/R-2.8.0/include  -I 
> -i
> methods_res.rc -o methods_res.o
> i386-mingw32-gcc  -std=gnu99  -shared -s  -o
> methods.dll methods.def
> class_support.o do_substitute_direct.o init.o
> methods_list_dispatch.o
> slot.o tests.o methods_res.o 
> -L/Users/vinh/Downloads/Rwin/R-2.8.0/bin
>-lR
>   ... DLL made
>   installing DLL
>   collecting R files
>   preparing package methods for lazy loading
>   dumping R code in package `methods'
> cp:
> /Library/Frameworks/R.framework/Resources/library/methods/libs/methods.so:
> No such file or directory
> make[4]: ***
> [/Users/vinh/Downloads/Rwin/R-2.8.0/library/methods/R/methods.rdb]
> Error 1
> make[3]: *** [all] Error 2
> make[2]: *** [pkg-methods] Error 2
> make[1]: *** [rpackage] Error 1
> make: *** [all] Error 2

It is probably wiser to use the last of the 2.8 series (i.e. 2.8.1 patched)

That seems to be buggy - anyhow, as I said, to build a win32 R package you 
need the win32 R.dll; I don't mean you have to build it yourself. You can just 
take it out of the official win32 R installer.

> 
> if i go to bin/R.exe, my wine opens R fine.  so i
> think this is OK.
> my ultimate goal would be to build a package for
> windows.  looking
> through the admin file, i don't understand how i would do
> this (even
> if i didn't get that previous error).  could u kindly
> point me in the
> direction of how to do this?  make what?

The instructions for cross-compiling R packages are in a file called 
"README.packages" or some such under src/gnuwin32. The file was removed in R 
2.9.x (as its content is no longer supported).

BTW, you have not shown your affiliation nor the reason why you want to go this 
way - and I am reluctant to do anonymous one-on-one hand-holding. So please 
keep the CC:, or arrange for commercial consultancy.



__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] calling Lapack and BLAS routines from C

2009-09-04 Thread Marcin Hitczenko
Hi,

I am working on a UNIX machine and I am interfacing R and C via the .C
function. I am trying to call LAPACK and BLAS routines, but I am running
into a problem: while I am able to run the BLAS routines, I cannot
run the LAPACK ones.

I compile my .c file (at end of email) in the following way:

[mhitc...@jlogin2 ~/Cstuff]$ R CMD SHLIB testmore.c

gcc -std=gnu99 -I/home/mhitczen/R-2.9.0/include  -I/usr/local/include   
-fpic  -g -O2 -c testmore.c -o testmore.o
gcc -std=gnu99 -shared -L/usr/local/lib -o testmore.so testmore.o

However, I get the following error in R:

> dyn.load("testmore.so")
Error in dyn.load("testmore.so") :
  unable to load shared library '/home/mhitczen/Cstuff/testmore.so':
/home/mhitczen/Cstuff/testmore.so: undefined symbol: dpotrf_

This error goes away and everything works when I call only the BLAS
routine (in the testmore.c file I simply remove funct2, leaving funct,
which calls the routine "dgemm" declared in the BLAS.h header).

I read the "Writing R Extensions" and created a file:
/R-2.9.0/src/Makevars that contains the line:
PKG_LIBS = $(LAPACK_LIBS) $(BLAS_LIBS) $(FLIBS)

What am I doing wrong? Thanks for your time and effort.

My testmore.c file is:

/* header names were lost in the archive; these are the ones this file needs */
#include <R.h>
#include <R_ext/BLAS.h>
#include <R_ext/Lapack.h>

// FUNCTION CALLING BLAS
void funct(double *x, int *dim, double *y, double *result){

  double one=1.0; double zero =0.0;
  char *transa="N"; char *transb="N";

  F77_CALL(dgemm)(transa, transb, dim, dim, dim,
                  &one, x, dim, y, dim, &zero, result, dim);
}

// FUNCTION CALLING LAPACK
void funct2(double *x, int *dim, double *result){

  char *UorL="L";
  int info=0;

  F77_CALL(dpotrf)(UorL,dim,x,dim, &info);
}



Best,

Marcin Hitczenko

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Suggestion: Allow packages to add additional information to sessionInfo()

2009-09-04 Thread Friedrich Leisch
> On Thu, 3 Sep 2009 11:10:31 -0700,
> Henrik Bengtsson (HB) wrote:

  > On Thu, Sep 3, 2009 at 10:38 AM, Kevin R.
  > Coombes wrote:
  >> [1] I agree that sessionInfo() can be taken further.
  >> [2] I even more strongly agree that it would be a bad idea to allow 
packages
  >> to add features that cause the base sessionInfo() function to fail.
  >> 
  >> Why not add an extra function called something like "packageSessionInfo()"
  >> that would provide the desired hooks but keep them from breaking the base
  >> functionality?

  > The point is that (if so) there should only be *one function* to call
  > for all packages, not one per package, because that would be a pain
  > due to dependencies.  But, sure I'm happy to start with a
  > package[s]SessionInfo() such that

  > c(sessionInfo(), extras=packagesSessionInfo())

  > pretty much return what I wish. Then it might be easier to argue for
  > incorporating the above in sessionInfo() ;)

  > Sorry for not getting it, but I still don't see how adding extra
  > information would break the base functionality?  Can you give some
  > examples?

  > As I said, timeouts can be a problem, and hook functions with side
  > effects that, say, load new packages could give funny results, but I
  > also think a package developer who is capable of setting up a hook
  > function would know how to avoid this.

  > With the default argument of 'extras' to be FALSE, sessionInfo() would
  > work as now, with the extra feature that 'extras=TRUE' can give lots
  > of additional useful information.

I think the concept of hook functions for sessionInfo() makes absolute
sense. Yes, it should be optional to run them, but the default should
be pkghooks=TRUE, because I don't see why they shouldn't run OK in
99.9% of all cases. If a hook doesn't run on a certain platform, that
would be a bug to me and would need to be fixed. Could those who seem
to think such hooks are not a good idea elaborate on what the "danger"
really is? 
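To make the discussion concrete, one possible shape for such a mechanism is sketched below. The function and environment names are purely illustrative, not an actual or proposed R API:

```r
## Hypothetical hook registry for sessionInfo() extras (illustrative only)
.sessionInfoHooks <- new.env()

## A package would register its hook, e.g. from its .onLoad()
registerSessionInfoHook <- function(pkg, fun)
  assign(pkg, fun, envir = .sessionInfoHooks)

## Run all registered hooks, collecting their extra information;
## pkghooks = FALSE reproduces plain sessionInfo() behaviour.
extendedSessionInfo <- function(pkghooks = TRUE) {
  info <- sessionInfo()
  if (pkghooks)
    info$extras <- eapply(.sessionInfoHooks, function(f) f())
  info
}
```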

Best,
Fritz

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] calling Lapack and BLAS routines from C

2009-09-04 Thread Prof Brian Ripley

On Fri, 4 Sep 2009, Marcin Hitczenko wrote:


Hi,

I am working on a UNIX machine and I am interfacing R and C with the .C
function. I am trying to call LAPACK and BLAS routines, but am running
into a problem where, while I am able to run the BLAS routines, I cannot
run the LAPACK routines.

I compile my .c file (at end of email) in the following way:

[mhitc...@jlogin2 ~/Cstuff]$ R CMD SHLIB testmore.c

gcc -std=gnu99 -I/home/mhitczen/R-2.9.0/include  -I/usr/local/include
-fpic  -g -O2 -c testmore.c -o testmore.o
gcc -std=gnu99 -shared -L/usr/local/lib -o testmore.so testmore.o

However, I get the following error in R:


dyn.load("testmore.so")

Error in dyn.load("testmore.so") :
 unable to load shared library '/home/mhitczen/Cstuff/testmore.so':
/home/mhitczen/Cstuff/testmore.so: undefined symbol: dpotrf_

This error goes away and everything works when I simply call the BLAS
routine (In the testmore.c file I simply remove funct2, leaving funct
which calls the routine "dgemm" found in the BLAS.h file).

I read the "Writing R Extensions" and created a file:
/R-2.9.0/src/Makevars that contains the line:
PKG_LIBS = $(LAPACK_LIBS) $(BLAS_LIBS) $(FLIBS)


That is not where it tells you to put the file.  It needs to be in the 
directory where you run the compile (normally the 'src' directory of 
a package), and in your case that seems to be ~/Cstuff.
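Concretely, the fix is just to move the file (a sketch; paths mirror the session above):

```shell
# Put Makevars in the directory where R CMD SHLIB is run, not in the
# R source tree; the compile then picks up PKG_LIBS automatically.
mkdir -p "$HOME/Cstuff"
printf 'PKG_LIBS = $(LAPACK_LIBS) $(BLAS_LIBS) $(FLIBS)\n' > "$HOME/Cstuff/Makevars"
cat "$HOME/Cstuff/Makevars"
# then, from ~/Cstuff:
#   R CMD SHLIB testmore.c
```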


That BLAS works is fortuitous, depending on the way you built R.

[...]

--
Brian D. Ripley,  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK      Fax:  +44 1865 272595

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] building r packages for windows on a mac/linux

2009-09-04 Thread Uwe Ligges
For those who have non-confidential packages without license issues, want 
to build a Windows binary, but do not have Windows available: please also 
note the service provided at

http://win-builder.r-project.org/

Uwe Ligges



Hin-Tak Leung wrote:

--- On Thu, 3/9/09, Vinh Nguyen  wrote:


hmmm... tried building R-2.8.0 on my
mac, didn't work.  i think it got to
the very end before failing:
i386-mingw32-windres --preprocessor="i386-mingw32-gcc -E
-xc
-DRC_INVOKED" -I
/Users/vinh/Downloads/Rwin/R-2.8.0/include  -I 
-i

methods_res.rc -o methods_res.o
i386-mingw32-gcc  -std=gnu99  -shared -s  -o
methods.dll methods.def
class_support.o do_substitute_direct.o init.o
methods_list_dispatch.o
slot.o tests.o methods_res.o 
-L/Users/vinh/Downloads/Rwin/R-2.8.0/bin

   -lR
  ... DLL made
  installing DLL
  collecting R files
  preparing package methods for lazy loading
  dumping R code in package `methods'
cp:
/Library/Frameworks/R.framework/Resources/library/methods/libs/methods.so:
No such file or directory
make[4]: ***
[/Users/vinh/Downloads/Rwin/R-2.8.0/library/methods/R/methods.rdb]
Error 1
make[3]: *** [all] Error 2
make[2]: *** [pkg-methods] Error 2
make[1]: *** [rpackage] Error 1
make: *** [all] Error 2


It is probably wiser to use the last of the 2.8 series (i.e. 2.8.1 patched)

That seems to be buggy - anyhow, as I said, to build a win32 R package you 
need the win32 R.dll; I don't mean you have to build it yourself. You can just 
take it out of the official win32 R installer.


if i go to bin/R.exe, my wine opens R fine.  so i
think this is OK.
my ultimate goal would be to build a package for
windows.  looking
through the admin file, i don't understand how i would do
this (even
if i didn't get that previous error).  could u kindly
point me in the
direction of how to do this?  make what?


The instructions for cross-compiling R packages are in a file called 
"README.packages" or some such under src/gnuwin32. The file was removed in R 
2.9.x (as its content is no longer supported).

BTW, you have not shown your affiliation nor the reason why you want to go this 
way - and I am reluctant to do anonymous one-on-one hand-holding. So please 
keep the CC:, or arrange for commercial consultancy.



__

R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel




Re: [Rd] Including a binary Python Interpreter into a binary R-package for MS Windows

2009-09-04 Thread Guido van Steen
--- On Fri, 9/4/09, Uwe Ligges  wrote:

> MinGW doesn't need to be in the package, because you do not
> want to ship the binaries.

I meant that I would have to include the source code of MinGW, in order to 
build the MinGW compiler in some writable directory on the R user's computer. 
This is because without MinGW I wouldn't be able to build PyMinGW. (PyMinGW is 
a version of Python that - unlike the standard version of Python - does not 
depend on copyrighted MS DLLs.) 

> ... and perhaps you can even stay with one package rather
> than two for different platforms.

Very true!

Best wishes,

Guido

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Apparent bug in summaryBy (PR#13941)

2009-09-04 Thread Duncan Murdoch

pate...@fnal.gov wrote:

Full_Name: Marc Paterno
Version: 2.9.2
OS: Mac OS X 10.5.8
Submission from: (NULL) (99.53.212.55)


summaryBy() produces incorrect results when given some data frames. Below is a
transcript of a session showing the result, in a data frame with 2 observations
of 2 variables.
  


This looks like a bug in the doBy package (or something it uses), not a 
bug in R.  I've cc'd the maintainer of that package to let him know 
about it. 


By the way, an easier way to create that data frame is simply

tmp <- data.frame(V1=c("c", "b"), V2=c(1,2))

Duncan Murdoch


---
thomas:999 paterno$ R --vanilla

R version 2.9.2 (2009-08-24)
Copyright (C) 2009 The R Foundation for Statistical Computing
ISBN 3-900051-07-0

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

  Natural language support but running in an English locale

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

  

library(doBy)
tmp = read.table("moduledata_999_1.txt",header=FALSE)
str(tmp)


'data.frame':   2 obs. of  2 variables:
 $ V1: Factor w/ 2 levels "b","c": 2 1
 $ V2: num  1 2
  

tmp


  V1 V2
1  c  1
2  b  2
  

summaryBy(V2~V1,tmp)


  V1 V2.mean
1  b   1
2  c   2

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel





[Rd] Load a package without installing it

2009-09-04 Thread Hadley Wickham
Hi all,

When developing a package, it's often useful to be able to reload it
without re-installing it, restarting R, and loading it again.  To do this
I've written a little script that inspects the package DESCRIPTION and
loads dependencies, data and code - http://gist.github.com/180883.
It's obviously not very general (being tailored to my DESCRIPTION
files) and won't work for packages containing C code, but I hope you
might find it useful nonetheless.
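The core of the idea can be sketched in a few lines (a simplification for illustration, not the gist itself):

```r
## Minimal sketch: read DESCRIPTION, require() the dependencies,
## then source() every file under R/ into the global environment.
load_pkg <- function(path) {
  desc <- read.dcf(file.path(path, "DESCRIPTION"))
  if ("Depends" %in% colnames(desc)) {
    deps <- strsplit(desc[1, "Depends"], ",[[:space:]]*")[[1]]
    deps <- sub("[[:space:]]*\\(.*\\)", "", deps)  # drop version specs
    for (d in setdiff(deps, "R"))
      require(d, character.only = TRUE)
  }
  for (f in list.files(file.path(path, "R"), full.names = TRUE))
    source(f)  # note: everything lands in .GlobalEnv
}
```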

Any comments would be appreciated.

Hadley

-- 
http://had.co.nz/

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Load a package without installing it

2009-09-04 Thread Romain Francois

On 09/04/2009 03:39 PM, Hadley Wickham wrote:


Hi all,

When developing a package, it's often useful to be able to reload it,
without re-installing, re-starting R and re-loading.  To do this I've
written a little script that inspects the package description and
loads dependencies, data and code - http://gist.github.com/180883.
It's obviously not very general (being tailored to my description
files) and won't work for packages containing C code, but I hope you
might find it useful nonetheless.

Any comments would be appreciated.

Hadley


Nice. I would guess many of us have our own versions of this; it would be
good to formalise it so that it could deal with:
- namespaces: you might want your unexported functions to be kept
separate from your exported functions. It looks like your function loads
everything into .GlobalEnv.
- S4 classes: what happens if you re-source the definition of an S4
class and continue to use objects created before the re-sourcing?


Romain

--
Romain Francois
Professional R Enthusiast
+33(0) 6 28 91 30 30
http://romainfrancois.blog.free.fr
|- http://tr.im/xMdt : update on the ant package
|- http://tr.im/xHLs : R capable version of ant
`- http://tr.im/xHiZ : Tip: get java home from R with rJava

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Load a package without installing it

2009-09-04 Thread Hadley Wickham
> Nice. I would guess many of us would have versions of this, it would be good
> to formalise it so that it could deal with :
> - namespaces, you might want your unexported functions to be separate from
> your exported functions. It looks like your function loads everything into
> .GlobalEnv
> - S4 objects, what would happen if you re-source the definition of an S4
> class and continue to manage objects created before the resourcing

I use neither namespaces nor S4, so it's no surprise my code doesn't
handle them ;)

Hadley

-- 
http://had.co.nz/

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Package tests must have extension .R?

2009-09-04 Thread Hadley Wickham
Is this intentional?  .r is accepted most other places.

Hadley

-- 
http://had.co.nz/

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Rscript and default packages

2009-09-04 Thread Simon Urbanek


On Sep 3, 2009, at 13:52 , Romain Francois wrote:


On 09/03/2009 05:23 PM, Duncan Murdoch wrote:


On 03/09/2009 9:53 AM, Romain Francois wrote:

Hi,

Is it possible to embed, inside an R script, the names of the default
packages to be loaded when the script is invoked with Rscript?

I know about the --default-packages argument, but I was wondering if
there was a mechanism to embed this information within the script  
itself.


I don't understand what you'd want here that you don't get with
attach() or require(). Why does it matter if they are default?

Duncan Murdoch


Sorry for being vague. I am more interested in not loading some  
packages:


$ time Rscript  -e "#"

real0m0.224s
user0m0.188s
sys 0m0.032s


$ time Rscript --default-packages="base" -e "#"

real0m0.067s
user0m0.033s
sys 0m0.016s


$ time r -e "#"

real0m0.039s
user0m0.032s
sys 0m0.006s

This is related to the "How to ship R scripts with R packages"
thread. I'd like, for example, to ship a script that I know only
requires the "base" package. How would I specify this from within
the script?




Well, what's wrong with:

#!/usr/bin/Rscript --default-packages=base

ginaz:sandbox$ time ./scr

real0m0.045s
user0m0.027s
sys 0m0.017s

.. as opposed to

#!/usr/bin/Rscript --default-packages=base

ginaz:sandbox$ time ./scr

real0m0.201s
user0m0.166s
sys 0m0.034s

Cheers,
Simon

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Load a package without installing it

2009-09-04 Thread Gabor Grothendieck
This is sufficiently useful that it would be nice to have it
as part of R itself.  For the moment, perhaps you could
make a package of it on CRAN or contribute it to some
other existing CRAN package.

On Fri, Sep 4, 2009 at 9:39 AM, Hadley Wickham wrote:
> Hi all,
>
> When developing a package, it's often useful to be able to reload it,
> without re-installing, re-starting R and re-loading.  To do this I've
> written a little script that inspects the package description and
> loads dependencies, data and code - http://gist.github.com/180883.
> It's obviously not very general (being tailored to my description
> files) and won't work for packages containing C code, but I hope you
> might find it useful nonetheless.
>
> Any comments would be appreciated.
>
> Hadley
>
> --
> http://had.co.nz/
>
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel
>

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Rscript and default packages

2009-09-04 Thread Duncan Murdoch

On 9/4/2009 10:20 AM, Simon Urbanek wrote:

On Sep 3, 2009, at 13:52 , Romain Francois wrote:


On 09/03/2009 05:23 PM, Duncan Murdoch wrote:


On 03/09/2009 9:53 AM, Romain Francois wrote:

Hi,

Is it possible to embed, inside an R script, the names of the default
packages to be loaded when the script is invoked with Rscript.

I know about the --default-packages argument, but I was wondering if
there was a mechanism to embed this information within the script  
itself


I don't understand what you'd want here that you don't get with
attach() or require(). Why does it matter if they are default?

Duncan Murdoch


Sorry for being vague. I am more interested in not loading some  
packages:


$ time Rscript  -e "#"

real0m0.224s
user0m0.188s
sys 0m0.032s


$ time Rscript --default-packages="base" -e "#"

real0m0.067s
user0m0.033s
sys 0m0.016s


$ time r -e "#"

real0m0.039s
user0m0.032s
sys 0m0.006s

This is related to the "How to ship R scripts with R packages"  
thread. I'd like for example to ship a script that I know only  
requires the "base" package. How would I specify this from within  
the script.




Well, what's wrong with:

#!/usr/bin/Rscript --default-packages=base

ginaz:sandbox$ time ./scr

real0m0.045s
user0m0.027s
sys 0m0.017s

.. as opposed to

#!/usr/bin/Rscript --default-packages=base

ginaz:sandbox$ time ./scr

real0m0.201s
user0m0.166s
sys 0m0.034s


Windows doesn't support the #! invocation, but that should work on all 
the other systems that would use scripts.


Duncan Murdoch

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Rscript and default packages

2009-09-04 Thread Romain Francois

On 09/04/2009 04:20 PM, Simon Urbanek wrote:



On Sep 3, 2009, at 13:52 , Romain Francois wrote:


On 09/03/2009 05:23 PM, Duncan Murdoch wrote:


On 03/09/2009 9:53 AM, Romain Francois wrote:

Hi,

Is it possible to embed, inside an R script, the names of the default
packages to be loaded when the script is invoked with Rscript.

I know about the --default-packages argument, but I was wondering if
there was a mechanism to embed this information within the script
itself


I don't understand what you'd want here that you don't get with attach()
or require(). Why does it matter if they are default?

Duncan Murdoch


Sorry for being vague. I am more interested in not loading some packages:

$ time Rscript -e "#"

real 0m0.224s
user 0m0.188s
sys 0m0.032s


$ time Rscript --default-packages="base" -e "#"

real 0m0.067s
user 0m0.033s
sys 0m0.016s


$ time r -e "#"

real 0m0.039s
user 0m0.032s
sys 0m0.006s

This is related to the "How to ship R scripts with R packages" thread.
I'd like for example to ship a script that I know only requires the
"base" package. How would I specify this from within the script.



Well, what's wrong with:


Great. Just a bit of work to generate the shebang line at INSTALL time 
(Rscript is at /usr/local/bin on my machine).


... or perhaps I could use #!/usr/bin/env


#!/usr/bin/Rscript --default-packages=base

ginaz:sandbox$ time ./scr

real 0m0.045s
user 0m0.027s
sys 0m0.017s

.. as opposed to

#!/usr/bin/Rscript --default-packages=base

ginaz:sandbox$ time ./scr

real 0m0.201s
user 0m0.166s
sys 0m0.034s

Cheers,
Simon



--
Romain Francois
Professional R Enthusiast
+33(0) 6 28 91 30 30
http://romainfrancois.blog.free.fr
|- http://tr.im/xMdt : update on the ant package
|- http://tr.im/xHLs : R capable version of ant
`- http://tr.im/xHiZ : Tip: get java home from R with rJava

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Rscript and default packages

2009-09-04 Thread Simon Urbanek


On Sep 4, 2009, at 10:20 , Simon Urbanek wrote:



On Sep 3, 2009, at 13:52 , Romain Francois wrote:


On 09/03/2009 05:23 PM, Duncan Murdoch wrote:


On 03/09/2009 9:53 AM, Romain Francois wrote:

Hi,

Is it possible to embed, inside an R script, the names of the default
packages to be loaded when the script is invoked with Rscript.

I know about the --default-packages argument, but I was wondering  
if
there was a mechanism to embed this information within the script  
itself


I don't understand what you'd want here that you don't get with
attach() or require(). Why does it matter if they are default?

Duncan Murdoch


Sorry for being vague. I am more interested in not loading some  
packages:


$ time Rscript  -e "#"

real0m0.224s
user0m0.188s
sys 0m0.032s


$ time Rscript --default-packages="base" -e "#"

real0m0.067s
user0m0.033s
sys 0m0.016s


$ time r -e "#"

real0m0.039s
user0m0.032s
sys 0m0.006s

This is related to the "How to ship R scripts with R packages"  
thread. I'd like for example to ship a script that I know only  
requires the "base" package. How would I specify this from within  
the script.




Well, what's wrong with:

#!/usr/bin/Rscript --default-packages=base

ginaz:sandbox$ time ./scr

real0m0.045s
user0m0.027s
sys 0m0.017s

.. as opposed to

#!/usr/bin/Rscript --default-packages=base



oops, this one was actually just

#!/usr/bin/Rscript



ginaz:sandbox$ time ./scr

real0m0.201s
user0m0.166s
sys 0m0.034s

Cheers,
Simon

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel






Re: [Rd] Suggestion: Allow packages to add additional information to sessionInfo()

2009-09-04 Thread Martin Morgan
Friedrich Leisch wrote:
>> On Thu, 3 Sep 2009 11:10:31 -0700,
>> Henrik Bengtsson (HB) wrote:
> 
>   > On Thu, Sep 3, 2009 at 10:38 AM, Kevin R.
>   > Coombes wrote:
>   >> [1] I agree that sessionInfo() can be taken further.
>   >> [2] I even more strongly agree that it would be a bad idea to allow 
> packages
>   >> to add features that cause the base sessionInfo() function to fail.
>   >> 
>   >> Why not add an extra function called something like 
> "packageSessionInfo()"
>   >> that would provide the desired hooks but keep them from breaking the base
>   >> functionality?
> 
>   > The point is that (if so) there should only be *one function* to call
>   > for all packages, not one per package, because that would be a pain
>   > due to dependencies.  But, sure I'm happy to start with a
>   > package[s]SessionInfo() such that
> 
>   > c(sessionInfo(), extras=packagesSessionInfo())
> 
>   > pretty much return what I wish. Then it might be easier to argue for
>   > incorporating the above in sessionInfo() ;)
> 
>   > Sorry for not getting it, but I still don't see how adding extra
>   > information would break the base functionality?  Can you give some
>   > examples?
> 
>   > As I said, timeouts can be a problem and possibly also if the hook
>   > functions have side effects that, say, would load new packages, could
>   > give funny results, but I also think a package developer who is
>   > capable of setting up hook functions would know how to avoid this.
> 
>   > With the default argument of 'extras' to be FALSE, sessionInfo() would
>   > work as now, with the extra feature that 'extras=TRUE' can give lots
>   > of additional useful information.
> 
> I think the concept of hook functions for sessionInfo() makes absolute
> sense. Yes it should be optional to run them, but the default should
> be pkghooks=TRUE, because I don't see why they shouldn't run OK in
> 99.9% of all cases. If a hook doesn't run on a certain platform that
> would be a bug to me and need to be fixed. Could those who seem to
> think such hooks are not a good idea elaborate on what the "danger"
> really is? 

In Bioconductor sessionInfo() is an essential tool for basic problem
diagnosis. Very often the culprit is a mismatch between package
versions, which are easy to spot because Bioconductor package versions
are incremented with each R release. So a brief sessionInfo() is really
good. Also, the typical exchange is 'this is my problem' 'sounds like a
package version mismatch, what is your sessionInfo()?' 'this is my
sessionInfo()' 'package x is out of date, follow the directions here...'
though sometimes the middle two steps are skipped when the user provides
sessionInfo up front.

Package dependencies in Bioconductor tend to be more complicated than on
CRAN, so a user might have a dozen loaded and attached packages.
Additional sessionInfo() for each would make it difficult to identify
the most common problem (version mismatch). Since issues are usually
with only one or two of the packages in the session, custom sessionInfo
from all would largely be irrelevant. Since the exchange typically
involves a prompt for sessionInfo() from the user, when the initial
report hints at problems with a specific package, the initial reply
might be 'what is your sessionInfo(packages="LikelyCulprit")?'. There is
also a sense in which sessionInfo() provides a useful distinction
between 'these things are R's business, to manage the search path,
locale, etc' and the business of the package maintainer.

Each Bioconductor release cycle involves maintenance of packages to
track changes in R. This will be the Achilles heel of package-specific
sessionInfo, where a well-meaning developer introduces code that becomes
incompatible in a difficult to detect way, and only rears its head at
the most important time -- when sessionInfo() would be useful for
diagnosis of problems. One not too far-fetched example from a slightly
different context was a change in the internal representation of S4
objects that caused some vignettes to print out the entire object (many
100's of pages) rather than the summary intended. Yes these are bugs,
and I guess 'package x is producing complete garbage for sessionInfo,
likely because your package is out of date. Please update your package
and provide sessionInfo again' might well serve to address the user's
original problem, but it would have been better to spot the outdated
package by looking at the version number produced by sessionInfo().

Martin Morgan

> Best,
> Fritz
> 
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel



[Rd] enabling core dumps

2009-09-04 Thread pleydell
"Writing R Extensions" says

{quotes}
If you have a crash which gives a core dump you can use something like

 gdb /path/to/R/bin/exec/R core.12345

to examine the core dump. If core dumps are disabled...
{unquotes}

Sadly, it doesn't go on to say how to enable core dumps if they are disabled.

I understand that in bash I need to do

$ ulimit -c unlimited

but this doesn't seem to be enough, I still don't find a core file despite

  *** caught segfault ***
 address 0x2028, cause 'memory not mapped'

Possible actions:
 1: abort (with core dump)
 2: normal R exit
 3: exit R without saving workspace
 4: exit R saving workspace
 Selection: 1


I am running Ubuntu jaunty on a laptop. Any ideas as to what I might need to
configure next?

thanks
David






-- 
David Pleydell
UMR BGPI
CIRAD
TA A-54/K
Campus International de Baillarguet
34398 MONTPELLIER CEDEX 5
FRANCE
Tel: +33 4 99 62 48 65 - Secrétariat : +33 4 99 62 48 21
Fax : +33 4 99 62 48 22
http://umr-bgpi.cirad.fr/trombinoscope/pleydell_d.htm
https://sites.google.com/site/drjpleydell/



Re: [Rd] enabling core dumps

2009-09-04 Thread pleydell

I forgot to add that I am compiling with

R CMD SHLIB buggyCode.c --ggdb

thanks
David




Quoting pleyd...@supagro.inra.fr:


"Writing R Extensions" says

{quotes}
If you have a crash which gives a core dump you can use something like

gdb /path/to/R/bin/exec/R core.12345

to examine the core dump. If core dumps are disabled...
{unquotes}

sadly it doesn't go on to say how to enable if core dumps are disabled.

I understand that in bash I need to do

$ ulimit -c unlimited

but this doesn't seem to be enough, I still don't find a core file despite

 *** caught segfault ***
address 0x2028, cause 'memory not mapped'

   Possible actions:
1: abort (with core dump)
2: normal R exit
3: exit R without saving workspace
4: exit R saving workspace
Selection: 1


I am running Ubuntu jaunty on a laptop. Any ideas as to what I might need to
configure next?

thanks
David






--
David Pleydell
UMR BGPI
CIRAD
TA A-54/K
Campus International de Baillarguet
34398 MONTPELLIER CEDEX 5
FRANCE
Tel: +33 4 99 62 48 65 - Secrétariat : +33 4 99 62 48 21
Fax : +33 4 99 62 48 22
http://umr-bgpi.cirad.fr/trombinoscope/pleydell_d.htm
https://sites.google.com/site/drjpleydell/





--
David Pleydell
UMR BGPI
CIRAD
TA A-54/K
Campus International de Baillarguet
34398 MONTPELLIER CEDEX 5
FRANCE
Tel: +33 4 99 62 48 65 - Secrétariat : +33 4 99 62 48 21
Fax : +33 4 99 62 48 22
http://umr-bgpi.cirad.fr/trombinoscope/pleydell_d.htm
https://sites.google.com/site/drjpleydell/



Re: [Rd] enabling core dumps

2009-09-04 Thread Martin Morgan
pleyd...@supagro.inra.fr wrote:
> "Writing R Extensions" says
> 
> {quotes}
> If you have a crash which gives a core dump you can use something like
> 
>  gdb /path/to/R/bin/exec/R core.12345
> 
> to examine the core dump. If core dumps are disabled...
> {unquotes}
> 
> sadly it doesn't go on to say how to enable if core dumps are disabled.
> 
> I understand that in bash I need to do
> 
> $ ulimit -c unlimited
> 
> but this doesn't seem to be enough, I still don't find a core file despite
> 
>   *** caught segfault ***
>  address 0x2028, cause 'memory not mapped'
> 
> Possible actions:
>  1: abort (with core dump)
>  2: normal R exit
>  3: exit R without saving workspace
>  4: exit R saving workspace
>  Selection: 1
> 
> 
> I am running Ubuntu jaunty on a laptop. Any ideas as to what I might need to
> configure next?

not really answering your question, but I find it more useful to

  R -d gdb

or

  R -d gdb -f test.R

where test.R reproduces the bug in some minimal code. A variant is

  R -d valgrind -f test.R

if the memory problem is not easy to spot.
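For reference, "Writing R Extensions" suggests extra valgrind flags for
harder cases; a sketch (test.R is the minimal reproducer from above, and the
options shown are standard valgrind ones):

```shell
# Basic run, as above:
R -d valgrind -f test.R

# More detail: full memcheck with leak checking, as documented in R-exts:
R -d "valgrind --tool=memcheck --leak-check=full" --vanilla -f test.R
```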

Martin

> 
> thanks
> David
> 
> 
> 
> 
> 
>



Re: [Rd] building r packages for windows on a mac/linux

2009-09-04 Thread Vinh Nguyen
Thank you, Professor Ligges, for the build farm.  I was aware of it.
However, I'm just trying to learn how to build things myself.  Thanks
to Hin-Tak, I did successfully cross-build R 2.8 on my Mac.  I will
try to cross-build R 2.9 next based on Hin-Tak's suggestions.

I don't know the exact reason why the R core team dropped support for
the cross-build, but if it's because of a perceived 'lack of
demand,' I'd just like to note and show there is some demand (me), so
hopefully support will be back.

thanks.
vinh
--
This e-mail/fax message, including any attachments, is for the sole
use of the intended recipient(s) and may contain confidential and
privileged information. Any unauthorized review, use, disclosure or
distribution is prohibited. If you are not the intended recipient,
please contact the sender by reply e-mail/fax and destroy all copies
of the original message.



On Fri, Sep 4, 2009 at 2:12 AM, Uwe
Ligges wrote:
> For those who have non-confidential packages without license issues, want to
> build a Windows binary, but do not have Windows available: please also note
> the service provided at
> http://win-builder.r-project.org/
>
> Uwe Ligges
>
>
>
> Hin-Tak Leung wrote:
>>
>> --- On Thu, 3/9/09, Vinh Nguyen  wrote:
>>
>>> hmmm... tried building R-2.8.0 on my
>>> mac, didn't work.  i think it got to
>>> the very end before failing:
>>> i386-mingw32-windres --preprocessor="i386-mingw32-gcc -E
>>> -xc
>>> -DRC_INVOKED" -I
>>> /Users/vinh/Downloads/Rwin/R-2.8.0/include  -I -i
>>> methods_res.rc -o methods_res.o
>>> i386-mingw32-gcc  -std=gnu99  -shared -s  -o
>>> methods.dll methods.def
>>> class_support.o do_substitute_direct.o init.o
>>> methods_list_dispatch.o
>>> slot.o tests.o methods_res.o -L/Users/vinh/Downloads/Rwin/R-2.8.0/bin
>>>   -lR
>>>  ... DLL made
>>>  installing DLL
>>>  collecting R files
>>>  preparing package methods for lazy loading
>>>  dumping R code in package `methods'
>>> cp:
>>>
>>> /Library/Frameworks/R.framework/Resources/library/methods/libs/methods.so:
>>> No such file or directory
>>> make[4]: ***
>>> [/Users/vinh/Downloads/Rwin/R-2.8.0/library/methods/R/methods.rdb]
>>> Error 1
>>> make[3]: *** [all] Error 2
>>> make[2]: *** [pkg-methods] Error 2
>>> make[1]: *** [rpackage] Error 1
>>> make: *** [all] Error 2
>>
>> It is probably wiser to use the last of 2.8 (i.e. 2.8.1patched)
>>
>> That seems to be buggy - anyhow, I said that to build a win32 R package you
>> need to have the win32 R.dll; I don't mean you have to build it yourself. You
>> can just take it out of the official win32 R installer.
>>
>>> if i go to bin/R.exe, my wine opens R fine.  so i
>>> think this is OK.
>>> my ultimate goal would be to build a package for
>>> windows.  looking
>>> through the admin file, i don't understand how i would do
>>> this (even
>>> if i didn't get that previous error).  could u kindly
>>> point me in the
>>> direction of how to do this?  make what?
>>
>> The instruction for cross-compiling R packages is in a file called
>> "README.packages" or some such under src/gnuwin32 . The file was removed in
>> R 2.9.x (as the content is no longer supported).
>>
>> BTW, you have not shown your affiliation nor the reason why you want to go
>> this way - and I am reluctant to do one-on-one hand-holding on anonymity.
>> So please keep the CC:, or arrange for commercial consultancy.
>>
>>
>>    __
>> R-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-devel
>



Re: [Rd] enabling core dumps

2009-09-04 Thread pleydell

not really answering your question, but I find it more useful to

 R -d gdb

or

 R -d gdb -f test.R

where test.R reproduces the bug in some minimal code. A variant is

 R -d valgrind -f test.R

if the memory problem is not easy to spot.


Thanks for your reply Martin

Yes, I have used that route before; I have also been playing with the
emacs "M-x gdb" option as described in the R FAQ. But having no first-hand
experience with core dumps, I am curious why you prefer the -d flag route.
Perhaps I'm wrong, but I thought examining a core dump enables you to
backtrace from the moment things went wrong, which seems to be a useful
trick to have...

... if you manage to enable the core dump option, that is.

cheers
David



Re: [Rd] enabling core dumps

2009-09-04 Thread pleydell

To answer my own question.

My mistake was that "ulimit -c unlimited" applies to the current bash session
only. I had used this call in a bash *shell* buffer in emacs but this was
unable to affect R processes started in emacs with C-u M-x R, hence no core
files. Running the buggy code from R started in a bash shell after running
ulimit resulted in a core file being generated in the R working directory.

It's not the cleanest of emacs solutions, but at least it works.

David



Re: [Rd] enabling core dumps

2009-09-04 Thread Martin Morgan
pleyd...@supagro.inra.fr wrote:
>> not really answering your question, but I find it more useful to
>>
>>  R -d gdb
>>
>> or
>>
>>  R -d gdb -f test.R
>>
>> where test.R reproduces the bug in some minimal code. A variant is
>>
>>  R -d valgrind -f test.R
>>
>> if the memory problem is not easy to spot.
> 
> Thanks for your reply Martin
> 
> Yes, I have used that route before, I have also been playing with the
> emacs "M-x
> gdb" option as describe in the R FAQ. But having no first hand
> expertience with
> core dumps I am curious why you prefer the -d flag route. Perhaps I'm
> wrong but
> I thought examining a core dump enables you to backtrace from the moment
> things
> went wrong, this seems to be a useful trick to have...
> 
> ... if you manage to enable the core dump option that is.

usually what happens is (# meant to be a comment char)

  % R -d gdb -f test.R
  gdb> run
  ...segfault happens, breaks into gdb
  gdb> bt # print the backtrace
  gdb> up # move up the stack, to get to 'your' frame
  gdb> l # show source listing, use -O0 compiler flag, see gdb> help dir
  gdb> print some_suspect_variable
  gdb> call Rf_PrintValue(some_suspect_sexp)
  gdb> break suspect_function
  gdb> run # restart script, but break at suspect_function

and so on, i.e., you've got all the info you need. A neat trick is to
leave gdb running, repair and R CMD SHLIB your C code, return to gdb and

  gdb> run

to restart the same script but using the new shared lib (possibly
preserving breakpoints and other debugging info you'd used in previous
sessions).

I'm a heavy emacs user but find it easier to stick with gdb from the
shell -- one less layer to get in the way, when I'm confused enough as
it is.
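The same session can also be driven non-interactively; a sketch using gdb's
batch mode (assumes gdb is installed; --debugger-args is the standard R
front-end option for passing arguments to the debugger):

```shell
# Run test.R under gdb, printing a backtrace automatically on a crash:
R -d gdb --debugger-args="-batch -ex run -ex bt" -f test.R
```

This is handy in scripts or CI, where there is no one at the keyboard to
type "run" and "bt".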

Martin
> 
> cheers
> David
> 
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel



Re: [Rd] enabling core dumps

2009-09-04 Thread pleydell

usually what happens is (# meant to be a comment char)

 % R -d gdb -f test.R
 gdb> run
 ...segfault happens, breaks into gdb
 gdb> bt # print the backtrace
 gdb> up # move up the stack, to get to 'your' frame
 gdb> l # show source listing, use -O0 compiler flag, see gdb> help dir
 gdb> print some_suspect_variable
 gdb> call Rf_PrintValue(some_suspect_sexp)
 gdb> break suspect_function
 gdb> run # restart script, but break at suspect_function

and so on, i.e., you've got all the info you need. A neat trick is to
leave gdb running, repair and R CMD SHLIB your C code, return to gdb and

 gdb> run

to restart the same script but using the new shared lib (possibly
preserving breakpoints and other debugging info you'd used in previous
sessions).

I'm a heavy emacs user but find it easier to stick with gdb from the
shell -- one less layer to get in the way, when I'm confused enough as
it is.


Wow! Thanks for the detailed reply, your approach makes perfect sense...

... especially given that my core file was, for some unknown reason, 0
bytes, which gdb didn't find too funny.

cheers
David



Re: [Rd] enabling core dumps

2009-09-04 Thread Simon Urbanek

On Sep 4, 2009, at 12:11 , pleyd...@supagro.inra.fr wrote:


not really answering your question, but I find it more useful to

R -d gdb

or

R -d gdb -f test.R

where test.R reproduces the bug in some minimal code. A variant is

R -d valgrind -f test.R

if the memory problem is not easy to spot.


Thanks for your reply Martin

Yes, I have used that route before; I have also been playing with
the emacs "M-x gdb" option as described in the R FAQ. But having no
first-hand experience with core dumps, I am curious why you prefer
the -d flag route. Perhaps I'm wrong, but I thought examining a core
dump enables you to backtrace from the moment things
went wrong, which seems to be a useful trick to have...



.. this is the same with gdb - the core dump just saves the same state
that you have at the point where gdb comes in. The only practical
difference (I'm aware of) is that with a core dump you can repeatedly
restart your analysis back to the point of the crash without running all
the code that led to it, at the cost of storing the entire memory
content (which can be huge these days, so often using gdb is sufficient
and faster...).




... if you manage to enable the core dump option that is.



ulimit -c unlimited
is the right way, but you a) have to have rights to do that and b) the
kernel must have core dumps enabled. On Debian I have no issue:


urba...@corrino:~$ ulimit -c unlimited
urba...@corrino:~$ R --slave < segfault.R

 *** caught segfault ***
address 0x3, cause 'memory not mapped'

Possible actions:
1: abort (with core dump, if enabled)
2: normal R exit
3: exit R without saving workspace
4: exit R saving workspace
Selection: 1
aborting ...
Segmentation fault (core dumped)

Note the "core dumped" in the last line. Also some systems put core  
dumps in a dedicated directory, not in the current one. If in doubt,  
google for core dumps and your distro - this is not really an R  
issue ...
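To summarize the moving parts (Linux assumed; the exact path a core lands in
is governed by the kernel's core_pattern, which varies by distro):

```shell
# Per-session: raise the core file size limit in the shell that starts R
ulimit -c unlimited
ulimit -c                         # should now report: unlimited

# Where the kernel writes cores ("core" means the current directory;
# a leading "|" means cores are piped to a handler such as apport):
cat /proc/sys/kernel/core_pattern
```

Note the limit applies only to processes started from that same shell, which
is exactly the pitfall described later in this thread.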


Cheers,
Simon



Re: [Rd] Dependencies of packages' CHECK....

2009-09-04 Thread Allen S. Rout


> On Sep 2, 2009, at 2:53 , Allen S. Rout wrote:
>
>> I'm working to automate the building of RPM packages for CRAN &c.
>> In the process, I'm trying to get a sense of the correct
>> dependencies.
>>
>> [...] 
>>
>> In other words, to check properly, I need to treat Suggests and
>> Imports as Depends.
>>
>> [...]
>>
>> So: does this seem silly, or is that just The Way It is?


Uwe Ligges  writes:

> Yes, it is the way it is says the Windows binary package maintainer.

Simon Urbanek  writes:

> [ ... Yep ... ] 


OK, thanks.  Clear is good. :) There followed some offline discussion
with Simon, in which he patiently relieved me of some measure of
ignorance.

I'm chewing on this so hard because I want to do things The Right Way.
Here's the Most Right I've gotten so far; Please correct and/or throw
vegetables as indicated.



1) Express binary package dependencies according to Depends and Imports.
   I'll call this the 'narrow dependency graph'. 

2) As part of the binary package build process, run CHECK
   with _R_CHECK_FORCE_SUGGESTS_=false.

I'll pull nomenclature out of my ear and call these "built" but not
"checked".

3) Build all binary packages which are downstream according to all of
   Depends, Imports, Suggests, and Extends.  I'll call this the 'broad
   dependency graph'.

4) Install all the packages in the broad dependency graph.

5) For each package in the broad graph, run CHECK with
   _R_CHECK_FORCE_SUGGESTS_=true.

Then the affected packages are "checked".  Perhaps this can be noted
with a signature.
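Step 2 above, as a command line (the tarball name is a placeholder; note
that the check environment variable really does carry leading and trailing
underscores):

```shell
# Check a source package without insisting that Suggests are installed:
_R_CHECK_FORCE_SUGGESTS_=false R CMD check mypkg_1.0.tar.gz
```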



I'd like to get at least to "Well, that doesn't sound too stupid"
before I turn around trying to sell that evolution to the RPM-flavored
list. :)



- Allen S. Rout



Re: [Rd] Including a binary Python Interpreter into a binary R-package for MS Windows

2009-09-04 Thread Guido van Steen
Hi Gabor, 

--- On Thu, 9/3/09, Gabor Grothendieck  wrote:

> I've tried both these approaches.
> 
> See Ryacas source code to see an example of the binary
> download approach.
> It has the advantage that the software loads and runs
> faster.
> 
> Nevertheless, I moved from the binary download approach to
> the Jython approach
> for my next package, rSymPy, since Jython gives a single
> approach that works
> on all platforms and installation becomes a "no brainer"
> since its all
> self contained.

Unzipping an archive with e.g. "python.exe" and "python25.dll" should also be 
quite easy. Python's main modules would all be included in "python25.dll". For 
add-on modules you could use a modified import mechanism, so that you can place 
these modules one level deeper in the directory structure. 

> In fact, in most cases it's just a matter of
> install.packages("rSymPy"),
> like any other package.
> 
> Thus, while it's true that Jython is slower, if users can't install it
> in the first place, or it's too difficult to install, no one will try
> it, and then it does not matter how fast it is, since it will be
> unused or less used.
> 
> One caveat is that although Jython does make installation much easier,
> it's still possible to have problems, e.g. the user does not have Java,
> has the wrong version of Java, or needs certain permissions.  On Vista
> they may need to run R elevated.

I can imagine. 

> I expect that as new
> versions of Jython
> become available (in fact a more recent one became
> available after the
> last release of rSymPy) things will further improve.
> 
> For more info on possible problems with each approach see
> the
> troubleshooting sections of Ryacas and rSymPy home pages:
> 
> http://Ryacas.googlecode.com
> http://rSymPy.googlecode.com

Thanks. I will take a look at both projects. 

Guido 






[Rd] asking for suggestions: interface for a C++ class

2009-09-04 Thread Yurii Aulchenko

Dear All,

I would like some advice on designing an R library, and thought
that R-devel may be the best place to ask, given that so many people
who are highly expert in R are around.


We are at an early stage of designing an R library, which is  
effectively an interface to a C++ library providing fast access to  
large matrices stored on HDD as binary files. The core of the C++  
library is a relatively sophisticated class, which we try to "mirror"
using an S4 class in R. Basically when a new object of that class is  
initiated, the C++ constructor is called and essential elements of the  
new object are reflected as slots of the R object.


Now, as you can imagine, the problem is that if the R object is removed
using, say, the "rm" command, and not our specifically designed one, the C++
object still hangs around in RAM until the R session is terminated. This
is not nice, and may also be a problem, as the C++ object may allocate a
large part of RAM. We can of course replace the generic "rm" and "delete"
functions, but this is definitely not a nice solution.


Sounds like a rather common problem people may face, but unfortunately I
was not able to find a solution.


I will greatly appreciate any suggestions.

many thanks in advance,
Yurii



Re: [Rd] asking for suggestions: interface for a C++ class

2009-09-04 Thread Seth Falcon
* On 2009-09-04 at 22:54 +0200 Yurii Aulchenko wrote:
> We are at an early stage of designing an R library, which is effectively an 
> interface to a C++ library providing fast access to large matrices stored 
> on HDD as binary files. The core of the C++ library is relatively 
> sophisticated class, which we try to "mirror" using an S4 class in R. 
> Basically when a new object of that class is initiated, the C++ constructor 
> is called and essential elements of the new object are reflected as slots 
> of the R object.

Have a look at external pointers as described in the Writing R
Extensions Manual.

> Now as you can imagine the problem is that if the R object is removed using 
> say "rm" command, and not our specifically designed one, the C++ object 
> still hangs around in RAM until R session is terminated. This is not nice, 
> and also may be a problem, as the C++ object may allocate large part of 
> RAM. We can of cause replace generic "rm" and "delete" functions, but this 
> is definitely not a nice solution.

You likely want a less literal translation of your C++ object into R's
S4 system.  One slot should be an external pointer which will give you
the ability to define a finalizer to clean up when the R level object
gets gc'd.
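The finalizer-on-external-pointer pattern has an R-level analogue,
reg.finalizer(), which makes the lifecycle easy to demonstrate without any C
code (a sketch run through Rscript; the environment stands in for the slot
holding the external pointer):

```shell
# Show a finalizer firing when an R object becomes unreachable:
Rscript -e '
  e <- new.env()                      # stand-in for the externalptr slot
  reg.finalizer(e, function(obj) cat("releasing C++ resource\n"))
  rm(e)
  invisible(gc())                     # finalizer runs during collection
'
```

In the C version, the registered routine would call the C++ destructor (or
delete) on the pointer carried by the external pointer object.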

+ seth

-- 
Seth Falcon | @sfalcon | http://userprimary.net/user



Re: [Rd] asking for suggestions: interface for a C++ class

2009-09-04 Thread Simon Urbanek

Yurii,

On Sep 4, 2009, at 16:54 , Yurii Aulchenko wrote:


Dear All,

I would like to have an advice for designing an R library, and  
thought that R-devel may be the best place to ask given so many  
people who are highly expert in R are around.


We are at an early stage of designing an R library, which is  
effectively an interface to a C++ library providing fast access to  
large matrices stored on HDD as binary files.


[FWIW there are already several packages that do what you describe -
see e.g. ff, bigmemory, nws, ...]



The core of the C++ library is relatively sophisticated class, which  
we try to "mirror" using an S4 class in R. Basically when a new  
object of that class is initiated, the C++ constructor is called and  
essential elements of the new object are reflected as slots of the R  
object.


Now as you can imagine the problem is that if the R object is  
removed using say "rm" command, and not our specifically designed  
one, the C++ object still hangs around in RAM until R session is  
terminated.


You must have some link between the S4 object and your C++ object -  
ideally an external pointer - so all you have to do is to attach a  
finalizer to it via R_RegisterCFinalizer or R_RegisterCFinalizerEx. In  
that finalizer you simply free the C++ object and all is well.


Note that R uses a garbage collector so the object won't go away  
immediately after it went out of scope - only after R thinks it needs  
to reclaim memory. You can use gc() to force garbage collection to  
test it.



This is not nice, and also may be a problem, as the C++ object may  
allocate large part of RAM. We can of cause replace generic "rm" and  
"delete" functions, but this is definitely not a nice solution.




... and it doesn't tackle the issue - objects can go out of scope by  
other means than just rm(), e.g.:

f <- function() { ...; myGreatObject }
f()
# the great object is gone now since it was not assigned anywhere


Sounds like rather common problem people may face, but unfortunately  
I was not able to find a solution.




Cheers,
Simon



[Rd] Viewing pdfs from inst/doc

2009-09-04 Thread rudjer

Writing R extensions says:  

In addition to the help files in Rd format, R packages allow the inclusion
of documents in arbitrary other formats. The standard location for these is
subdirectory inst/doc of a source package, the contents will be copied to
subdirectory doc when the package is installed. Pointers from package help
indices to the installed documents are automatically created. Documents in
inst/doc can be in arbitrary format, however we strongly recommend to
provide them in PDF format, such that users on all platforms can easily read
them.

My question is simply: how?  The function vignette() provides a convenient
way to read properly Sweaved vignettes, but what about plain old pdfs that
someone like me would like to stick into inst/doc and then view?  It seems
possible to make a modified vignette function to do this using
print.vignette, but having started down this road, I got the strong
sensation of reinventing the wheel, and the inevitably related sensation
that I wasn't going to know what to call my new wheel when it was created.
I recognize that the current setup is supposed to encourage proper
vignettes, but sometimes courage fails.

A related question is whether there is a convenient substitute for a
package-specific function like this:
FAQ <- function (pkg = "quantreg") 
  file.show(file.path(system.file(package = pkg), "FAQ"))

to read the faq that I've written for the package and placed in the inst/
directory.

-- 
View this message in context: 
http://www.nabble.com/Viewing-pdfs-from-inst-doc-tp25302477p25302477.html
Sent from the R devel mailing list archive at Nabble.com.



Re: [Rd] Viewing pdfs from inst/doc

2009-09-04 Thread Barry Rowlingson
On Fri, Sep 4, 2009 at 10:44 PM, rudjer wrote:
>
> Writing R extensions says:
>
> In addition to the help files in Rd format, R packages allow the inclusion
> of documents in arbitrary other formats. The standard location for these is
> subdirectory inst/doc of a source package, the contents will be copied to
> subdirectory doc when the package is installed. Pointers from package help
> indices to the installed documents are automatically created. Documents in
> inst/doc can be in arbitrary format, however we strongly recommend to
> provide them in PDF format, such that users on all platforms can easily read
> them.
>
> My question is easily How?  The function vignette() provides a convenient
> way to read properly Sweaved
> vignettes, but what about plain old pdfs that someone like me would like to
> stick into inst/doc and
> then view?  It seems possible to make a modified vignette function to do
> this using print.vignette,
> but having started down this road, I got the strong sensation of reinventing
> the wheel and the
> inevitably related sensation that I wasn't going to know what to call my new
> wheel when it was
> created.  I recognize that the current setup is supposed to encourage proper
> vignettes, but sometimes
> courage fails.
>
> A related questions is whether there is a convenient substitute for a
> package specific function like this:
>    FAQ <- function (pkg = "quantreg")
>          file.show(file.path(system.file(package = pkg), "FAQ"))
>
> to read the faq that I've written for the package and placed in the inst/
> directory.

 I think I tried to do this a little while ago, and trawling R-help
and R-dev came up with the suggestion of putting something in a demo
section. Hence I have foo/demo/bar.R which is:

pdf <- system.file("doc/bar.pdf", package = "foo")

if (.Platform$OS.type == "windows") {
  shell.exec(pdf)
} else {
  system(paste(shQuote(getOption("pdfviewer")), shQuote(pdf)),
         wait = FALSE)
}

 Then when a user does demo(bar) the PDF pops up. I document this in
the Rd doc for bar.

 It does seem a bit kludgy, and maybe there's a need for a package to
handle all this...

Barry
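A package-level helper generalizing the demo trick might look like the sketch below; the name view_doc and its error handling are illustrative choices, not an existing API:

```r
## Hypothetical helper: open a file shipped in a package's inst/doc.
view_doc <- function(file, package) {
  path <- system.file("doc", file, package = package)
  if (!nzchar(path))
    stop(sprintf("no installed file '%s' in package '%s'", file, package))
  if (.Platform$OS.type == "windows") {
    shell.exec(path)
  } else {
    system(paste(shQuote(getOption("pdfviewer")), shQuote(path)),
           wait = FALSE)
  }
  invisible(path)
}
```

An exported function along these lines would avoid abusing demo() for something that is not really a demo.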



Re: [Rd] Viewing pdfs from inst/doc

2009-09-04 Thread Romain Francois

On 09/04/2009 11:58 PM, Barry Rowlingson wrote:


[... full quote of Barry's message trimmed ...]

print.vignette knows that it needs to factor out the "open a pdf file" 
part; see the commented section in its source:


> utils:::print.vignette
function(x, ...) {
    if (length(x$pdf)) {
        ## Should really abstract this into a BioC style
        ## openPDF() along the lines of browseURL() ...
        if (.Platform$OS.type == "windows")
            shell.exec(x$pdf)
        else
            system(paste(shQuote(getOption("pdfviewer")), shQuote(x$pdf)),
                   wait = FALSE)
    } else {
        warning(gettextf("vignette '%s' has no PDF", x$topic),
                call. = FALSE, domain = NA)
    }
    invisible(x)
}
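The comment in that source hints at a browseURL()-style abstraction; here is a minimal sketch of such an openPDF() (the name echoes the Bioconductor-style helper the comment mentions — this is not a function in utils):

```r
## Sketch of the openPDF() abstraction the comment above asks for.
openPDF <- function(file) {
  stopifnot(is.character(file), length(file) == 1L, file.exists(file))
  if (.Platform$OS.type == "windows") {
    shell.exec(file)
  } else {
    viewer <- getOption("pdfviewer")
    if (is.null(viewer) || !nzchar(viewer))
      stop("no PDF viewer configured; see ?options under 'pdfviewer'")
    system(paste(shQuote(viewer), shQuote(file)), wait = FALSE)
  }
  invisible(file)
}
```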



You can trick the system with something like this:

fake.vignette <- function(file = system.file("doc", vignette, package = package),
                          vignette, package, topic = vignette) {
    ## structure() takes the object itself as its .Data argument; passing
    ## pdf and topic as bare named arguments would fail with
    ## "argument '.Data' is missing", so wrap the fields in a list.
    structure(list(pdf = file, topic = topic), class = "vignette")
}
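As a quick check of the dispatch, any list carrying a pdf field and the class "vignette" is routed through print.vignette when printed (the path below is a placeholder):

```r
## Any object of class "vignette" with a $pdf field dispatches to
## utils:::print.vignette on print().
v <- structure(list(pdf = "/path/to/doc/bar.pdf", topic = "bar"),
               class = "vignette")
stopifnot(inherits(v, "vignette"), identical(v$topic, "bar"))
# print(v)  # would invoke the platform PDF viewer on v$pdf
```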

Romain

--
Romain Francois
Professional R Enthusiast
+33(0) 6 28 91 30 30
http://romainfrancois.blog.free.fr
|- http://tr.im/xMdt : update on the ant package
|- http://tr.im/xHLs : R capable version of ant
`- http://tr.im/xHiZ : Tip: get java home from R with rJava
