Re: [Rd] build time dependency

2009-09-28 Thread Romain Francois

On 09/27/2009 08:27 PM, Uwe Ligges wrote:




Romain Francois wrote:

Hello,

Is there such a thing as a build-time dependency between packages:
package B needs package A in order to build, but once built it
lives without it?



Do you mean at a) "R CMD build" time or at b) "R CMD INSTALL" time?


I meant R CMD INSTALL (but probably also R CMD build --binary), I need 
to check the code to see where this overlaps.




For a), you probably do not need to declare it in DESCRIPTION at all
(untested).
For b), you need to declare it. I feel uncomfortable changing the
Depends field between the source and binary versions of a package. At least,
it is not documented to work, and if it works (have you tested that?), it
might not work in some future release of R.
But since you gave at least two reasonable examples of an
INSTALL-time-only dependency, you might want to contribute some patches
to handle such a thing.


Sure. I'll have a look at adding an "InstallDependencies" (or whatever other 
spelling) field along the lines of Depends, but for INSTALL time.
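For concreteness, such a field might look like this in a package's DESCRIPTION (purely hypothetical: "InstallDependencies" is only the name proposed above, "somePackage" is a made-up example, and current R would not understand the field):

```
Package: somePackage
Version: 0.1-0
Depends: R (>= 2.9.0)
InstallDependencies: roxygen, ant
```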


Romain



Best wishes,
Uwe



For example, depending on the ant package to compile Java code, or
on roxygen to roxygenize the code, ...

Adding roxygen to Depends works, but then the installed package does
not depend on it and therefore loads it for nothing. Maybe I can
remove roxygen from Depends as part of ./configure[.win] ...

Romain



--
Romain Francois
Professional R Enthusiast
+33(0) 6 28 91 30 30
http://romainfrancois.blog.free.fr
|- http://tr.im/ztCu : RGG #158:161: examples of package IDPmisc
|- http://tr.im/yw8E : New R package : sos
`- http://tr.im/y8y0 : search the graph gallery from R

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] model.matrix troubles with AlgDesign

2009-09-28 Thread Ulrike Groemping

Dear DevelopeRs,

in continuing with my suite of packages on experimental design, I am stuck
with an issue that appears to be related to the package AlgDesign - I have
tried to get it solved by Bob Wheeler, but he seems to be stuck as well. 

Whenever AlgDesign is loaded, some of my code no longer works. For
example, in a fresh R session: 

require(DoE.base)
fac.design(nlevels=c(2,6,2))
require(AlgDesign)
fac.design(nlevels=c(2,6,2))
> Error in nrow(aus) : object 'aus' not found 

The reason seems to be that AlgDesign creates a function
model.matrix.formula that only finds variables that are in the global
environment or in the data frame given with the formula, but not
calculation results from the intermediate calling environment. 

Results from traceback():
9: nrow(aus)
8: eval(expr, envir, enclos)
7: eval(predvars, data, env)
6: model.frame.default(object, data, xlev = xlev)
5: model.frame(object, data, xlev = xlev)
4: model.matrix.default(frml, data, ...)
3: model.matrix.formula(1:nrow(aus) ~ ., data = aus)
2: model.matrix(1:nrow(aus) ~ ., data = aus)
1: fac.design(nlevels = c(2, 6, 2))

If I reset model.matrix.formula to model.matrix.default, the problem
disappears (but AlgDesign's convenience functions for squares etc. then no
longer work). In this particular case, I can also avoid the issue by
modifying the formula in fac.design, removing the left-hand side. But this
just means waiting for the next place where trouble occurs. Between steps 3
and 4 of the traceback(), AlgDesign's function model.matrix.formula modifies
the formula frml using AlgDesign's function expand.formula:

model.matrix.formula <- function (frml, data = sys.frame(sys.parent()), ...) 
{
    if (!missing(data)) {
        if (!inherits(data, "data.frame")) 
            stop("data must be a data.frame")
        if (!inherits(frml, "formula")) 
            stop("frml must be a formula")
        frml <- expand.formula(frml, colnames(data))
    }
    model.matrix.default(frml, data, ...)
}


I have looked at expand.formula as well, and I've been wondering whether a
simple fix can be found by adding environment information (which?) within
that function (I believe that the relevant portion of the code is included
below):

expand.formula <- function (frml, varNames, const = TRUE, numerics = NULL) 
{
    ## omitted quite a bit of code 
    ## ...
    frml <- deparse(frml, width = 500)
    while ((0 != (pos <- findFunction("quad", frml))[1]) ||
           (0 != (pos <- findFunction("cubicS", frml))[1]) ||
           (0 != (pos <- findFunction("cubic", frml))[1])) {
        prog <- substr(frml, pos[1], pos[2])
        strHead <- substr(frml, 1, pos[1] - 1)
        strTail <- substr(frml, pos[2] + 1, nchar(frml))
        prog <- eval(parse(text = prog))
        frml <- paste(strHead, prog, strTail, sep = "")
    }
    if (0 != (pos <- findDot(".", frml))[1]) {
        strHead <- substr(frml, 1, pos[1] - 1)
        strTail <- substr(frml, pos[2] + 1, nchar(frml))
        prog <- eval(parse(text = "doDot()"))
        frml <- paste(strHead, prog, strTail, sep = "")
    }
    if (!const) 
        frml <- paste(frml, "+0", sep = "")
    frml <- as.formula(frml)
    frml
}
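One plausible reading of the failure (a sketch only; make_formula below is a made-up stand-in for what fac.design does, not AlgDesign code): the deparse()/as.formula() round trip in expand.formula resets the formula's environment, so objects such as aus that were visible where the formula was created are no longer found when model.frame() later evaluates it. Passing the original environment to as.formula() preserves the lookup:

```r
# 'aus' exists only inside this function, much as it does inside fac.design
make_formula <- function() {
  aus <- data.frame(x = 1:3)
  1:nrow(aus) ~ x
}
frml <- make_formula()

# Re-parsing the deparsed text silently moves the formula to the caller's
# environment, where 'aus' does not exist:
frml2 <- as.formula(deparse(frml))
identical(environment(frml), environment(frml2))   # FALSE

# Supplying the original environment keeps 'aus' reachable:
frml3 <- as.formula(deparse(frml), env = environment(frml))
identical(environment(frml), environment(frml3))   # TRUE
```

If this is indeed the mechanism, a candidate fix in expand.formula would be to capture environment(frml) before the deparse() and pass it to the final as.formula() call.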

Any help would be greatly appreciated.

Regards, Ulrike

-- 
View this message in context: 
http://www.nabble.com/model.matrix-troubles-with-AlgDesign-tp25641680p25641680.html
Sent from the R devel mailing list archive at Nabble.com.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Windows Laptop specification query

2009-09-28 Thread Prof Brian Ripley

On Mon, 28 Sep 2009, Sean O'Riordain wrote:


Good morning Keith,

Have a look at
http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021

The short answer is that "it depends"...
a) memory is limited under windows


Yes, but 64-bit builds can be used on Windows -- that needs commercial 
compilers and there are commercial vendors of such builds.


Even with the CRAN binary, a 64-bit version of Windows offers double 
the memory over a (vanilla) 32-bit version.



b) R is essentially a serial program - HOWEVER it depends what you're
actually doing - if you're working with large matrices then there are
parallel versions of BLAS that can be used...  On a multi-core windows
machine with lots of memory you can of course run up multiple copies of R
and run each independently


There are several packages that parallelize their computations with 
MPI etc, and others that help with parallelization (papply, foreach, 
gputools, ...).  And apart from Rmpi/rpvm/snow there is also 
'multicore', but not on Windows.  See the R-sig-hpc list for follow up 
on such issues.


As for Vista vs Windows 7, this is not the right list but Windows 7 
behaves just like a version of Vista as far as we have explored it 
(and the current rw-FAQ includes it and Server 2008 in the Vista 
section).


Many of us have bought dual quad-core servers in the last year or so: 
that includes Uwe Ligges' winbuilder machine.  I suspect most of the 
usage is separate R jobs running simultaneously: certainly that is the 
case in my dept (where there are at least 6 8-core servers running R 
jobs).




Kind regards,
Sean

On Mon, Sep 28, 2009 at 4:40 AM, Keith Satterley  wrote:


I've read some postings back in 2002/2006 about running R on multiple core
CPUs. The answer was basically separate processes work fine, but
parallelization needs to be implemented using snow/rmpi. Are the answers
still the same?

I ask because we are about to order a laptop running Windows for a new
staff member. Some advice on the following would be helpful.
It will be ordered with Vista, with a free upgrade to Windows 7. It will
have 8GB of memory.

A quad core CPU costs about AUD$1100 more than the fastest (Intel T9900-6M
Cache, 3.06 GHz) dual core CPU.
I'm wondering if there is value in ordering the quad core. We are looking
at a time frame of 3-4 years.

Is anyone aware of near-future plans to implement some form of
parallelization that would be more or less hidden from the normal user?

It is anticipated that analysis of Next Gen sequence data will be
important.

I've read the Windows FAQ about running R under Vista. We will probably
start with Vista. I've read some posts in R-devel indicating people are
running R under Windows 7. Is it safe to assume that R will run under
Windows 7 after it is released?

We are hoping to make use of the 8GB of memory. Am I right in assuming that
when the 64-bit version of Windows 7 is available, it will allow R users to
make good use of the 8GB of memory? Does this happen under the current
higher-end versions of 64-bit Vista?

cheers,

Keith


Keith Satterley
Bioinformatics Division
The Walter and Eliza Hall Institute of Medical Research
Parkville, Melbourne,
Victoria, Australia

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel



[[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel



--
Brian D. Ripley,  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UKFax:  +44 1865 272595

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] mboost_1.1-3 blackboost_fit (PR#13972)

2009-09-28 Thread Torsten Hothorn



On Sun, 27 Sep 2009, Uwe Ligges wrote:

Please read the FAQs about submitting bug reports. Bugs in contributed 
packages must not go to the R-bugs repository but to the corresponding 
package maintainer, CCed in this case (I have not confirmed that it is a bug).




Ivan,

  bd <- party:::ctreedpp(y ~ ., data = dt, weights = NULL)

will do the job, though not very nicely. mboost version 2.0-0 
(available from R-forge) can do better.


Best wishes,

Torsten




Best,
Uwe Ligges




bulls...@mail.ru wrote:

Full_Name: Ivan the Terrible
Version: 2.9.2
OS: Windows XP SP3
Submission from: (NULL) (89.110.13.151)


When using the function blackboost_fit from the package mboost, the
following error appears:

Error in party:::get_variables(o...@responses) :
  trying to get slot "responses" from an object (class "boost_data") that is
  not an S4 object

Simple test case that produces the bug:

dt <- expand.grid(y = c(2, 3, 4), x1 = c(1, 2), x2 = c(1, 2))
library(mboost)
bd <- boost_dpp(y ~ ., data = dt, weights = NULL)
blackboost_fit(bd,
               tree_controls = ctree_control(teststat = "max",
                                             testtype = "Teststatistic",
                                             mincriterion = 0,
                                             maxdepth = 2),
               fitmem = ctree_memory(bd, TRUE),
               family = GaussReg(),
               control = boost_control(mstop = 2),
               weights = NULL)


Test case session on my computer:


dt=expand.grid(y=c(2,3,4), x1=c(1,2), x2=c(1,2))
library(mboost)

Loading required package: modeltools
Loading required package: stats4
Loading required package: party
Loading required package: survival
Loading required package: splines
Loading required package: grid
Loading required package: coin
Loading required package: mvtnorm
Loading required package: zoo

Attaching package: 'zoo'


The following object(s) are masked from package:base :

 as.Date.numeric 
Loading required package: sandwich

Loading required package: strucchange
Loading required package: vcd
Loading required package: MASS
Loading required package: colorspace

> bd <- boost_dpp(y ~ ., data = dt, weights = NULL)
> blackboost_fit(bd,
+     tree_controls = ctree_control(teststat = "max",
+         testtype = "Teststatistic",
+         mincriterion = 0,
+         maxdepth = 2),
+     fitmem = ctree_memory(bd, TRUE),
+     family = GaussReg(),
+     control = boost_control(mstop = 2),
+     weights = NULL)
Error in party:::get_variables(o...@responses) :
  trying to get slot "responses" from an object (class "boost_data") that is
  not an S4 object 

sessionInfo()
R version 2.9.2 (2009-08-24) i386-pc-mingw32 
locale:

LC_COLLATE=Russian_Russia.1251;LC_CTYPE=Russian_Russia.1251;LC_MONETARY=Russian_Russia.1251;LC_NUMERIC=C;LC_TIME=Russian_Russia.1251

attached base packages:
[1] grid  splines   stats graphics  grDevices utils datasets 
methods   base 
other attached packages:
 [1] mboost_1.1-3  party_0.9-999 vcd_1.2-4 colorspace_1.0-1 
MASS_7.2-48   strucchange_1.3-7
 [7] sandwich_2.2-1zoo_1.5-8 coin_1.0-6mvtnorm_0.9-7 
survival_2.35-4   modeltools_0.2-16


loaded via a namespace (and not attached):
[1] lattice_0.17-25 stats4_2.9.2 
__

R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel





__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Starting values in “arima.sim” function

2009-09-28 Thread Lina Rusyte
Hello, 
  
Could someone please tell me how I can find out which starting values R has 
used for the simulation? 
  
I have the following AR(2) model: 
  
y(t) = 0.2*y(t-1) + 0.2*y(t-2) + e(t)   
  
(e(t) follows a standard normal distribution) 
  
I need the value of y(0) (i.e. y(t-1) when t = 1) for my subsequent 
calculations (it is a very important parameter). 
Should I assume that y(0) = mean(y(t)) or set y(0) = 0? 
  
How can I find out which values R uses for y(0), y(-1) and so on? 
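My reading of ?arima.sim and its source (please verify against your R version) is that the AR recursion is started at zero via filter(..., method = "recursive") and that a burn-in of n.start values (derived from the AR roots when n.start = NA) is generated and then discarded, so there is no single y(0) to recover after the fact. You can, however, make the start-up explicit yourself:

```r
set.seed(1)
# Explicit burn-in: 50 start-up values are generated from start.innov and
# discarded; the 100 values that are kept are driven by innov.
y <- arima.sim(model = list(ar = c(0.2, 0.2)), n = 100,
               n.start = 50, start.innov = rnorm(50), innov = rnorm(100))
length(y)   # 100
```

With innov and start.innov supplied, the whole path is reproducible, which makes the effective starting values irrelevant for downstream calculations.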
  
Thank you very much for the answer in advance! 
  
Best regards, 
Lina


  
[[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Windows Laptop specification query

2009-09-28 Thread Corrado
1) Would a package compiled for 32-bit Windows run on 64-bit Windows and use 
the extended memory or not?

2) As for buying a performant laptop for R: I bought a Dell M6300 mobile 
workstation, which is actually portable, and installed Kubuntu 9.04 64-bit 
alongside the standard Windows installation. When I run R I only use it in 
Linux and access the data in Windows through the file system. If I need to run 
Office because someone sends me a document to correct, I have Windows XP Pro 
SP3 installed in a virtual machine using VirtualBox, which runs fairly well on 
the M6300, and I can switch it on and off whenever I need (booting the virtual 
machine is a matter of a few seconds). This setup allows running 64-bit R on 
Linux (optionally compiled with -O3 -march=native, by the way, if you feel 
like experimenting), which performs better and uses the memory more 
efficiently, without losing the ability to interact with your Windows-based 
colleagues. The virtual machine can go full screen at the click of a mouse :D 
and it looks as if you were using a native Windows machine. You can install 
all software and network clients on the virtual machine. I have not booted 
Windows natively for ages ... I have been using this machine for the last 18 
months. The dual core works great (I chose the top processor to run 
simulations when I am not in the office), and in Linux you can control the 
CPU frequency. The successor to the M6300 is the M6400, and I would probably 
go for that (Linux supported):

http://www1.euro.dell.com/uk/en/business/Laptops/workstation-precision-m6400-
cov/pd.aspx?refid=workstation-precision-m6400-cov&s=bsd&cs=ukbsdt1


PS: I apologise for the question on memory management; I have never used R 
on Windows, but some free spirit decided to release a package only for Windows 
and only precompiled (no sources), and I need to use it for a comparison ... 
(Sorry for the harsh comment and the rant, but I am not sure it is really fair 
to use open source packages and programming languages for your daily work and 
make money out of it, and then the first time you release something, you 
release it crappy and closed source ... even if it is legal and allowed, of 
course :( )

On Monday 28 September 2009 09:16:23 Prof Brian Ripley wrote:
> On Mon, 28 Sep 2009, Sean O'Riordain wrote:
> > Good morning Keith,
> >
> > Have a look at
> > http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-
> >a-limit-on-the-memory-it-uses_0021
> >
> > The short answer is that "it depends"...
> > a) memory is limited under windows
>
> Yes, but 64-bit builds can be used on Windows -- that needs commercial
> compilers and there are commercial vendors of such builds.
>
> Even with the CRAN binary, a 64-bit version of Windows offers double
> the memory over a (vanilla) 32-bit version.
>
> > b) R is essentially a serial program - HOWEVER it depends what you're
> > actually doing - if you're working with large matrices then there are
> > parallel versions of BLAS that can be used...  On a multi-core windows
> > machine with lots of memory you can of course run up multiple copies of R
> > and run each independently
>
> There are several packages that parallelize their computations with
> MPI etc, and others that help with parallelization (papply, foreach,
> gputools, ).  And apart from Rmpi/rpvm/snow there is also
> 'multicore', but not on Windows.  See the R-sig-hpc list for follow up
> on such issues.
>
> As for Vista vs Windows 7, this is not the right list but Windows 7
> behaves just like a version of Vista as far as we have explored it
> (and the current rw-FAQ includes it and Server 2008 in the Vista
> section).
>
> Many of us have bought dual quad-core servers in the last year or so:
> that includes Uwe Ligges' winbuilder machine.  I suspect most of the
> usage is separate R jobs running simultaneously: certainly that is the
> case in my dept (where there are at least 6 8-core servers running R
> jobs).
>
> > Kind regards,
> > Sean
> >
> > On Mon, Sep 28, 2009 at 4:40 AM, Keith Satterley  
wrote:
> >> I've read some postings back in 2002/2006 about running R on multiple
> >> core CPUs. The answer was basically separate processes work fine, but
> >> parallelization needs to be implemented using snow/rmpi. Are the answers
> >> still the same?
> >>
> >> I ask because we are about to order a laptop running Windows for a new
> >> staff member. Some advice on the following would be helpful.
> >> It will be ordered with Vista, with a free upgrade to Windows 7. It will
> >> have 8GB of memory
> >>
> >> A quad core CPU costs about AUD$1100 more than the fastest (Intel
> >> T9900-6M Cache, 3.06 GHz) dual core CPU.
> >> I'm wondering if there is value in ordering the quad core. We are
> >> looking at a time frame of 3-4 years.
> >>
> >> Is anyone aware of near future plans to implement some form or
> >> parallelization that would more or less be hidden from the normal user?
> >>
> >> It is anticipat

Re: [Rd] Windows Laptop specification query

2009-09-28 Thread Prof Brian Ripley

The answer to (1) is in the rw-FAQ, so see

library(fortunes)
fortune('WTFM')

On Mon, 28 Sep 2009, Corrado wrote:


1) Would a package compiled for Windows 32bit run on Windows 64 bit and use
the extended memory or not?

2) As for buying a performant laptop for R, I bought a Dell M6300 mobile
workstation which is actually portable, and installed Kubuntu 904 64 bit
alongside the standard windows installation. When I run R I only use it in
Linux and access the data in Windows through the file system. If I need to run
Office because some one else is sending me document to correct, I have installed
Windows XP Pro SP3 in a virtual machine using Virtual Box, which runs very
fairly on the M6300, and can switch it on and off whenever I need (booting on
the virtual machine is matter of few seconds). This setup allows for running
64 bit R on Linux (eventually compiled with -O3 -march=native by the way, if
you feel like experimenting) which is more performant and used the memory more
efficiently, without loosing the interacting with your windows based colleagues.
The virtual machine can go full screen at the click of a mouse :D and it looks
as if you were using a native Windows machine. You can install all software
and network clients on the virtual machine. I have not booted Windows for ages
 I have been using this machine fort he last 18 months. The dual core
works great (I chose the top processor to run simulations when I am not in the
office), and in Linux you can control the CPU frequency. The new one which
substitutes the M6300 is the M6400 and I would go for that possibly (Linux
supported):

http://www1.euro.dell.com/uk/en/business/Laptops/workstation-precision-m6400-
cov/pd.aspx?refid=workstation-precision-m6400-cov&s=bsd&cs=ukbsdt1


PS: I apologise for the question on memory management but I have never used R
on Windows but some free spirit decided to release a package only for Windows
and only pre compiled (no sources) and I need to use it to compare  (Sorry
for the harsh comment and the rant , but I am not sure it is really fair to
use an open source packages and programming languages for you daily work and
make money out of it, and the first time you release something you release it
crappy and closed source  even if it is legal and allowed of course  :
( )

On Monday 28 September 2009 09:16:23 Prof Brian Ripley wrote:

On Mon, 28 Sep 2009, Sean O'Riordain wrote:

Good morning Keith,

Have a look at
http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-
a-limit-on-the-memory-it-uses_0021

The short answer is that "it depends"...
a) memory is limited under windows


Yes, but 64-bit builds can be used on Windows -- that needs commercial
compilers and there are commercial vendors of such builds.

Even with the CRAN binary, a 64-bit version of Windows offers double
the memory over a (vanilla) 32-bit version.


b) R is essentially a serial program - HOWEVER it depends what you're
actually doing - if you're working with large matrices then there are
parallel versions of BLAS that can be used...  On a multi-core windows
machine with lots of memory you can of course run up multiple copies of R
and run each independently


There are several packages that parallelize their computations with
MPI etc, and others that help with parallelization (papply, foreach,
gputools, ...).  And apart from Rmpi/rpvm/snow there is also
'multicore', but not on Windows.  See the R-sig-hpc list for follow up
on such issues.

As for Vista vs Windows 7, this is not the right list but Windows 7
behaves just like a version of Vista as far as we have explored it
(and the current rw-FAQ includes it and Server 2008 in the Vista
section).

Many of us have bought dual quad-core servers in the last year or so:
that includes Uwe Ligges' winbuilder machine.  I suspect most of the
usage is separate R jobs running simultaneously: certainly that is the
case in my dept (where there are at least 6 8-core servers running R
jobs).


Kind regards,
Sean

On Mon, Sep 28, 2009 at 4:40 AM, Keith Satterley 

wrote:

I've read some postings back in 2002/2006 about running R on multiple
core CPUs. The answer was basically separate processes work fine, but
parallelization needs to be implemented using snow/rmpi. Are the answers
still the same?

I ask because we are about to order a laptop running Windows for a new
staff member. Some advice on the following would be helpful.
It will be ordered with Vista, with a free upgrade to Windows 7. It will
have 8GB of memory

A quad core CPU costs about AUD$1100 more than the fastest (Intel
T9900-6M Cache, 3.06 GHz) dual core CPU.
I'm wondering if there is value in ordering the quad core. We are
looking at a time frame of 3-4 years.

Is anyone aware of near future plans to implement some form or
parallelization that would more or less be hidden from the normal user?

It is anticipated that analysis of Next Gen sequence data will be
important.

I've read the Windows FAQ about running

Re: [Rd] Windows Laptop specification query

2009-09-28 Thread Corrado
On Monday 28 September 2009 11:01:42 Prof Brian Ripley wrote:
> The answer to (1) is in the rw-FAQ, so see

Can you point out exactly where? I did not find it.

-- 
Corrado Topi

Global Climate Change & Biodiversity Indicators
Area 18,Department of Biology
University of York, York, YO10 5YW, UK
Phone: + 44 (0) 1904 328645, E-mail: ct...@york.ac.uk

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] bug or feature in read.table?

2009-09-28 Thread Jens Oehlschlägel
Hi,

I guess that the followig line in read.table 
tmp[i[i > 0L]] <- colClasses
should read
tmp[i[i > 0L]] <- colClasses[i > 0L]

Is this a bug?
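The suggested fix looks right to me. A constructed illustration (these are stand-in objects, not read.table's actual internals): when some entries of i are 0 (no match), assigning the full colClasses vector shifts the remaining classes onto the wrong columns:

```r
tmp <- rep(NA_character_, 5)
colClasses <- c("numeric", "character", "logical")
i <- c(2L, 0L, 4L)          # the second class matches no column

bad <- tmp
bad[i[i > 0L]] <- colClasses            # 3 values into 2 slots: warning,
bad[4]                                  # and column 4 gets "character"

good <- tmp
good[i[i > 0L]] <- colClasses[i > 0L]   # subsetting keeps classes aligned
good[4]                                 # "logical", as intended
```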

Cheers
Jens Oehlschlägel

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] lattice's mai/mar analog

2009-09-28 Thread Michael Ramati
hello,
is there a way to control figure margins using package lattice, similarly to 
the parameters mai/mar (which presumably work only for figures from package 
graphics)?
thanks!
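There is no exact mai/mar counterpart, but lattice controls the space around the figure region through its "layout.heights"/"layout.widths" settings (see ?xyplot, argument par.settings, and ?trellis.par.get). A sketch that shrinks all four margins of a single plot:

```r
library(lattice)  # shipped with R as a recommended package
p <- xyplot(y ~ x, data = data.frame(x = 1:10, y = (1:10)^2),
            par.settings = list(
              layout.heights = list(top.padding = 0, bottom.padding = 0),
              layout.widths  = list(left.padding = 0, right.padding = 0)))
print(p)  # lattice objects only draw when printed
```

Setting these via trellis.par.set() instead changes the defaults for every subsequent plot in the session.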

[[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] build time dependency

2009-09-28 Thread Uwe Ligges



Romain Francois wrote:

On 09/27/2009 08:27 PM, Uwe Ligges wrote:




Romain Francois wrote:

Hello,

Is there such thing as a build time dependency between packages :
package B needs package A so that it can build, but once built it
lives without it.



Do you mean at a) "R CMD build" time or at b) "R CMD INSTALL" time?


I meant R CMD INSTALL (but probably also R CMD build --binary), I need 
to check the code to see where this overlaps.


Are you sure you *really* need those dependencies at INSTALL rather than 
build time? I haven't looked closely into roxygen so far, but if it is 
designed to work at INSTALL time, then from my point of view the task is 
to move this to the build time of the package.


Since the roxygen author, Manuel Eugster, is in Dortmund today, I will 
discuss this point.


Best,
Uwe Ligges








For a), you probably do not need to declare it in DESCRIPTION at all
(untested).
For b), you need to declare it. I feel uncomfortable to change the
Depends field between source and binary version of a package. At least,
it is not documented to work and if it works (have you tested that?), it
might not work in some future release of R.
But since you gave at least 2 reasonable examples for a
INSTALL-time-only dependency, you might want to contribute some patches
to handle such a thing.


Sure. I'll have a look to add a "InstallDependencies" (or whatever other 
spelling) field along the lines of Depends, but for INSTALL time.


Romain



Best wishes,
Uwe



For example depending on the ant package to compile java code, or
depend on roxygen to roxygenize the code, ...

Adding roxygen to Depends works, but then the installed package does
not depend on it and therefore loads it for nothing. Maybe I can
remove roxygen from the Depends as part of the ./configure[.win] ...

Romain





__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] build time dependency

2009-09-28 Thread Romain Francois

On 09/28/2009 04:39 PM, Uwe Ligges wrote:

Romain Francois wrote:

On 09/27/2009 08:27 PM, Uwe Ligges wrote:


Romain Francois wrote:

Hello,

Is there such thing as a build time dependency between packages :
package B needs package A so that it can build, but once built it
lives without it.



Do you mean at a) "R CMD build" time or at b) "R CMD INSTALL" time?


I meant R CMD INSTALL (but probably also R CMD build --binary), I need
to check the code to see where this overlaps.


Are you sure you *really* need those dependencies at INSTALL rather than
build time? I haven't looked closely into roxygen so far, but if it is
designed to work at INSTALL time, the task is to move this to the build
time of the package from my point of view.

Since the roxygen author, Manuel Eugster, is in Dortmund today, I will
discuss this point.


Hi Uwe,

I think you are supposed to do this kind of sequence:

R CMD roxygen yourRoxygenablePackage
R CMD build yourRoxygenablePackage_roxygen

... but I don't like this because what you upload to CRAN is not the 
actual source but something already pre-processed. (This also applies to 
packages shipping Java code; most people just compile the Java code on 
their machine and only supply a jar of compiled code, but that's another 
story, I suppose ...)


I'd prefer the roxygenation to be part of the standard build/INSTALL 
system, so my plan is to write configure and configure.win scripts which 
would call roxygenize to generate the Rd files.
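Such a configure might be as small as this (an untested sketch; it assumes roxygen is installed on the installing machine and that roxygenize() takes the package directory as its first argument, both of which need verifying):

```sh
#!/bin/sh
## Untested sketch: regenerate the Rd files from roxygen comments at
## INSTALL time, and fail the installation if roxygen is unavailable.
"${R_HOME}/bin/Rscript" -e 'library(roxygen); roxygenize(package.dir = ".")' || exit 1
```

The drawback discussed above remains: the package would then need roxygen declared as a dependency that is only exercised at INSTALL time.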


I am not aware of any hook similar to configure[.win] you can use so 
that "R CMD build" does something extra. I probably just missed it.


Romain



Best,
Uwe Ligges








For a), you probably do not need to declare it in DESCRIPTION at all
(untested).
For b), you need to declare it. I feel uncomfortable to change the
Depends field between source and binary version of a package. At least,
it is not documented to work and if it works (have you tested that?), it
might not work in some future release of R.
But since you gave at least 2 reasonable examples for a
INSTALL-time-only dependency, you might want to contribute some patches
to handle such a thing.


Sure. I'll have a look to add a "InstallDependencies" (or whatever
other spelling) field along the lines of Depends, but for INSTALL time.

Romain



Best wishes,
Uwe



For example depending on the ant package to compile java code, or
depend on roxygen to roxygenize the code, ...

Adding roxygen to Depends works, but then the installed package does
not depend on it and therefore loads it for nothing. Maybe I can
remove roxygen from the Depends as part of the ./configure[.win] ...

Romain









--
Romain Francois
Professional R Enthusiast
+33(0) 6 28 91 30 30
http://romainfrancois.blog.free.fr
|- http://tr.im/ztCu : RGG #158:161: examples of package IDPmisc
|- http://tr.im/yw8E : New R package : sos
`- http://tr.im/y8y0 : search the graph gallery from R

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Downloading R-2.10.0

2009-09-28 Thread Gabor Grothendieck
For Windows, this page

http://cran.r-project.org/bin/windows/base/

gives a link to download

- R 2.9.2
- r-patched (R 2.9.2 patched)
- old releases and
- r-devel (R 2.11.0)

but there is no obvious link to R 2.10.0.  From where do we download that?

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] documentation cross references under R 2.10.0dev for Windows

2009-09-28 Thread Gordon K Smyth
With one exception, all warnings go away when I download the relevant 
Bioconductor packages as source code and re-build them (Rcmd INSTALL 
--build) on my own machine.


The warnings re-appear if I install the Bioconductor packages in the 
normal way using biocLite("Biobase") etc.  I will follow this up with the 
Bioconductor people.


The one exception is the self-reference to limma:00Index.  This is marked 
as a missing link, under Windows only, although it works fine.


Gordon

On Mon, 28 Sep 2009, Gordon K Smyth wrote:

Rcmd check under R 2.10.0dev for Windows seems to be issuing a number of 
spurious warning messages about Rd cross-references.


The following warning messages appear when checking the latest (non-public) 
version of the Bioconductor package limma.  They appear only under Windows, 
not Unix or Mac.  All the flagged links appear to be ok, in that they 
specify a genuine html file, and should therefore not be marked as suspect 
or missing.


Regards
Gordon

* using R version 2.10.0 Under development (unstable) (2009-09-27 r49846)
* using session charset: ISO8859-1
* checking Rd cross-references ... WARNING
Missing link(s) in documentation object './man/01Introduction.Rd':
 '[limma:00Index]{LIMMA contents page}'

Suspect link(s) in documentation object './man/asmalist.Rd':
 '[marray:marrayNorm-class]{marrayNorm}'

Suspect link(s) in documentation object './man/asmatrix.Rd':
 '[Biobase]{exprs}'

Suspect link(s) in documentation object './man/dupcor.Rd':
 '[statmod]{mixedModel2Fit}'

Suspect link(s) in documentation object './man/EList.Rd':
 '[Biobase]{ExpressionSet-class}'

Suspect link(s) in documentation object './man/imageplot.Rd':
 '[marray]{maImage}'

Suspect link(s) in documentation object './man/intraspotCorrelation.Rd':
 '[statmod]{remlscore}'

Suspect link(s) in documentation object './man/limmaUsersGuide.Rd':
 '[Biobase]{openPDF}' '[Biobase]{openVignette}' '[base]{Sys.putenv}'

Suspect link(s) in documentation object './man/malist.Rd':
 '[marray:marrayNorm-class]{marrayNorm}'

Suspect link(s) in documentation object './man/normalizebetweenarrays.Rd':
 '[marray:maNormScale]{maNormScale}' '[affy:normalize]{normalize}'

Suspect link(s) in documentation object './man/normalizeWithinArrays.Rd':
 '[marray:maNorm]{maNorm}'

Suspect link(s) in documentation object './man/normexpfit.Rd':
 '[affy:bg.adjust]{bg.parameters}'

Suspect link(s) in documentation object './man/readgal.Rd':
 '[marray:read.Galfile]{read.Galfile}'

Suspect link(s) in documentation object './man/rglist.Rd':
 '[marray:marrayRaw-class]{marrayRaw}'



On Wed, 23 Sep 2009, Duncan Murdoch wrote:


On 23/09/2009 10:08 PM, Henrik Bengtsson wrote:

Hi,

in 'Writing R Extensions" of R v2.10.0, under Section
'Cross-references' (2009-09-07) it says:

1. "The markup \link{foo} (usually in the combination
\code{\link{foo}}) produces a hyperlink to the help for foo. Here foo
is a topic, that is the argument of \alias markup in another Rd file
(possibly in another package)."

2. "You can specify a link to a different topic than its name by
\link[=dest]{name} which links to topic dest with name name. This can
be used to refer to the documentation for S3/4 classes, for example
\code{"\link[=abc-class]{abc}"} would be a way to refer to the
documentation of an S4 class "abc" defined in your package, and
\code{"\link[=terms.object]{terms}"} to the S3 "terms" class (in
package stats). To make these easy to read, \code{"\linkS4class{abc}"}
expands to the form given above."

3. "There are two other forms of optional argument specified as
\link[pkg]{foo} and \link[pkg:bar]{foo} to link to the package pkg, to
files foo.html and bar.html respectively. These are rarely needed,
perhaps to refer to not-yet-installed packages (but there the HTML
help system will resolve the link at run time) or in the normally
undesirable event that more than one package offers help on a topic
(in which case the present package has precedence so this is only
needed to refer to other packages). They are only used in HTML help
(and ignored for hyperlinks in LaTeX conversions of help pages), and
link to the file rather than the topic (since there is no way to know
which topics are in which files in an uninstalled package). The *only*
reason to use these forms for base and recommended packages is to
force a reference to a package that might be further down the search
path. Because they have been frequently misused, as from R 2.10.0 the
HTML help system will look for topic foo in package pkg if it does not
find file foo.html."


Trying to summarize the above, do we have the following markups/rules?

A. \link{<topic>} - where <topic> must occur as an \alias{<topic>},
but not necessarily as a \name{<topic>}.  The link will be displayed as
the string <topic>.
B. \link[=<dest>]{<name>} - where <dest> must occur as an
\alias{<dest>} with a \name{<name>}.  The link will be displayed as the
string <name>.
C. \link[<pkg>]{<file>} - where <file> must be a \name{<file>}
in a package named <pkg>.  The link will be displayed as the
string <file>.
D. \link[<pkg>:<file>]{<name>} - where <file> must be a
\name{<file>} in a package named <pkg>.  The link will be displayed as
the string <name>.
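
Putting rules 1-3 together, the four forms might look like this in a
hypothetical \seealso section (the file and topic names are only
illustrative; the marray topics are borrowed from the warning list above):

```
\seealso{
  % form 1: topic "bar" in the current package, shown as "bar"
  \code{\link{bar}},
  % form 2: topic "abc-class", shown as "abc" (same as \linkS4class{abc})
  \code{\link[=abc-class]{abc}},
  % form 3a: file maImage.html in package marray, shown as "maImage"
  \code{\link[marray]{maImage}},
  % form 3b: file maNorm.html in package marray, shown as "maNorm"
  \code{\link[marray:maNorm]{maNorm}}
}
```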

Re: [Rd] build time dependency

2009-09-28 Thread Seth Falcon
On Mon, Sep 28, 2009 at 11:25 AM, Romain Francois
 wrote:
> Hi Uwe,
>
> I think you are supposed to do this kind of sequence:
>
> R CMD roxygen yourRoxygenablePackage
> R CMD build yourRoxygenablePackage_roxygen
>
> ... but I don't like this because what you upload to CRAN is not the actual
> source but something already pre-processed. (This also applies to packages
> shipping java code, most people just compile the java code on their machine
> and only supply a jar of compiled code, but that's another story I suppose
> ...)
>
> I'd prefer the roxygenation to be part of the standard build/INSTALL system,
> so my plan is to write configure and configure.win which would call
> roxygenize to generate Rd.
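
Such a configure step might look roughly like the sketch below. This is a
hypothetical fragment, not Romain's actual script: the package name roxygen2
(the present-day successor of roxygen) and the fallback messages are
assumptions chosen only to illustrate the idea of roxygenizing when possible
and otherwise keeping the pre-built Rd files shipped in the tarball.

```shell
#!/bin/sh
# Hypothetical configure sketch: regenerate Rd files with roxygen2 when it
# is installed, otherwise fall back to the man/ pages shipped in the tarball.
if Rscript -e 'quit(status = as.integer(!requireNamespace("roxygen2", quietly = TRUE)))' \
     >/dev/null 2>&1
then
  Rscript -e 'roxygen2::roxygenize(".")' ||
    echo "configure: roxygenation failed; keeping shipped Rd files"
else
  echo "configure: roxygen2 not available; keeping shipped Rd files"
fi
DOC_STEP_DONE=yes
echo "configure: documentation step finished"
```

Every branch falls through to the final message, so a missing roxygen never
aborts installation - which is exactly the install-time-only dependency
behaviour being discussed.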

I can appreciate the desire to make the "true" sources available.  At
the same time, I think one should very carefully consider the expense
of external dependencies on a package.

One could view doc generation along the same lines as configure script
generation -- a compilation step that can be done once instead of by
all those who install and as a result reduce the dependency burden of
those wanting to install the package.  Configure scripts are almost
universally included pre-built in distribution source packages so that
users do not need to have the right version of autoconf/automake.

In other words, are you sure you want to require folks to install
roxygen (or whatever) in order to install your package? Making it easy
to do so is great, but in general if you can find a way to reduce
dependencies and have your package work, that is better. :-)

+ seth

-- 
Seth Falcon | @sfalcon | http://userprimary.net/user

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel