Re: [Rd] RPM support for package installation?

2007-02-07 Thread Martin Maechler
> "Rhiannon" == Rhiannon L Weaver <[EMAIL PROTECTED]>
> on Tue, 6 Feb 2007 14:35:31 -0500 (EST) writes:

Rhiannon> Hi,

Rhiannon> Thanks for the clarification.  As long as the
Rhiannon> admins don't mind (which I guess they won't
Rhiannon> because it means they won't have to build RPMs or
Rhiannon> binaries), I will be okay with just using local
Rhiannon> versions of the libraries.  

You will be using local versions of the  ** packages **
by installing them into your own library.  Try to be careful not
to confuse the two terms.
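
Installing into a personal library is straightforward; a minimal sketch (the directory ~/R/library is purely illustrative, any writable directory works, and the R_LIBS environment variable can point R at it permanently):

```r
## Create a personal library and put it first on the search path.
dir.create("~/R/library", recursive = TRUE, showWarnings = FALSE)
.libPaths("~/R/library")

## Packages now install into, and update from, your own library,
## without touching the site-wide installation.
install.packages("Matrix", lib = "~/R/library")
update.packages(lib.loc = "~/R/library", ask = FALSE)
```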

Rhiannon> versions of the libraries.  I just wanted to make
Rhiannon> sure I wasn't missing something obvious (which is
Rhiannon> probably pretty likely in situations like this).
Rhiannon> Thanks again for your help.

Rhiannon> -Rhiannon


Rhiannon> On Tue, 6 Feb 2007, Prof Brian Ripley wrote:

>> The problem is the speed with which R packages change.
>> My dept considered this, and decided against.  There have
>> been something like 200 new versions of CRAN packages
>> already this year.
>> 
>> Even if we provided automated wrappers to make source
>> RPMs, someone would still have to build the binary RPMs
>> for your (unstated) architecture and then install it.
>> Unless you use very few packages, no sysadmin is going
>> to be happy with this approach.
>> 
>> It really is quite easy to have your own library and
>> install packages there, and it will become easier in
>> 2.5.0.  Your 'workaround' is the preferred solution for
>> many sites including ours, although for our most popular
>> architectures we also run a central site-library of
>> popular packages (e.g.  those used for teaching here).
>> 
>> 
>> On Tue, 6 Feb 2007, Rhiannon L Weaver wrote:
>> 
>>> Hello,
>>> 
>>> Tech question, I hope this has not been addressed
>>> before.  I searched help archives and looked for online
>>> help but came up empty-handed.
>>> 
>>> My question is (short version): Is there an RPM-supported
>>> version of update.packages() for use with updating
>>> package libraries on managed multi-user Linux networks?
>>> 
>>> Details:
>>> 
>>> I put in a request for updating the version of R on one
>>> of the hosts on my work Unix network, which is managed
>>> by our IT department.  Current version is 2.1.0; I asked
>>> them to update to 2.4.1. The core update installed and I
>>> was able to test it, but the update had trouble loading
>>> the package "Matrix" for use with "lme4".  I don't
>>> recall the specific error (will check it out when the
>>> new version gets re-installed again and I can document
>>> it).  Other packages (lme, wavethresh, MASS) seemed to
>>> load without problems.
>>> 
>>> I think the Matrix problem can be solved by running
>>> update.packages() but when I requested the admin to
>>> update packages for the new version, they said that they
>>> need to do this via an RPM.  Specifically (and I'm not a
>>> network guru so my advice may not be entirely accurate):
>>> 
>>> me: I think if you have admin access you should be able
>>> to update the R packages by using the command
>>> update.packages() from within a running, updated version
>>> of R, and it will automatically check packages for new
>>> versions and update them.
>>> 
>>> admin: But this method moves us to an unsustainable host
>>> with locally installed packages.  The add-on packages
>>> need to be installed via an RPM.
>>> 
>>> As I understand it, RPM is like a kind of makefile for
>>> Linux machines.  The help mentions the need for -devel or
>>> -dev files for RPM installations and updates of the core
>>> software; is there a similar avenue I can point my admin
>>> to for package updates?  I'm not afraid of a little
>>> Linux, but I fear I am a bit out of my element on this
>>> one.
>>> 
>>> Currently the workaround is for them to install the new
>>> version and for me to download and maintain packages
>>> locally.
>>> 
>>> Thanks very much for your time, -Rhiannon
>>> 
>>> __
>>> R-devel@r-project.org mailing list
>>> https://stat.ethz.ch/mailman/listinfo/r-devel
>>> 
>> 
>> -- 
>> Brian D. Ripley,  [EMAIL PROTECTED]
>> Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
>> University of Oxford,     Tel:  +44 1865 272861 (self)
>> 1 South Parks Road,             +44 1865 272866 (PA)
>> Oxford OX1 3TG, UK        Fax:  +44 1865 272595
>> 



[Rd] what's the C code for La.svd function?

2007-02-07 Thread Lei Wu
Hi R gurus,

If I want to bring back the old La.svd from R-2.3.0, which C files do I need to
bring back? The current La.svd() runs much faster, but it is not robust enough: it
failed on some datasets. I would like to bring back the old one even if it's
slower.

Thank you very much for any hint.

Lei



Re: [Rd] RPM support for package installation?

2007-02-07 Thread Martin Maechler
> "Jari" == Jari Oksanen <[EMAIL PROTECTED]>
> on Wed, 07 Feb 2007 16:28:14 +0200 writes:

Jari> On Wed, 2007-02-07 at 14:57 +0100, Martin Maechler wrote:
>> > "Rhiannon" == Rhiannon L Weaver <[EMAIL PROTECTED]>
>> > on Tue, 6 Feb 2007 14:35:31 -0500 (EST) writes:
>> 
Rhiannon> Hi,
>> 
Rhiannon> Thanks for the clarification.  As long as the
Rhiannon> admins don't mind (which I guess they won't
Rhiannon> because it means they won't have to build RPMs or
Rhiannon> binaries), I will be okay with just using local
Rhiannon> versions of the libraries.  
>> 
>> You will be using local versions of the  ** packages **
>> by installing them into your own library.  Try to be careful not
>> to confuse the two terms.

Jari> This is what Wikipedia says:

Jari> "R is also highly extensible through the use of packages, which are
Jari> user-submitted libraries..."
Jari> (http://en.wikipedia.org/wiki/R_%28programming_language%29)

Jari> Time to correct Wikipedia?

Yes, please!  I don't have time currently...
Martin



[Rd] manage R function and data dependencies like 'make'

2007-02-07 Thread Lutz Prechelt
Dear R-devels,

I am looking for a package (or some other infrastructure) for the
following situation:

I am doing a larger data evaluation.
I have several dozen data files and do not want to keep the persistent
data in the R workspace (for robustness reasons).
I have several dozen R files, some for reading and preprocessing data
files, others for doing plots or analyses.
I will make frequent changes to both data files and R files while doing
the analysis.

I would like to automate mechanisms that allow 
- a data file reading function to suppress its actual work if neither
the data file nor the R file containing the function were modified since
the data file was last read
- an R file sourcing function to suppress its actual work if the R file
has not been modified
- and perhaps even: automate re-reading a data file upon access to the
corresponding dataframe iff the file has been modified since the
dataframe was created.

In short: Something like Unix's 'make', but for managing dependencies of
functions and dataframes in addition to files. In R. (And of course I am
very open for solutions that are more elegant than what I have sketched
above.)
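
The first of these mechanisms can be sketched in a few lines of R (the helper name read_if_stale and the on-disk RDS cache are purely illustrative, not an existing package):

```r
## Sketch: re-read a data file only when it, or the R file defining the
## reader, is newer than a cached copy of the result on disk.
read_if_stale <- function(data.file, r.file, cache.file, read.fun) {
  mtime <- function(f) file.info(f)$mtime
  if (file.exists(cache.file) &&
      mtime(cache.file) > max(mtime(data.file), mtime(r.file))) {
    readRDS(cache.file)        # cache is newer than both inputs: skip the work
  } else {
    x <- read.fun(data.file)   # otherwise re-read and refresh the cache
    saveRDS(x, cache.file)
    x
  }
}
```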

I could not find something in the help and have rather few ideas for
good search terms.

Is any such thing available?
(If no such infrastructure exists, what is the right R function for
accessing file modification dates?)

Thanks!

  Lutz



Re: [Rd] manage R function and data dependencies like 'make'

2007-02-07 Thread Prof Brian Ripley
R-devel has file_test() in utils (earlier versions had a private version 
in tools).  That has a '-nt' op to do what you need.

file.info() accesses modification dates.

Having said that, I would use 'make' as for example R's own test suites 
do.
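
A minimal sketch of both pieces (the file names here are illustrative):

```r
## utils::file_test with the "-nt" ("newer than") operator compares
## the modification times of two files:
if (file_test("-nt", "data.csv", "data.cache") ||
    file_test("-nt", "read_data.R", "data.cache")) {
  source("read_data.R")   # an input is newer than the cache: redo the work
}

## file.info() exposes the modification time directly:
file.info("data.csv")$mtime
```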

On Wed, 7 Feb 2007, Lutz Prechelt wrote:

> Dear R-devels,
>
> I am looking for a package (or some other infrastructure) for the
> following situation:
>
> I am doing a larger data evaluation.
> I have several dozen data files and do not want to keep the persistent
> data in the R workspace (for robustness reasons).
> I have several dozen R files, some for reading and preprocessing data
> files, others for doing plots or analyses.
> I will make frequent changes to both data files and R files while doing
> the analysis.
>
> I would like to automate mechanisms that allow
> - a data file reading function to suppress its actual work if neither
> the data file nor the R file containing the function were modified since
> the data file was last read
> - an R file sourcing function to suppress its actual work if the R file
> has not been modified
> - and perhaps even: automate re-reading a data file upon access to the
> corresponding dataframe iff the file has been modified since the
> dataframe was created.
>
> In short: Something like Unix's 'make', but for managing dependencies of
> functions and dataframes in addition to files. In R. (And of course I am
> very open for solutions that are more elegant than what I have sketched
> above.)
>
> I could not find something in the help and have rather few ideas for
> good search terms.
>
> Is any such thing available?
> (If no such infrastructure exists, what is the right R function for
> accessing file modification dates?)
>
> Thanks!
>
>  Lutz
>
>

-- 
Brian D. Ripley,  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK        Fax:  +44 1865 272595



Re: [Rd] manage R function and data dependencies like 'make'

2007-02-07 Thread Paul Gilbert
I use make to do things like you describe. Here is an example target 
from one of my Makerules files:

$(T:%=compare.R$(REP).T%): compare.%: compare.R estimates.%
	@echo making $(notdir $(PWD)) $@ because $? changed ...
	@(cd $(subst compare.,,$@) ; \
	  echo "z <- try( source('../compare.R')); \
	        if (!inherits(z, 'try-error')) q('yes', status=0) else \
	        {print(z); q('yes', status=1)} " | \
	  R --slave > ../$(FLAGS)/[EMAIL PROTECTED] 2>&1)
	@mv $(FLAGS)/[EMAIL PROTECTED] $(FLAGS)/$@


I realize this is out of context and possibly mangled by mail wrap, but 
if you are familiar with (GNU) make then it should give you a good idea 
of what to do. In this example I am accumulating (intermediate) things in 
the .RData file and using flags as targets to indicate the status of 
different steps. Another possibility would be to rename the .RData files 
and use them as the targets. Let me know if you want a more complete 
example.

I really would encourage you to think of wrapping R (like a compiler) in 
make, rather than trying to re-implement something like make within R.

(I would be interested to see examples if anyone is using Ant to do this 
kind of thing.)

Paul


Lutz Prechelt wrote:
> Dear R-devels,
> 
> I am looking for a package (or some other infrastructure) for the
> following situation:
> 
> I am doing a larger data evaluation.
> I have several dozen data files and do not want to keep the persistent
> data in the R workspace (for robustness reasons).
> I have several dozen R files, some for reading and preprocessing data
> files, others for doing plots or analyses.
> I will make frequent changes to both data files and R files while doing
> the analysis.
> 
> I would like to automate mechanisms that allow 
> - a data file reading function to suppress its actual work if neither
> the data file nor the R file containing the function were modified since
> the data file was last read
> - an R file sourcing function to suppress its actual work if the R file
> has not been modified
> - and perhaps even: automate re-reading a data file upon access to the
> corresponding dataframe iff the file has been modified since the
> dataframe was created.
> 
> In short: Something like Unix's 'make', but for managing dependencies of
> functions and dataframes in addition to files. In R. (And of course I am
> very open for solutions that are more elegant than what I have sketched
> above.)
> 
> I could not find something in the help and have rather few ideas for
> good search terms.
> 
> Is any such thing available?
> (If no such infrastructure exists, what is the right R function for
> accessing file modification dates?)
> 
> Thanks!
> 
>   Lutz
> 





[Rd] xlsReadWrite Pro and embedding objects and files in Excel worksheets

2007-02-07 Thread Mark W Kimpel
Hans-Peter and other R developers,

How are you? Have you made any progress with embedding URLs in Excel?

Well, I have been busy thinking of more things for you to do;)

My colleagues in the lab are not R literate, and some are barely 
computer literate, so I give them everything in Excel workbooks. I have 
gradually evolved a system such that these workbooks have become 
compendia of my data, output, and methods. That, in fact, is why I 
bought the Pro version of xlsReadWritePro. I have been saving graphics 
as PDF files, then inserting them as objects in Excel sheets.

What I would like to be able to do is embed objects (files) in sheets 
of a workbook directly from within R. I would also like to be able to 
save my current R workspace as an object embedded in a sheet, so that in 
the future, if packages change, I could go back and recreate the 
analysis. I do not need to be able to manipulate files that R has not 
created, like a PDF file from another user. I would, however, like to be 
able to save my graphics as PDF files inside a worksheet, even if it 
meant creating a temp file or something.

Before people begin talking about how MySQL or some other database could 
handle all that archiving, let me say that that is not what my 
colleagues want. They want a nice Excel file that they can take home on 
there laptops. One thing I like about worksheets is that they themselves 
can contain many embedded files, so it keeps our virtual desks neater 
and less confusing.

Hans, if you could do this, it would be of tremendous benefit to me and 
hopefully a lot of people. R developers tend to think that all 
scientists are running Linux on 64-bit computers, but most biomedical 
researchers still store data in Excel files. This won't solve everybody's 
needs, but it could be a start.

Well, let me know what you think. I am cc'ing R-devel to see if any of 
those guys have ideas as well.

Thanks,
Mark



-- 
Mark W. Kimpel MD
Neuroinformatics
Department of Psychiatry
Indiana University School of Medicine
