Re: [Rd] code validation (was Re: NY Times article)

2009-01-11 Thread Philippe Grosjean

Hello,

To answer Spencer Graves's questions, I would like to mention that there 
is an alternative to RUnit that could ease the writing of test units, 
regression tests and integration tests for R. It is svUnit. See 
http://www.sciviews.org/SciViews-K/index.html. It is not on CRAN yet, 
but on R-Forge, because it is still in development. You can install it with:


install.packages("svUnit", repos = "http://R-Forge.R-project.org")

It is fully compatible with RUnit test units; that is, the checkXXX() 
functions and .setUp()/.tearDown() are fully supported. Indeed, my 
first goal was to use RUnit and build a GUI on top of it... but for 
several reasons that was not possible, and I had to write my own 
checkXXX() functions (same interface, but totally different internal code).
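
For instance, here is a minimal sketch of an RUnit-style test file (say
'runitSquare.R'; the file and function names are invented for illustration)
that should run unchanged under either framework, given the compatibility
described above:

.setUp    <- function() assign("x0", 3, envir = .GlobalEnv)   # run before each test
.tearDown <- function() rm("x0", envir = .GlobalEnv)          # run after each test

testSquareFixture <- function() {
    checkEquals(9, Square(get("x0", envir = .GlobalEnv)))
    checkException(Square("not a number"))
}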


svUnit offers some original ways (at least in the R world) to define, run 
and view your test suites. Here are some of them:


1) You do not need to write a test suite file on disk to test an R 
object. Suppose you have your own function like this:


> Square <- function(x) return(x^2)

You can simply attach a test unit to this object by:

> test(Square) <- function() {
+ checkEquals(9, Square(3))
+ checkEquals(10, Square(3))  # This intentionally fails
+ checkEquals(9, SSSquare(3)) # This intentionally raises error
+ checkEquals(c(1, 4, 9), Square(1:3))
+ checkException(Square("xx"))
+ }

And you run it just as easily. First, note that test failures, 
errors and other data concerning your tests are logged globally. You 
clear the log with clearLog() and inspect it with Log(), or perhaps 
summary(Log()) if you have accumulated a lot of tests in the logger. So, to 
test your Square() function only, you do:


> clearLog()
> runTest(Square)
> Log()
= A svUnit test suite run in less than 0.1 sec with:

* test(Square) ... **ERROR**


== test(Square) run in less than 0.1 sec: **ERROR**

//Pass: 3 Fail: 1 Errors: 1//

* : checkEquals(10, Square(3)) run in 0.002 sec ... **FAILS**
Mean relative difference: 0.1
 num 9

* : checkEquals(9, SSSquare(3)) run in less than 0.001 sec ... **ERROR**
Error in mode(current) : could not find function "SSSquare"

2) You can attach test units to any kind of object, and you can even 
define stand-alone tests (integration or regression tests, for 
instance):


> test_Integrate <- svTest(function() {
+ checkTrue(1 < 2, "check1")
+ v <- 1:3  # The reference
+ w <- 1:3  # The value to compare to the reference
+ checkEquals(v, w)
+ })

then:

> runTest(test_Integrate)
> Log() # Note how test results accumulate in the logger
(output not shown)

3) In contrast to RUnit, you can even run the checkXXX() functions 
anywhere, and their test results will also accumulate in the logger 
(but then there is no context associated with the test and the title is 
just "eval run in ..."):


> checkTrue(1 < 2)
> Log()
(output not shown)

4) You have convenient functions to catalog all available test 
units/test suites (in R packages, in objects in memory, integration 
tests, ...). You can even manage exclusion lists (by default, test 
suites defined in the svXXX packages and in RUnit are excluded). So:


> (oex <- getOption("svUnit.excludeList")) # Excluded items (regexp)
[1] "package:sv"    "package:RUnit"
> # clear the exclusion list and list all available test units
> options(svUnit.excludeList = NULL)
> svSuiteList() # Note that our test functions are also listed
A svUnit test suite definition with:

- Test suites:
[1] "package:svUnit"                "package:svUnit (VirtualClass)"

- Test functions:
[1] "test(Square)"   "test_Integrate"

> # Restore previous exclusion list
> options(svUnit.excludeList = oex)

Look at ?svSuite for more explanations.

5) You can easily transform a test associated with an object into a 
RUnit-compatible test file on disk:


> unit <- makeUnit(Square)
> file.show(unit, delete.file = TRUE)

6) You can easily integrate svUnit tests into R packages and into the 
R CMD check mechanism: the full test of your units runs silently when there 
are no errors or failures, but R CMD check breaks with an extensive report 
in case of problems. Just define an .Rd page (named 'unitTests.mypkg' by 
default) and write an example section that runs the test units you want. 
Here is an example:


-
\name{unitTests}
\alias{unitTests.svUnit}

\title{ Unit tests for the package svUnit }
\description{
  Performs unit tests defined in this package by running
  \code{example(unitTests.svUnit)}. Tests are in \code{runit*.R} files
  located in the '/unitTests' subdirectory or one of its subdirectories
  ('/inst/unitTests' and subdirectories in package sources).
}

\author{Me (\email...@mysite.org})}

\examples{
library(svUnit)
# Make sure to clear log of errors and failures first
clearLog()

# Run all test units defined in the 'svUnit' package
(runTest(svSuite("package:svUnit"), "svUnit"))

\donttest{
# Tests to run with example() but not with R CMD check
# Run all te

Re: [Rd] Problem with compiling shared C/C++ library for loading into R (Linux)

2009-01-11 Thread torpedo fisken
I think you're not linking everything together when you call R CMD SHLIB; use
something like

R CMD SHLIB mylib.c yourlib1.o yourlib2.o

Otherwise, mail again with your Makefile or all the commands you are using.

good luck
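
To make that concrete, here is a hedged sketch, driven from R, of the same
idea (the paths, file names and routine name are hypothetical, not from the
original post). Passing a C++ source file makes R CMD SHLIB link with the
C++ front end, which also pulls in libstdc++, the library that provides the
std::cerr symbol (_ZSt4cerr) reported as undefined below:

Sys.setenv(PKG_LIBS = "-L/somepath/blitz/lib -lblitz")  # blitz++ rebuilt with -fPIC
system("R CMD SHLIB mylib.cpp")                         # builds and links mylib.so
dyn.load("mylib.so")
.Call("my_routine", 1:10)                               # placeholder entry point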

2009/1/11 Samsiddhi Bhattacharjee :
> Dear all,
>
> I am using the .Call interface to call c++ code from R. For that, I am
> trying to create a dynamic library (mylib.so)
> using "R CMD SHLIB" by linking my own c++ code and an external c++
> library (blitz++).
>
> The makefile works fine on my Mac, produces mylib.so and I am able to
> call .Call() from R,  but on a linux
> server (I think Debian),  I got the following error:
>
> --
> /usr/bin/ld: /somepath/blitz/lib/libblitz.a(globals.o):
> relocation R_X86_64_32 against `a local symbol' can not be used when
> making a shared object; recompile with -fPIC
> /somepath/blitz/lib/libblitz.a: could not read symbols: Bad value
> collect2: ld returned 1 exit status
> --
>
> I tried recompiling the blitz++ library with the -fPIC flag, and then
> linking using -fPIC, and it went through without error,
> producing a "mylib.so" file.  But when I tried "dyn.load(mylib.so)"
> from R, I got the error:
>
> --
> Error in dyn.load("mylib.so") :
>  unable to load shared library '/somepath/mylib.so':
>  /somepath/mylib.so: undefined symbol: _ZSt4cerr
> -
>
> I will really appreciate any help or advice on this problem.
>
> --Samsiddhi Bhattacharjee
>
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel
>

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] cat cannot write more than 10000 characters? [R 2.8.1]

2009-01-11 Thread Prof Brian Ripley
Looks like a bug in your iconv.  However, that section of code is 
conditionalized by


if(con->outconv) { /* translate the buffer */

and I don't see that as non-NULL on my systems.  It should only be 
called when you specify an encoding on the output connection, so have 
you set an option (e.g. "encoding")  without telling us?


I was able to reproduce a similar problem by

cat(testChunk, sep = "\n", file = file("output", encoding="latin1"),
append = TRUE)

in a UTF-8 locale, and I'll add a workaround to the R sources.

Please do run your tests with R --vanilla and make sure they are 
complete -- see the posting guide.



On Mon, 5 Jan 2009, Daniel Sabanés Bové wrote:


Dear Prof. Ripley,

I have discovered that my cat function cannot write more than 10000
characters to a text file.


I think you meant *bytes*, BTW.


You mean on a single line?

Yes. OOo tries to save space...

No, works for me on Mac OS X and x86_64 Fedora 8 (as does 10x larger).
Can you run this under a debugger and find where it is going wrong for
you?

Oh, then this might be distribution- or gcc-version-specific:
gcc --version
gcc (SUSE Linux) 4.3.2 [gcc-4_3-branch revision 141291]

glibc is version 2.9-2.3.

Using ddd I found the (relevant part of the) backtrace when interrupting
the infinite loop:

(gdb) backtrace
#0  __gconv (cd=0x846cde0, inbuf=0xbfff7738, inbufend=0x84ca589 "",
outbuf=0xbfff773c, outbufend=0xbfff9e57 "", irreversible=0xbfff76a8) at
gconv.c:80

The program comes here more than 100 000 times... with outbuf and inbuf
always being "\0".

#1  0xb7b581e7 in iconv (cd=0x846cde0, inbuf=0xbfff7738,
inbytesleft=0xbfff7734, outbuf=0xbfff773c, outbytesleft=0xbfff7730) at
iconv.c:53
[this is   result = __gconv (gcd, (const unsigned char **) inbuf,
   (const unsigned char *)  (*inbuf + *inbytesleft),
 (unsigned char **) outbuf,
  (unsigned char *) (*outbuf + *outbytesleft),
  &irreversible);]

#2  0xb7e44d29 in Riconv (cd=0x846cde0, inbuf=0xbfff7738,
inbytesleft=0xbfff7734, outbuf=0xbfff773c, outbytesleft=0xbfff7730) at
sysutils.c:692
[ this is the only line of Riconv,  return iconv((iconv_t) cd,
(ICONV_CONST char **) inbuf, inbytesleft, outbuf, outbytesleft);]

#3  0xb7d2c337 in dummy_vfprintf (con=0x8400bb0, format=0xb7ee0c48 "%s",
ap=0xbfffc604 "\230¾L\b°?\005\b¬h\a\b¬h\a\b°?\005\b°?\005\b\001") at
connections.c:316
[this is  ires = Riconv(con->outconv, &ib, &inb, &ob, &onb);]

The infinite loop seems to be inside dummy_vfprintf, as this position is
the "highest" point in the backtrace which is reached again and again. And
at line 249 appears the magic number 10000 as BUFSIZE, which is indeed
selected by the preprocessor in my environment!

#4  0xb7d2c4fa in file_vfprintf (con=0x8400bb0, format=0xb7ee0c48 "%s",
ap=0xbfffc604 "\230¾L\b°?\005\b¬h\a\b¬h\a\b°?\005\b°?\005\b\001") at
connections.c:579
[this is  if(con->outconv) return dummy_vfprintf(con, format, ap);]

This and everything above is only reached once, so this might be OK.

#5  0xb7dfe069 in Rvprintf (format=0xb7ee0c48 "%s", arg=0xbfffc604
"\230¾L\b°?\005\b¬h\a\b¬h\a\b°?\005\b°?\005\b\001") at printutils.c:785
[this is   (con->vfprintf)(con, format, argcopy);]

#6  0xb7dfe244 in Rprintf (format=0xb7ee0c48 "%s") at printutils.c:679
[this is   Rvprintf(format, ap);]

#7  0xb7d0446c in do_cat (call=0x83032a8, op=0x806b7d4, args=, rho=0x830359c) at builtin.c:597
[this is   Rprintf("%s", p);]

Unfortunately, I'm not experienced in R/C code internals, but if you
have detailed instructions for me (like "show me the value of this
variable after 10000 stops") I can provide more debugging info.

cat(testChunk, sep = "\n", file = output, append = TRUE)

We have writeLines() for that and it is more efficient, especially if
you keep a connection open.
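
A minimal sketch of that suggestion (the file name is invented): open the
connection once and push all chunks through it, instead of re-opening the
file on every cat(..., append = TRUE) call:

con <- file("output.txt", open = "wt")                       # open once
chunks <- replicate(5, paste(rep("a", 100), collapse = ""))
writeLines(chunks, con)                                      # one line per element
close(con)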

OK, maybe Prof. Leisch wants to improve the Sweave code...?

Thank you very much for your help,
best regards,
Daniel Sabanes



--
Brian D. Ripley,  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] NY Times article

2009-01-11 Thread Frank E Harrell Jr
Nicholas please note that FDA does not require type III sums of squares, 
LOCF, and the use of any particular software.  It is just the fact that 
FDA does not disallow type III tests and LOCF that prevents pharma from 
abandoning these terrible methods.


Frank
--
Frank E Harrell Jr   Professor and Chair   School of Medicine
 Department of Biostatistics   Vanderbilt University

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] code validation (was Re: NY Times article)

2009-01-11 Thread Marc Schwartz
on 01/10/2009 03:06 PM Spencer Graves wrote:
> Hi, All:
>  What support exists for 'regression testing'
> (http://en.wikipedia.org/wiki/Regression_testing) of R code, e.g., as
> part of the "R CMD check" process?
>  The "RUnit" package supports "unit testing"
> (http://en.wikipedia.org/wiki/Unit_testing).
>  Those concerned about software quality of code they use regularly
> could easily develop their own "softwareChecks" package that runs unit
> tests in the "\examples".  Then each time a new version of the package
> and / or R is downloaded, you can do "R CMD check" of your
> "softwareChecks":  If it passes, you know that it passed all your checks.
> 
>  I have not used "RUnit", but I've done similar things computing the
> same object two ways then doing "stopifnot(all.equal(obj1, obj2))".  I
> think the value of the help page is enhanced by showing the "all.equal"
> but not the "stopifnot".  I achieve this using "\dontshow" as follows:
> 
>   obj1 <- ...
>   obj2 <- ...
>   \dontshow{stopifnot(}
>   all.equal(obj1, obj2)
>   \dontshow{)}.
> 
>  Examples of this are contained, for example, in "fRegress.Rd" in
> the current "fda" package available from CRAN or R-Forge.
> 
>  Best Wishes,
>  Spencer
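
For concreteness, a self-contained sketch of the \dontshow{} pattern quoted
above (the objects here are invented, not taken from fRegress.Rd): the
rendered help page shows only the all.equal() call, while R CMD check also
executes the hidden stopifnot() wrapper.

\examples{
obj1 <- var(1:10)
obj2 <- sum((1:10 - mean(1:10))^2) / (10 - 1)
\dontshow{stopifnot(}
all.equal(obj1, obj2)
\dontshow{)}
}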

Spencer,

I think that there are two separate issues being raised here.

One is, how does R Core implement and document an appropriate software
development life cycle (SDLC), which covers the development, testing and
maintenance of R itself. This would include "Base" R and the
"Recommended Packages".

The second is how does an end user do the same with respect to their own
use of R and their own R code development.

I'll answer this one first, which is essentially that it is up to the
end user and their organization. There is an intrinsic misunderstanding
if an end user believes that the majority of the burden for this is on R
Core or that it is up to R Core to facilitate the end user's internal QA
processes.

If the end user is in an environment that requires formalized IQ/OQ/PQ
types of processes (building jet engines for example...), then there
will be SOPs in place that define how these are to be accomplished. The
SOPs may need to be adjusted to R's characteristics, but they should be
in place. The end user needs to be familiar with these, implement
appropriate mechanisms that are compliant with them and operate within
those parameters to reasonably ensure the quality, consistency and
repeatability of their work.

This is no different with R than any other mission critical application.
If somebody is using SAS to build jet engines and they have not
implemented internal processes that establish and document SAS'
performance, beyond that which the SAS Institute documents, the end user
and their organization are out on a limb in terms of risk.

With respect to the first issue and R itself, as you may be aware, there
is a document available here:

  http://www.r-project.org/doc/R-FDA.pdf

which while geared towards the regulated clinical trials realm,
documents the R SDLC and related issues. The key is to document what R
Core does so that end users can be cognizant of the internal quality
processes in place and that an appropriate understanding of these can be
achieved. Given the focus of the document, it also covers other
regulatory issues applicable to clinical trials (eg. 21 CFR 11).

In that document, I would specifically point you to Section 6 which
covers R's SDLC.

Note that this document DOES NOT cover CRAN add-on packages in any
fashion, as the SDLC for those packages is up to the individual authors
and maintainers to define and document. If a user is installing and
using CRAN add-on packages, then they should communicate with those
package authors to identify their SDLC and then implement their own
internal processes to test them.

The typical "R CMD check" process essentially only tests that the
package is a valid CRAN package, unless the CRAN package author has
implemented their own testing process (e.g. a 'tests' sub-directory) with
additional code, much as R Core has done with procedures such as
'make check-all' run after compiling R from source code.
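
As a minimal sketch of that mechanism (the file name is invented): any R file
placed in the package's 'tests' sub-directory, say tests/sanity.R, is executed
by R CMD check, and an uncaught error makes the check fail:

## tests/sanity.R -- run automatically by R CMD check
x <- rnorm(100)
stopifnot(all.equal(mean(x), sum(x) / length(x)))   # numerical sanity check
stopifnot(identical(rev(rev(x)), x))                # simple regression check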

HTH,

Marc Schwartz

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] cat cannot write more than 10000 characters? [R 2.8.1]

2009-01-11 Thread Prof Brian Ripley

On Sun, 11 Jan 2009, Daniel Sabanés Bové wrote:


Yes, I set the encoding to UTF-8 in my .Rprofile. Sorry that I didn't


You really don't want to do that: it adds a considerable overhead and 
relies on a bug-free iconv 


The latest R-patched should work around this.


mention it already. So the complete stand-alone test code which fails in
R --vanilla is the following:

### code begin
options (encoding = "utf-8")
testChunk <- paste(rep("a", 10000 + 1),  ## delete "+ 1" to be successful
                   collapse = "")
output <- tempfile()
cat(testChunk, sep = "\n", file = output, append = TRUE)
### code end

And the version and locale of my system are

R version 2.8.1 (2008-12-22)
i686-pc-linux-gnu
locale:
LC_CTYPE=de_DE.UTF-8;LC_NUMERIC=C;LC_TIME=de_DE.UTF-8;LC_COLLATE=de_DE.UTF-8;LC_MONETARY=C;LC_MESSAGES=de_DE.UTF-8;LC_PAPER=de_DE.UTF-8;LC_NAME=C;LC_ADDRESS=C;LC_TELEPHONE=C;LC_MEASUREMENT=de_DE.UTF-8;LC_IDENTIFICATION=C


Prof Brian Ripley wrote:

Looks like a bug in your iconv.  However, that section of code is
conditionalized by

if(con->outconv) { /* translate the buffer */

and I don't see that as non-NULL on my systems.  It should only be
called when you specify an encoding on the output connection, so have
you set an option (e.g. "encoding")  without telling us?

I was able to reproduce a similar problem by

cat(testChunk, sep = "\n", file = file("output", encoding="latin1"),
append = TRUE)

in a UTF-8 locale, and I'll add a workaround to the R sources.

Please do run your tests with R --vanilla and make sure they are
complete -- see the posting guide.


On Mon, 5 Jan 2009, Daniel Sabanés Bové wrote:


Dear Prof. Ripley,

I have discovered that my cat function cannot write more than 10000
characters to a text file.


I think you meant *bytes*, BTW.





--
Brian D. Ripley,  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] NY Times article

2009-01-11 Thread Nicholas Lewin-Koh
Hi Frank,
Thank you for the correction, I don't work in clinical, so good you keep
me honest.
Nicholas
On Sun, 11 Jan 2009 09:21:23 -0600, "Frank E Harrell Jr"
 said:
> Nicholas please note that FDA does not require type III sums of squares, 
> LOCF, and the use of any particular software.  It is just the fact that 
> FDA does not disallow type III tests and LOCF that prevents pharma from 
> abandoning these terrible methods.
> 
> Frank
> -- 
> Frank E Harrell Jr   Professor and Chair   School of Medicine
>   Department of Biostatistics   Vanderbilt University

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] R as a scripting engine

2009-01-11 Thread Prof Brian Ripley
Those of you tracking R development will have noticed that we are 
moving towards using R as a scripting engine.  (It is often overlooked 
as such.)  Thus far INSTALL, REMOVE, SHLIB and massage-examples have 
been moved to R.


Reasons:

- it is platform-independent and needs no other tools installed.
  No need to worry about strange 'sh' variants (as on AIX and in the
  past on Mac OS X) or to track Perl version changes and minimum
  requirements.

- it is fast.  Shell scripts are notoriously slow[*] on Windows as
  launching a new process there is relatively much slower than on a
  Unix-alike, to the extent that we have had to use Perl scripts on
  Windows. The actual script of INSTALL in R runs twice as fast as the
  shell script on Unix or the Perl script on Windows.  (In part this
  comes about because R scripts can be pre-parsed if put in a package:
  the 1150 lines of INSTALL was taking a couple of seconds to parse.)

- it is much easier to maintain.  We only need one version, and any R
  user should be able to follow it (and any member of R-core to
  maintain it).  And one version means only one set of the
  documentation and automatically keeping things in step.

- R has lots of useful features, including automatic support of
  (for files) different line endings and encodings.
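
(As a small illustration of the scripting-engine idea, not taken from the
message itself: a trivial, platform-independent Rscript that could stand in
for a shell or Perl helper.)

#!/usr/bin/env Rscript
## print the number of lines in each file given on the command line
args <- commandArgs(trailingOnly = TRUE)
for (f in args) cat(f, ":", length(readLines(f)), "lines\n")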

Of course there is the one-time cost of doing the conversion, which 
has been triggered here by the need for new functionality.  It also 
means that there are minor changes to merge the Unix and Windows 
versions.  (Some of you will be hearing from Uwe Ligges about use of 
undocumented variables in Windows configure/Makevars.win, for 
example.)


In due course we see phasing out the use of Perl, at least at run time 
(and that might even be for 2.9.0).


[*] Running configure under Cygwin took over an hour on a fairly 
loaded Windows box earlier in the week.


--
Brian D. Ripley,  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] R as a scripting engine

2009-01-11 Thread Gabor Grothendieck
On Sun, Jan 11, 2009 at 3:18 PM, Prof Brian Ripley
 wrote:
> Those of you tracking R development will have noticed that we are moving
> towards using R as a scripting engine.  (It is often overlooked as such.)
...
> In due course we see phasing out the use of Perl, at least at run time (and
> that might even be for 2.9.0).

That's great news.  I for one will not be sorry to see it and the
shell scripts go.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] R as a scripting engine

2009-01-11 Thread Dirk Eddelbuettel

On 11 January 2009 at 20:18, Prof Brian Ripley wrote:
| Those of you tracking R development will have noticed that we are 
| moving towards using R as a scripting engine.
[...]
| Reasons:
| 
| - it is platform-independent and needs no other tools installed.
[...]
| - it is fast.
[...]

Indeed.  I really like working with r scripts. 

And littler by Horner and Eddelbuettel is faster than Rscript -- see eg the
scripts tests/timing.sh and tests/timing2.sh in the SVN archive / littler
tarballs (and the results below for illustration).   

We would still appreciate it if you could finally acknowledge the existence of
littler in the R / Rscript documentation.  You are not doing users any
service by pretending it doesn't exist.

That said, we are not (yet ?) building r for Windows, and I appreciate that
Rscript is available there.  Maintenance and use of R will be easier with a
consistent set of tools.  This is a good move.

Dirk


e...@ron:~/svn/littler/tests> ./timing.sh

 --- GNU bc doing the addition 10 times
real    0m0.028s
user    0m0.004s
sys     0m0.012s

 --- our r doing the addition 10 times
real    0m0.400s
user    0m0.308s
sys     0m0.052s

 --- GNU R's Rscript doing the addition 10 times
real    0m2.077s
user    0m1.832s
sys     0m0.204s

 --- GNU R doing the addition 10 times
real    0m3.974s
user    0m3.728s
sys     0m0.228s
e...@ron:~/svn/littler/tests> ./timing2.sh

 --- our r calling summary() 20 times
real    0m3.261s
user    0m2.976s
sys     0m0.240s

 --- GNU R's Rscript calling summary() 20 times
real    0m4.164s
user    0m3.624s
sys     0m0.548s

 --- GNU R calling summary() 20 times
real    0m8.087s
user    0m7.552s
sys     0m0.492s
e...@ron:~/svn/littler/tests>


-- 
Three out of two people have difficulties with fractions.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Package Matrix does not compile in R-devel_2009-01-10

2009-01-11 Thread Rainer Hurling

Dear developers,

today I tried to build and install R-devel_2009-01-10 on FreeBSD 
8.0-CURRENT (i386) for testing purposes.


All went well until compiling the now recommended (integrated) Matrix 
package. At this point the following break occurred:



begin installing recommended package Matrix
* Installing *source* package 'Matrix' ...
** libs
gcc -std=gnu99 -I/usr/local/R-devel/include -I./UFconfig -I/usr/local/include -fpic -g -O2 -c CHMfactor.c -o CHMfactor.o

[..snip..]
gcc -std=gnu99 -I/usr/local/R-devel/include -I../Include -I../../UFconfig -I/usr/local/include -fpic -g -O2 -c colamd_global.c -o colamd_global.o
gcc -std=gnu99 -I/usr/local/R-devel/include -I../Include -I../../UFconfig -I/usr/local/include -fpic -g -O2 -I../Include -DDLONG -c colamd.c -o colamd_l.o

ar -rucs ../../COLAMD.a colamd_global.o colamd_l.o # colamd.o
( cd Source ; make lib )
gcc -std=gnu99 -I/usr/local/R-devel/include -I../Include -I../../UFconfig -I/usr/local/include -fpic -g -O2 -c amd_global.c -o amd_global.o

make: don't know how to make amd_l_1.o. Stop
*** Error code 2
Stop in /tmp/Rtmpx5nUS8/R.INSTALL10d63af1/Matrix/src/AMD.
*** Error code 1
Stop in /tmp/Rtmpx5nUS8/R.INSTALL10d63af1/Matrix/src.
ERROR: compilation failed for package 'Matrix'
* Removing '/usr/local/R-devel/library/Matrix'
*** Error code 1
Stop in /usr/local/R-devel/src/library/Recommended.
*** Error code 1
Stop in /usr/local/R-devel/src/library/Recommended.
*** Error code 1
Stop in /usr/local/R-devel.



Please note that on FreeBSD, BSD 'make' is the default. If I want 
to use gmake instead, I have to set it explicitly. Unfortunately, this 
does not work when building the whole R-devel system.


With R-2.8.1 I have no problems installing and using R. When I want to 
build the (external) Matrix package, I have to set an environment 
variable pointing to gmake (found at /usr/local/bin/gmake), and then all works well.



Now my question: Is it possible to change the configure/build of the 
integrated Matrix package on R-devel? For all other packages there is no 
need to do so (at least for FreeBSD ;-)


Please let me know if I can help.

Thanks in advance,
Rainer Hurling

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] complex-valued sparse matrices

2009-01-11 Thread Juha Vierinen
Hi,

I was looking at the sparse matrix packages and I noticed that complex
valued sparse matrices are not supported. Is there anybody working on
adding complex-support for the SparseM or the Matrix package?

juha

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Package Matrix does not compile in R-devel_2009-01-10

2009-01-11 Thread Prof Brian Ripley
You need to take this up with the package maintainers: although 
recommended packages are distributed with R,  they are still 
contributed packages with separate maintainers.


At one point Matrix did work with a non-GNU make (the Solaris one) 
after suggestions from R-core members on how to remove the obvious 
GNUisms.  It would certainly be helpful to let the package maintainers 
know what changes do work.  (I presume the issue is


amd_i_%.o: amd_%.c $(INC)
	$(CC) $(ALL_CPPFLAGS) $(ALL_CFLAGS) -I../Include -DDINT -c $< -o $@

amd_l_%.o: amd_%.c $(INC)
	$(CC) $(ALL_CPPFLAGS) $(ALL_CFLAGS) -I../Include -DDLONG -c $< -o $@


which are new rules since I was able to test.)

Unfortunately I can no longer build Matrix (and hence R-devel) on 
Solaris, as the Sun Studio compilers say some of the C++ code is 
invalid (and it looks so to me, and I reported it a while back): the 
file is spqr_front.cpp, so it has not got as far as the point that is 
giving you trouble.



On Sun, 11 Jan 2009, Rainer Hurling wrote:


Dear developers,

today I tried to build and install R-devel_2009-01-10 on FreeBSD 8.0-CURRENT 
(i386) for testing purposes.


All went well until compiling the now recommended (integrated) Matrix 
package. At this point the following break occurred:



begin installing recommended package Matrix
* Installing *source* package 'Matrix' ...
** libs
gcc -std=gnu99 -I/usr/local/R-devel/include -I./UFconfig -I/usr/local/include -fpic -g -O2 -c CHMfactor.c -o CHMfactor.o

[..snip..]
gcc -std=gnu99 -I/usr/local/R-devel/include -I../Include -I../../UFconfig -I/usr/local/include -fpic -g -O2 -c colamd_global.c -o colamd_global.o
gcc -std=gnu99 -I/usr/local/R-devel/include -I../Include -I../../UFconfig -I/usr/local/include -fpic -g -O2 -I../Include -DDLONG -c colamd.c -o colamd_l.o

ar -rucs ../../COLAMD.a colamd_global.o colamd_l.o # colamd.o
( cd Source ; make lib )
gcc -std=gnu99 -I/usr/local/R-devel/include -I../Include -I../../UFconfig -I/usr/local/include -fpic -g -O2 -c amd_global.c -o amd_global.o

make: don't know how to make amd_l_1.o. Stop
*** Error code 2
Stop in /tmp/Rtmpx5nUS8/R.INSTALL10d63af1/Matrix/src/AMD.
*** Error code 1
Stop in /tmp/Rtmpx5nUS8/R.INSTALL10d63af1/Matrix/src.
ERROR: compilation failed for package 'Matrix'
* Removing '/usr/local/R-devel/library/Matrix'
*** Error code 1
Stop in /usr/local/R-devel/src/library/Recommended.
*** Error code 1
Stop in /usr/local/R-devel/src/library/Recommended.
*** Error code 1
Stop in /usr/local/R-devel.



Please note that on FreeBSD, BSD 'make' is the default. If I want to 
use gmake instead, I have to set it explicitly. Unfortunately, this does not 
work when building the whole R-devel system.


It would be helpful to know why not.  AFAIK GNU make works on other 
platforms with their own make.


With R-2.8.1 I have no problems installing and using R. When I want to build 
the (external) Matrix package, I have to set an environment variable pointing 
to gmake (found at /usr/local/bin/gmake), and then all works well.



Now my question: Is it possible to change the configure/build of the 
integrated Matrix package on R-devel? For all other packages there is no need 
to do so (at least for FreeBSD ;-)


Please let me know if I can help.

Thanks in advance,
Rainer Hurling

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel



--
Brian D. Ripley,  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel