Re: [Rd] proposal for new flag to R CMD INSTALL

2010-12-03 Thread Prof Brian Ripley
Why do you think dependencies are not needed for the help system? At 
some point they will be needed to resolve cross-references: at install 
time if you are generating static HTML.


I've not seen any issues that cannot be solved by --install=fake: I 
have fake installs of a few packages (e.g. ROracle) to allow others to 
be tested (including their help pages).
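
A sketch of that workflow (invocations are illustrative; 'depPkg' stands for
a package that depends on ROracle):

  R CMD INSTALL --fake ROracle   # minimal stub install, no compilation
  R CMD check depPkg             # the dependency on ROracle is now satisfied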



On Mon, 29 Nov 2010, Kjetil Halvorsen wrote:


For the purpose of installing only parts of a package
(in my case, the help system), R CMD INSTALL
should accept a flag --no-check-deps.
Below is a diff against R-devel, svn revision 53672:



kje...@kjetil:~/R/R-devel/src/library/tools/R$ diff install.R.old install.R
116a117
> "  --no-check-deps  skip test if installed depends/imports",
506c507
< if (length(miss) > 1)
---
> if ((length(miss) > 1) && check_deps)
510c511
< else if (length(miss))
---
> else if (length(miss) && check_deps)
1025a1027
> check_deps <- TRUE
1133a1136,1137
> } else if (a == "--no-check-deps") {
> check_deps <- FALSE
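
With the patch applied, an invocation would look like this ('somePkg' is a
placeholder):

  R CMD INSTALL --no-check-deps somePkg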




kjetil




--
Brian D. Ripley,  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



[Rd] Clean up after "R CMD INSTALL" and/or "R CMD check"

2010-12-03 Thread Berwin A Turlach
G'day all,

I noticed the following (new) behaviour of R 2.12.0, running on Kubuntu
10.10, when installed with sub-architectures:

When I run "R CMD INSTALL" or "R CMD check" on the source directory of a
package that contains C or FORTRAN code, R creates sub-directories
src-32/ and src-64/ that seem to be copies of the src/ subdirectory
plus the compiled objects.   

These directories are not deleted at the end of a successful
INSTALL/check and I wonder if there is any particular reason for this?
Would it be possible to delete these sub-directories during clean-up
at the end of a successful INSTALL/check?

Cheers,

Berwin



[Rd] Strange problems with compiling dll

2010-12-03 Thread Oleksandr Dyklevych


Dear Sir/Madam,

I'm trying to speed up my R code by writing quite simple dll's in C. But I
faced some problems, and I cannot determine their source.

#include <Rinternals.h>

SEXP mycombin(SEXP N, SEXP k){
    int i, *j, *l, c;
    j = INTEGER(k); l = INTEGER(N);
    c = 1;
    if(j[0] > 0 && j[0] < l[0]){
        if(j[0] <= l[0] - j[0]){
            for(i = l[0]; i >= l[0] - j[0] + 1; i--){
                c = c * i / (l[0] - i + 1);
            }
        }
        else{
            for(i = l[0]; i >= j[0] + 1; i--){ /* count down to k+1 */
                c = c * i / (l[0] - i + 1);
            }
        }
    }
    return ScalarInteger(c);
}

But when I try to compile it I get 5 errors, all of them from the
linker (Code::Blocks):

||=== mcb2, Release ===|
obj\Release\conv.o:conv.c|| undefined reference to `INTEGER'|
obj\Release\conv.o:conv.c|| undefined reference to `INTEGER'|
obj\Release\conv.o:conv.c|| undefined reference to `Rf_ScalarInteger'|
obj\Release\conv.o:conv.c|| undefined reference to `Rf_ScalarInteger'|
obj\Release\conv.o:conv.c|| undefined reference to `Rf_ScalarInteger'|
||=== Build finished: 5 errors, 0 warnings ===|


I found out that I need to link Rdll.lib, but to make it I should use
these instructions

make R.exp
lib /def:R.exp /out:Rdll.lib

but I cannot figure out where I should use them.
I'm trying from C:\Rtools\src\gnuwin32, but each time I get
make: *** No rule to make target `R.exp'.  Stop.

So where should I state this rule and which rule?...

Will you help me, please, to "connect" my C compiler to R?

Best regards,
Oleksandr



Re: [Rd] Clean up after "R CMD INSTALL" and/or "R CMD check"

2010-12-03 Thread Prof Brian Ripley

On Fri, 3 Dec 2010, Berwin A Turlach wrote:


G'day all,

I noticed the following (new) behaviour of R 2.12.0, running on Kubuntu
10.10, when installed with sub-architectures:


Yes, there are new features when there are multiple sub-architectures.


When I run "R CMD INSTALL" or "R CMD check" on the source directory of a
package that contains C or FORTRAN code, R creates sub-directories
src-32/ and src-64/ that seem to be copies of the src/ subdirectory
plus the compiled objects.

These directories are not deleted at the end of a successful
INSTALL/check and I wonder if there is any particular reason for this?


Because it might be partially successful and you want to look at the 
generated objects?  In particular 'success' means that the primary 
sub-architecture is installed: others might fail.



Would it be possible to delete these sub-directories during clean-up
at the end of a successful INSTALL/check?


Try INSTALL --clean, etc.  But I normally do this from a tarball to 
keep the sources clean and to test the reference sources.
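
For concreteness (package and version names below are placeholders):

  R CMD INSTALL --clean mypkg     # run the package's cleanup after installation
  R CMD build mypkg               # produces mypkg_1.0.tar.gz
  R CMD check mypkg_1.0.tar.gz    # checks run on the pristine tarball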


There are a few improvements to R-patched in the detection of 
sub-architectures, so you might like to see if you prefer what it 
does.


--
Brian D. Ripley,  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



Re: [Rd] Strange problems with compiling dll

2010-12-03 Thread Dirk Eddelbuettel

On 3 December 2010 at 08:46, Oleksandr Dyklevych wrote:
| I'm trying to speed up my R code by writing quite simple dll's in C. But I

Yes, that can be a very appropriate approach and many tools help. Make sure
you read the 'Writing R Extensions' manual though.

| faced some problems, and I cannot determine their source.
| 
| #include <Rinternals.h>
| 
| SEXP mycombin(SEXP N, SEXP k){
|     int i, *j, *l, c;
|     j = INTEGER(k); l = INTEGER(N);
|     c = 1;
|     if(j[0] > 0 && j[0] < l[0]){
|         if(j[0] <= l[0] - j[0]){
|             for(i = l[0]; i >= l[0] - j[0] + 1; i--){
|                 c = c * i / (l[0] - i + 1);
|             }
|         }
|         else{
|             for(i = l[0]; i >= j[0] + 1; i--){ /* count down to k+1 */
|                 c = c * i / (l[0] - i + 1);
|             }
|         }
|     }
|     return ScalarInteger(c);
| }
| 
| But when I try to compile it I get 5 errors, all of them from the
| linker (Code::Blocks):
| 
| ||=== mcb2, Release ===|
| obj\Release\conv.o:conv.c|| undefined reference to `INTEGER'|
| obj\Release\conv.o:conv.c|| undefined reference to `INTEGER'|
| obj\Release\conv.o:conv.c|| undefined reference to `Rf_ScalarInteger'|
| obj\Release\conv.o:conv.c|| undefined reference to `Rf_ScalarInteger'|
| obj\Release\conv.o:conv.c|| undefined reference to `Rf_ScalarInteger'|
| ||=== Build finished: 5 errors, 0 warnings ===|

The error messages make me suspect that you are using an unsupported
toolchain.  That is your right, but you are unlikely to find help.  All
documentation suggests using the gcc/g++ combination (with minor exceptions
such as the native Solaris compiler, etc.).

| I found out that I need to link Rdll.lib, but to make it I should use
| these instructions
| 
| make R.exp
| lib /def:R.exp /out:Rdll.lib

That is very clearly unsupported.  Re-visit 'Writing R Extensions', in
particular the Appendix on the Windows toolchain, and try again.
 
| but I cannot figure out where I should use them.
| I'm trying from C:\Rtools\src\gnuwin32, but each time I get
| make: *** No rule to make target `R.exp'.  Stop.
| 
| So where should I state this rule and which rule?...
| 
| Will you help me, please, to "connect" my C compiler to R?

No. That approach is clearly documented as being unsupported -- so you cannot
expect answers on the principal development list for R.
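
For what it is worth, a minimal sketch of the supported route, assuming the
source file is saved as conv.c and the Rtools toolchain is on the PATH:

  R CMD SHLIB conv.c

run from a Windows command prompt builds conv.dll, already linked against
R.dll.  Then, in R:

  dyn.load("conv.dll")
  .Call("mycombin", 10L, 3L)   # choose(10, 3) == 120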

Dirk

-- 
Dirk Eddelbuettel | e...@debian.org | http://dirk.eddelbuettel.com



Re: [Rd] Clean up after "R CMD INSTALL" and/or "R CMD check"

2010-12-03 Thread Berwin A Turlach
G'day Brian,

On Fri, 3 Dec 2010 13:14:44 +0000 (GMT)
Prof Brian Ripley wrote:

> > I noticed the following (new) behaviour of R 2.12.0, running on
> > Kubuntu 10.10, when installed with sub-architectures:
> 
> Yes, there are new features when there are multiple sub-architectures.

Indeed.  One new feature seems to be that if the installation of a
package via
R --arch=XXX CMD INSTALL --libs-only
fails, then the package is not completely removed but rather the
previously installed version is re-installed.  IIRC, I had requested this
behaviour some years ago and it is nice to see it now implemented. :)
 
> > These directories are not deleted at the end of a successful
> > INSTALL/check and I wonder if there is any particular reason for
> > this?
> 
> Because it might be partially successful and you want to look at the 
> generated objects?  

I agree that it would be helpful to look at the generated objects if
the INSTALL/check is only partially successful; that's why I asked
about a successful INSTALL/check.  However, it looks...

> In particular 'success' means that the primary sub-architecture is
> installed: others might fail.

... as if we have different definitions of what constitutes 'success';
I take 'success' as meaning successful installation for all
architectures, but accept that you are using the official definition. :)

> > Would it be possible to delete these sub-directories during clean-up
> > at the end of a successful INSTALL/check?
> 
> Try INSTALL --clean, etc.  

This does not seem to help; the directories in question are not removed.

> But I normally do this from a tarball to keep the sources clean and
> to test the reference sources.

I used to do this too but changed my habits when it was once pointed out
to me that the section "Checking and building packages" in the "Writing
R Extensions" manual starts with:

Before using these tools, please check that your package can be
installed and loaded.  @code{R CMD check} will @emph{inter
alia} do this, but you may get more detailed error messages
doing the checks directly.
 
IIRC, the context was that it took me some time to track down a problem
via "R CMD check foo.tar.gz" as the error messages were not as helpful
in locating the problem as the error messages of "R CMD INSTALL" would
have been.  But if "R CMD INSTALL" is to be run before "R CMD check"
and/or "R CMD build" it has to be run on the source directory, hasn't
it?  This looks like a chicken-and-egg problem. :)

Or are you now saying that it is o.k. to first run "R CMD build" and
then "R CMD INSTALL" on the tarball?

> There are a few improvements to R-patched in the detection of 
> sub-architectures, so you might like to see if you prefer what it 
> does.

I tried with:
   R version 2.13.0 Under development (unstable) (2010-12-02 r53747)
and
   R version 2.12.0 Patched (2010-12-02 r53747)
and I did not see any different behaviour.  The subdirectories src-32/
and src-64/ are created and not deleted.

Thank you very much for your comments/insights. 

Cheers,

Berwin



[Rd] Competing with one's own work

2010-12-03 Thread Prof. John C Nash
No, this is not about Rcpp, but a comment in that overly long discussion raised 
a question
that has been in my mind for a while.

This is that one may have work that is used in R in the base functionality and 
there are
improvements that should be incorporated.

For me, this concerns the BFGS, Nelder-Mead and CG options of optim(), which 
are based on
the 1990 edition (Pascal codes) of my 1979 book "Compact numerical methods...", 
which were
themselves derived from other people's work. By the time Brian Ripley took that 
work (with
permission, even though not strictly required. Thanks!) there were already some
improvements to these same algorithms (mainly bounds and masks) in the BASIC 
codes of the
1987 book by Mary Walker-Smith and I. However, BASIC to R is not something I'd 
wish on
anyone.

Now there are some R packages, including some I've been working on, that do 
offer
improvements on the optim() offerings. I would not say mine are yet fully ready 
for
incorporation into the base, but they are pretty close. Equally I think some of 
the tools
in the base should be deprecated and users encouraged to try other routines. It 
is also
getting more and more important that novice users be provided with sensible 
guidance and
robust default settings and choices. In many areas, users are faced with more 
choice than
is efficient for the majority of problems.

My question is: How should such changes be suggested / assisted? It seems to me 
that this
is beyond a simple feature request. Some discussion on pros and cons would be 
appropriate,
and those like myself who are familiar with particular tools can and should 
offer help.

Alternatively, is there a document available in the style "Writing R 
Extensions" that has
a title like "How the R Base Packages are Updated"? A brief search was negative.

I'm happy to compete with my own prior work to provide improvements. It would 
be nice to
see some of those improvements become the benchmark for further progress.


Best,

John Nash



Re: [Rd] Competing with one's own work

2010-12-03 Thread Duncan Murdoch

On 03/12/2010 10:57 AM, Prof. John C Nash wrote:

No, this is not about Rcpp, but a comment in that overly long discussion raised 
a question
that has been in my mind for a while.

This is that one may have work that is used in R in the base functionality and 
there are
improvements that should be incorporated.

For me, this concerns the BFGS, Nelder-Mead and CG options of optim(), which 
are based on
the 1990 edition (Pascal codes) of my 1979 book "Compact numerical methods...", 
which were
themselves derived from other people's work. By the time Brian Ripley took that 
work (with
permission, even though not strictly required. Thanks!) there were already some
improvements to these same algorithms (mainly bounds and masks) in the BASIC 
codes of the
1987 book by Mary Walker-Smith and I. However, BASIC to R is not something I'd 
wish on
anyone.

Now there are some R packages, including some I've been working on, that do 
offer
improvements on the optim() offerings. I would not say mine are yet fully ready 
for
incorporation into the base, but they are pretty close. Equally I think some of 
the tools
in the base should be deprecated and users encouraged to try other routines. It 
is also
getting more and more important that novice users be provided with sensible 
guidance and
robust default settings and choices. In many areas, users are faced with more 
choice than
is efficient for the majority of problems.

My question is: How should such changes be suggested / assisted? It seems to me 
that this
is beyond a simple feature request. Some discussion on pros and cons would be 
appropriate,
and those like myself who are familiar with particular tools can and should 
offer help.

Alternatively, is there a document available in the style "Writing R 
Extensions" that has
a title like "How the R Base Packages are Updated"? A brief search was negative.

I'm happy to compete with my own prior work to provide improvements. It would 
be nice to
see some of those improvements become the benchmark for further progress.



There are answers at many different levels to your questions.  The 
simplest is that base packages are part of R, so they get updated when a 
member of R Core updates them, and the updates get released when a new 
version of R is released.


So if you want a change, you need to convince a member of the core to 
make it.  Pointing out a bug is the easiest way to do this:  bugs 
usually get fixed quickly, if they are clearly demonstrated.


If you want a bigger change, you need to make a convincing argument in 
favour of it.  If you pick a topic that is of particular interest to one 
core member, and you can convince him to make the change, then it will 
happen.  If you pick some obscure topic that's not of interest to anyone, 
you'll need a very strong argument to make it interesting.  Part of any 
of these arguments is explaining why the change needs to be made to the 
base, why it can't just be published in a contributed package.  (That's 
why bug fixes are easy, and big additions to the base packages are not.)


Duncan Murdoch



Re: [Rd] Competing with one's own work

2010-12-03 Thread Ravi Varadhan
Dear Duncan, 

What constitutes a convincing argument for making significant changes?
Taking the example of optimization algorithms (say, for smooth objective
functions), how does one make a convincing argument that a particular class
of algorithms are "better" than another class? This can be a difficult task,
but quite doable with good benchmarking practices.  

Supposing for the moment that such a convincing argument has been made, is
that sufficient to get the R-core to act upon it?  Are there compelling
factors other than just "algorithm A is better than algorithm B"?

I'd think that the argument is relatively easy if the need for the change is
driven by consumer demand. But, even here I am not sure how to make an
argument to the R-core to consider the big changes.  For example, there is a
reasonable demand for constrained (smooth) optimization algorithms in R
(based on R-help queries).  Currently, there are only 3 packages that can
handle this.  However, in the base distribution only the `constrOptim' function
is provided, which cannot handle anything more than linear inequality
constraints.  I think that the base distribution needs to have a package for
constrained optimization that can handle linear/nonlinear and
equality/inequality constraints.  
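
For concreteness, a minimal sketch of what constrOptim() can express --
linear inequality constraints of the form ui %*% theta - ci >= 0 (the
numbers here are illustrative):

  ## minimize (x - 1)^2 + (y - 2)^2 subject to x + y <= 2, x >= 0, y >= 0
  fn <- function(th) (th[1] - 1)^2 + (th[2] - 2)^2
  ui <- rbind(c(-1, -1),   # -x - y >= -2, i.e. x + y <= 2
              c( 1,  0),   #  x >= 0
              c( 0,  1))   #  y >= 0
  ci <- c(-2, 0, 0)
  constrOptim(c(0.5, 0.5), fn, grad = NULL, ui = ui, ci = ci)$par
  ## roughly (0.5, 1.5); nonlinear or equality constraints cannot be
  ## written in this form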

John, thanks for raising an important issue.

Thanks & Best,
Ravi.

---
Ravi Varadhan, Ph.D.
Assistant Professor,
Division of Geriatric Medicine and Gerontology School of Medicine Johns
Hopkins University

Ph. (410) 502-2619
email: rvarad...@jhmi.edu


-Original Message-
From: r-devel-boun...@r-project.org [mailto:r-devel-boun...@r-project.org]
On Behalf Of Duncan Murdoch
Sent: Friday, December 03, 2010 11:13 AM
To: nas...@uottawa.ca
Cc: r-devel@r-project.org
Subject: Re: [Rd] Competing with one's own work

On 03/12/2010 10:57 AM, Prof. John C Nash wrote:
> No, this is not about Rcpp, but a comment in that overly long discussion
raised a question
> that has been in my mind for a while.
>
> This is that one may have work that is used in R in the base functionality
and there are
> improvements that should be incorporated.
>
> For me, this concerns the BFGS, Nelder-Mead and CG options of optim(),
which are based on
> the 1990 edition (Pascal codes) of my 1979 book "Compact numerical
methods...", which were
> themselves derived from other people's work. By the time Brian Ripley took
that work (with
> permission, even though not strictly required. Thanks!) there were already
some
> improvements to these same algorithms (mainly bounds and masks) in the
BASIC codes of the
> 1987 book by Mary Walker-Smith and I. However, BASIC to R is not something
I'd wish on
> anyone.
>
> Now there are some R packages, including some I've been working on, that
do offer
> improvements on the optim() offerings. I would not say mine are yet fully
ready for
> incorporation into the base, but they are pretty close. Equally I think
some of the tools
> in the base should be deprecated and users encouraged to try other
routines. It is also
> getting more and more important that novice users be provided with
sensible guidance and
> robust default settings and choices. In many areas, users are faced with
more choice than
> is efficient for the majority of problems.
>
> My question is: How should such changes be suggested / assisted? It seems
to me that this
> is beyond a simple feature request. Some discussion on pros and cons would
be appropriate,
> and those like myself who are familiar with particular tools can and
should offer help.
>
> Alternatively, is there a document available in the style "Writing R
Extensions" that has
> a title like "How the R Base Packages are Updated"? A brief search was
negative.
>
> I'm happy to compete with my own prior work to provide improvements. It
would be nice to
> see some of those improvements become the benchmark for further progress.


There are answers at many different levels to your questions.  The 
simplest is that base packages are part of R, so they get updated when a 
member of R Core updates them, and the updates get released when a new 
version of R is released.

So if you want a change, you need to convince a member of the core to 
make it.  Pointing out a bug is the easiest way to do this:  bugs 
usually get fixed quickly, if they are clearly demonstrated.

If you want a bigger change, you need to make a convincing argument in 
favour of it.  If you pick a topic that is of particular interest to one 
core member, and you can convince him to make the change, then it will 
happen.  If you pick some obscure topic that's not of interest to anyone, 
you'll need a very strong argument to make it interesting.  Part of any 
of these arguments is explaining why the change needs to be made to the 
base, why it can't just be published in a contributed package.  (That's 
why bug fixes are easy, and big additions to the base packages are not.)

Duncan Murdoch


Re: [Rd] Competing with one's own work

2010-12-03 Thread Douglas Bates
On Fri, Dec 3, 2010 at 11:01 AM, Ravi Varadhan wrote:
> Dear Duncan,

> What constitutes a convincing argument for making significant changes?
> Taking the example of optimization algorithms (say, for smooth objective
> functions), how does one make a convincing argument that a particular class
> of algorithms are "better" than another class? This can be a difficult task,
> but quite doable with good benchmarking practices.

> Supposing for the moment that such a convincing argument has been made, is
> that sufficient to get the R-core to act upon it?  Are there compelling
> factors other than just "algorithm A is better than algorithm B"?

> I'd think that the argument is relatively easy if the need for the change is
> driven by consumer demand. But, even here I am not sure how to make an
> argument to the R-core to consider the big changes.  For example, there is a
> reasonable demand for constrained (smooth) optimization algorithms in R
> (based on R-help queries).  Currently, there are only 3 packages that can
> handle this.  However, in the base distribution only `constrOptim' function
> is provided, which cannot handle anything more than linear, inequality
> constraints.  I think that the base distribution needs to have a package for
> constrained optimization that can handle linear/nonlinear and
> equality/inequality constraints.

constrOptim is in the stats package, not the base package.  Functions
that are already in the required packages are maintained by R core.
If you know of bugs in such functions you should report them.  Because
there is a heavy burden in maintaining the large corpus of software in
R and its required packages, additions are viewed skeptically.
Adopting new capabilities and new code in a required package like
stats means that some member of R core has to be willing to maintain
it.  If the capabilities can be incorporated in a contributed package
then that is the preferred method of extending R. The burden of
maintaining the code, fixing bugs or other infelicities, etc. is on
the package maintainer.

I don't see anything in what you are proposing that could not be
incorporated in a contributed package.

> John, thanks for raising an important issue.
>
> Thanks & Best,
> Ravi.
>
> ---
> Ravi Varadhan, Ph.D.
> Assistant Professor,
> Division of Geriatric Medicine and Gerontology School of Medicine Johns
> Hopkins University
>
> Ph. (410) 502-2619
> email: rvarad...@jhmi.edu
>
>
> -Original Message-
> From: r-devel-boun...@r-project.org [mailto:r-devel-boun...@r-project.org]
> On Behalf Of Duncan Murdoch
> Sent: Friday, December 03, 2010 11:13 AM
> To: nas...@uottawa.ca
> Cc: r-devel@r-project.org
> Subject: Re: [Rd] Competing with one's own work
>
> On 03/12/2010 10:57 AM, Prof. John C Nash wrote:
>> No, this is not about Rcpp, but a comment in that overly long discussion
> raised a question
>> that has been in my mind for a while.
>>
>> This is that one may have work that is used in R in the base functionality
> and there are
>> improvements that should be incorporated.
>>
>> For me, this concerns the BFGS, Nelder-Mead and CG options of optim(),
> which are based on
>> the 1990 edition (Pascal codes) of my 1979 book "Compact numerical
> methods...", which were
>> themselves derived from other people's work. By the time Brian Ripley took
> that work (with
>> permission, even though not strictly required. Thanks!) there were already
> some
>> improvements to these same algorithms (mainly bounds and masks) in the
> BASIC codes of the
>> 1987 book by Mary Walker-Smith and I. However, BASIC to R is not something
> I'd wish on
>> anyone.
>>
>> Now there are some R packages, including some I've been working on, that
> do offer
>> improvements on the optim() offerings. I would not say mine are yet fully
> ready for
>> incorporation into the base, but they are pretty close. Equally I think
> some of the tools
>> in the base should be deprecated and users encouraged to try other
> routines. It is also
>> getting more and more important that novice users be provided with
> sensible guidance and
>> robust default settings and choices. In many areas, users are faced with
> more choice than
>> is efficient for the majority of problems.
>>
>> My question is: How should such changes be suggested / assisted? It seems
> to me that this
>> is beyond a simple feature request. Some discussion on pros and cons would
> be appropriate,
>> and those like myself who are familiar with particular tools can and
> should offer help.
>>
>> Alternatively, is there a document available in the style "Writing R
> Extensions" that has
>> a title like "How the R Base Packages are Updated"? A brief search was
> negative.
>>
>> I'm happy to compete with my own prior work to provide improvements. It
> would be nice to
>> see some of those improvements become the benchmark for further progress.
>
>
There are answers at many different levels to your questions.

Re: [Rd] Competing with one's own work

2010-12-03 Thread Ravi Varadhan
Dear Doug,

Thank you for the response.

"constrOptim is in the stats package, not the base package."

Yes, I know, and I meant to say base *distribution* rather than base
package.  

"The burden of maintaining the code, fixing bugs or other infelicities, etc.
is on the package maintainer."

Of course.

"I don't see anything in what you are proposing that could not be
incorporated in a contributed package."

I agree, and it has already been done.  

What I am really asking is this: what is the rationale behind having a
package incorporated into the base distribution? 

Best,
Ravi.

---
Ravi Varadhan, Ph.D.
Assistant Professor,
Division of Geriatric Medicine and Gerontology School of Medicine Johns
Hopkins University

Ph. (410) 502-2619
email: rvarad...@jhmi.edu


-Original Message-
From: dmba...@gmail.com [mailto:dmba...@gmail.com] On Behalf Of Douglas
Bates
Sent: Friday, December 03, 2010 1:28 PM
To: Ravi Varadhan
Cc: Duncan Murdoch; nas...@uottawa.ca; r-devel@r-project.org
Subject: Re: [Rd] Competing with one's own work

On Fri, Dec 3, 2010 at 11:01 AM, Ravi Varadhan wrote:
> Dear Duncan,

> What constitutes a convincing argument for making significant changes?
> Taking the example of optimization algorithms (say, for smooth objective
> functions), how does one make a convincing argument that a particular
class
> of algorithms are "better" than another class? This can be a difficult
task,
> but quite doable with good benchmarking practices.

> Supposing for the moment that such a convincing argument has been made, is
> that sufficient to get the R-core to act upon it?  Are there compelling
> factors other than just "algorithm A is better than algorithm B"?

> I'd think that the argument is relatively easy if the need for the change
is
> driven by consumer demand. But, even here I am not sure how to make an
> argument to the R-core to consider the big changes.  For example, there is
a
> reasonable demand for constrained (smooth) optimization algorithms in R
> (based on R-help queries).  Currently, there are only 3 packages that can
> handle this.  However, in the base distribution only `constrOptim'
function
> is provided, which cannot handle anything more than linear, inequality
> constraints.  I think that the base distribution needs to have a package
for
> constrained optimization that can handle linear/nonlinear and
> equality/inequality constraints.

constrOptim is in the stats package, not the base package.  Functions
that are already in the required packages are maintained by R core.
If you know of bugs in such functions you should report them.  Because
there is a heavy burden in maintaining the large corpus of software in
R and its required packages, additions are viewed skeptically.
Adopting new capabilities and new code in a required package like
stats means that some member of R core has to be willing to maintain
it.  If the capabilities can be incorporated in a contributed package
then that is the preferred method of extending R. The burden of
maintaining the code, fixing bugs or other infelicities, etc. is on
the package maintainer.

I don't see anything in what you are proposing that could not be
incorporated in a contributed package.

> John, thanks for raising an important issue.
>
> Thanks & Best,
> Ravi.
>
> ---
> Ravi Varadhan, Ph.D.
> Assistant Professor,
> Division of Geriatric Medicine and Gerontology School of Medicine Johns
> Hopkins University
>
> Ph. (410) 502-2619
> email: rvarad...@jhmi.edu
>
>
> -Original Message-
> From: r-devel-boun...@r-project.org [mailto:r-devel-boun...@r-project.org]
> On Behalf Of Duncan Murdoch
> Sent: Friday, December 03, 2010 11:13 AM
> To: nas...@uottawa.ca
> Cc: r-devel@r-project.org
> Subject: Re: [Rd] Competing with one's own work
>
> On 03/12/2010 10:57 AM, Prof. John C Nash wrote:
>> No, this is not about Rcpp, but a comment in that overly long discussion
> raised a question
>> that has been in my mind for a while.
>>
>> This is that one may have work that is used in R in the base
functionality
> and there are
>> improvements that should be incorporated.
>>
>> For me, this concerns the BFGS, Nelder-Mead and CG options of optim(),
> which are based on
>> the 1990 edition (Pascal codes) of my 1979 book "Compact numerical
> methods...", which were
>> themselves derived from other people's work. By the time Brian Ripley
took
> that work (with
>> permission, even though not strictly required. Thanks!) there were
already
> some
>> improvements to these same algorithms (mainly bounds and masks) in the
> BASIC codes of the
>> 1987 book by Mary Walker-Smith and I. However, BASIC to R is not
something
> I'd wish on
>> anyone.
>>
>> Now there are some R packages, including some I've been working on, that
> do offer
>> improvements on the optim() offerings. I would not say mine are yet fully
> ready for
>> incorporation into the base, but they are pretty close.

Re: [Rd] Competing with one's own work

2010-12-03 Thread Duncan Murdoch

On 03/12/2010 12:01 PM, Ravi Varadhan wrote:

Dear Duncan,

What constitutes a convincing argument for making significant changes?


I don't think there's any answer to that other than "an argument that 
convinces someone to make the changes".  What would convince you to work 
on a problem?   Your answer is very different from mine, and mine is 
different from that of anyone else in the core group.




Taking the example of optimization algorithms (say, for smooth objective
functions), how does one make a convincing argument that a particular class
of algorithms are "better" than another class? This can be a difficult task,
but quite doable with good benchmarking practices.


I don't see how that's relevant.  That's an argument to make to users, 
not to the core group.   A user wants to use the best optimizer for 
his/her own problem.  The core group wants functions in base R that we 
will maintain.



Supposing for the moment that such a convincing argument has been made, is
that sufficient to get the R-core to act upon it?


By definition, yes.



Are there compelling
factors other than just "algorithm A is better than algorithm B"?


Yes.  The decision about whether it belongs in a package or in base R is 
about who should maintain the code.  If I think it is fantastic code, 
but you will do a better job of maintaining it than I will, then there's 
no way I'd put it in base R.



I'd think that the argument is relatively easy if the need for the change is
driven by consumer demand.  But, even here I am not sure how to make an
argument to the R-core to consider the big changes.  For example, there is a
reasonable demand for constrained (smooth) optimization algorithms in R
(based on R-help queries).  Currently, there are only 3 packages that can
handle this.  However, in the base distribution only `constrOptim' function
is provided, which cannot handle anything more than linear, inequality
constraints.  I think that the base distribution needs to have a package for
constrained optimization that can handle linear/nonlinear and
equality/inequality constraints.


As Doug said, "I don't see anything in what you are proposing that could 
not be incorporated in a contributed package."


I think I answered your followup question above:  the rationale for 
including it in base R is because someone in the core team is in a 
better position to maintain the code than an outside package maintainer 
would be.


Duncan Murdoch


John, thanks for raising an important issue.

Thanks & Best,
Ravi.

---
Ravi Varadhan, Ph.D.
Assistant Professor,
Division of Geriatric Medicine and Gerontology School of Medicine Johns
Hopkins University

Ph. (410) 502-2619
email: rvarad...@jhmi.edu


-Original Message-
From: r-devel-boun...@r-project.org [mailto:r-devel-boun...@r-project.org]
On Behalf Of Duncan Murdoch
Sent: Friday, December 03, 2010 11:13 AM
To: nas...@uottawa.ca
Cc: r-devel@r-project.org
Subject: Re: [Rd] Competing with one's own work

On 03/12/2010 10:57 AM, Prof. John C Nash wrote:
>  No, this is not about Rcpp, but a comment in that overly long discussion
raised a question
>  that has been in my mind for a while.
>
>  This is that one may have work that is used in R in the base functionality
and there are
>  improvements that should be incorporated.
>
>  For me, this concerns the BFGS, Nelder-Mead and CG options of optim(),
which are based on
>  the 1990 edition (Pascal codes) of my 1979 book "Compact numerical
methods...", which were
>  themselves derived from other people's work. By the time Brian Ripley took
that work (with
>  permission, even though not strictly required. Thanks!) there were already
some
>  improvements to these same algorithms (mainly bounds and masks) in the
BASIC codes of the
>  1987 book by Mary Walker-Smith and I. However, BASIC to R is not something
I'd wish on
>  anyone.
>
>  Now there are some R packages, including some I've been working on, that
do offer
>  improvements on the optim() offerings. I would not say mine are yet fully
ready for
>  incorporation into the base, but they are pretty close. Equally I think
some of the tools
>  in the base should be deprecated and users encouraged to try other
routines. It is also
>  getting more and more important that novice users be provided with
sensible guidance and
>  robust default settings and choices. In many areas, users are faced with
more choice than
>  is efficient for the majority of problems.
>
>  My question is: How should such changes be suggested / assisted? It seems
to me that this
>  is beyond a simple feature request. Some discussion on pros and cons would
be appropriate,
>  and those like myself who are familiar with particular tools can and
should offer help.
>
>  Alternatively, is there a document available in the style "Writing R
Extensions" that has
>  a title like "How the R Base Packages are Updated"? A brief search was
negative.
>
>  I'm happy to compete with my own prior work to provide improvements.

Re: [Rd] Competing with one's own work

2010-12-03 Thread Ravi Varadhan
"The decision about whether it belongs in a package or in base R is 
about who should maintain the code."  

Ok.  I understand it now.

Thanks,
Ravi.

---
Ravi Varadhan, Ph.D.
Assistant Professor,
Division of Geriatric Medicine and Gerontology School of Medicine Johns
Hopkins University

Ph. (410) 502-2619
email: rvarad...@jhmi.edu


-Original Message-
From: Duncan Murdoch [mailto:murdoch.dun...@gmail.com] 
Sent: Friday, December 03, 2010 2:19 PM
To: Ravi Varadhan
Cc: nas...@uottawa.ca; r-devel@r-project.org
Subject: Re: [Rd] Competing with one's own work

On 03/12/2010 12:01 PM, Ravi Varadhan wrote:
> Dear Duncan,
>
> What constitutes a convincing argument for making significant changes?

I don't think there's any answer to that other than "an argument that 
convinces someone to make the changes".  What would convince you to work 
on a problem?   Your answer is very different from mine, and mine is 
different from that of anyone else in the core group.


> Taking the example of optimization algorithms (say, for smooth objective
> functions), how does one make a convincing argument that a particular
class
> of algorithms are "better" than another class? This can be a difficult
task,
> but quite doable with good benchmarking practices.

I don't see how that's relevant.  That's an argument to make to users, 
not to the core group.   A user wants to use the best optimizer for 
his/her own problem.  The core group wants functions in base R that we 
will maintain.

> Supposing for the moment that such a convincing argument has been made, is
> that sufficient to get the R-core to act upon it?

By definition, yes.


> Are there compelling
> factors other than just "algorithm A is better than algorithm B"?

Yes.  The decision about whether it belongs in a package or in base R is 
about who should maintain the code.  If I think it is fantastic code, 
but you will do a better job of maintaining it than I will, then there's 
no way I'd put it in base R.

> I'd think that the argument is relatively easy if the need for the change
is
> driven by consumer demand.  But, even here I am not sure how to make an
> argument to the R-core to consider the big changes.  For example, there is
a
> reasonable demand for constrained (smooth) optimization algorithms in R
> (based on R-help queries).  Currently, there are only 3 packages that can
> handle this.  However, in the base distribution only `constrOptim'
function
> is provided, which cannot handle anything more than linear, inequality
> constraints.  I think that the base distribution needs to have a package
for
> constrained optimization that can handle linear/nonlinear and
> equality/inequality constraints.

As Doug said, "I don't see anything in what you are proposing that could 
not be incorporated in a contributed package."

I think I answered your followup question above:  the rationale for 
including it in base R is because someone in the core team is in a 
better position to maintain the code than an outside package maintainer 
would be.

Duncan Murdoch

> John, thanks for raising an important issue.
>
> Thanks & Best,
> Ravi.
>
> ---
> Ravi Varadhan, Ph.D.
> Assistant Professor,
> Division of Geriatric Medicine and Gerontology School of Medicine Johns
> Hopkins University
>
> Ph. (410) 502-2619
> email: rvarad...@jhmi.edu
>
>
> -Original Message-
> From: r-devel-boun...@r-project.org [mailto:r-devel-boun...@r-project.org]
> On Behalf Of Duncan Murdoch
> Sent: Friday, December 03, 2010 11:13 AM
> To: nas...@uottawa.ca
> Cc: r-devel@r-project.org
> Subject: Re: [Rd] Competing with one's own work
>
> On 03/12/2010 10:57 AM, Prof. John C Nash wrote:
> >  No, this is not about Rcpp, but a comment in that overly long
discussion
> raised a question
> >  that has been in my mind for a while.
> >
> >  This is that one may have work that is used in R in the base
functionality
> and there are
> >  improvements that should be incorporated.
> >
> >  For me, this concerns the BFGS, Nelder-Mead and CG options of optim(),
> which are based on
> >  the 1990 edition (Pascal codes) of my 1979 book "Compact numerical
> methods...", which were
> >  themselves derived from other people's work. By the time Brian Ripley
took
> that work (with
> >  permission, even though not strictly required. Thanks!) there were
already
> some
> >  improvements to these same algorithms (mainly bounds and masks) in the
> BASIC codes of the
> >  1987 book by Mary Walker-Smith and I. However, BASIC to R is not
something
> I'd wish on
> >  anyone.
> >
> >  Now there are some R packages, including some I've been working on,
that
> do offer
> >  improvements on the optim() offerings. I would not say mine are yet
fully
> ready for
> >  incorporation into the base, but they are pretty close. Equally I think
> some of the tools
> >  in the base should be deprecated and users encouraged to try other routines.

Re: [Rd] Competing with one's own work

2010-12-03 Thread Paul Gilbert
At one time I lobbied for putting something in base or a required package, and 
it was suggested that the idea at the time was to remove things rather than add 
them. Generally, I agree that is a good idea, so I did not lobby more. 

When this question comes up it is always asked, and answered, in terms of 
putting things in. However, is there a process for moving things out to normal 
packages rather than keeping them in required packages or base?

Paul

>-Original Message-
>From: r-devel-boun...@r-project.org [mailto:r-devel-boun...@r-
>project.org] On Behalf Of Douglas Bates
>Sent: December 3, 2010 1:28 PM
>To: Ravi Varadhan
>Cc: r-devel@r-project.org; nas...@uottawa.ca
>Subject: Re: [Rd] Competing with one's own work
>
>On Fri, Dec 3, 2010 at 11:01 AM, Ravi Varadhan wrote:
>> Dear Duncan,
>
>> What constitutes a convincing argument for making significant changes?
>> Taking the example of optimization algorithms (say, for smooth
>objective
>> functions), how does one make a convincing argument that a particular
>class
>> of algorithms are "better" than another class? This can be a difficult
>task,
>> but quite doable with good benchmarking practices.
>
>> Supposing for the moment that such a convincing argument has been
>made, is
>> that sufficient to get the R-core to act upon it?  Are there
>compelling
>> factors other than just "algorithm A is better than algorithm B"?
>
>> I'd think that the argument is relatively easy if the need for the
>change is
>> driven by consumer demand. But, even here I am not sure how to make an
>> argument to the R-core to consider the big changes.  For example,
>there is a
>> reasonable demand for constrained (smooth) optimization algorithms in
>R
>> (based on R-help queries).  Currently, there are only 3 packages that
>can
>> handle this.  However, in the base distribution only `constrOptim'
>function
>> is provided, which cannot handle anything more than linear, inequality
>> constraints.  I think that the base distribution needs to have a
>package for
>> constrained optimization that can handle linear/nonlinear and
>> equality/inequality constraints.
>
>constrOptim is in the stats package, not the base package.  Functions
>that are already in the required packages are maintained by R core.
>If you know of bugs in such functions you should report them.  Because
>there is a heavy burden in maintaining the large corpus of software in
>R and its required packages, additions are viewed skeptically.
>Adopting new capabilities and new code in a required package like
>stats means that some member of R core has to be willing to maintain
>it.  If the capabilities can be incorporated in a contributed package
>then that is the preferred method of extending R. The burden of
>maintaining the code, fixing bugs or other infelicities, etc. is on
>the package maintainer.
>
>I don't see anything in what you are proposing that could not be
>incorporated in a contributed package.
>
>> John, thanks for raising an important issue.
>>
>> Thanks & Best,
>> Ravi.
>>
>> ---
>> Ravi Varadhan, Ph.D.
>> Assistant Professor,
>> Division of Geriatric Medicine and Gerontology School of Medicine
>Johns
>> Hopkins University
>>
>> Ph. (410) 502-2619
>> email: rvarad...@jhmi.edu
>>
>>
>> -Original Message-
>> From: r-devel-boun...@r-project.org [mailto:r-devel-boun...@r-
>project.org]
>> On Behalf Of Duncan Murdoch
>> Sent: Friday, December 03, 2010 11:13 AM
>> To: nas...@uottawa.ca
>> Cc: r-devel@r-project.org
>> Subject: Re: [Rd] Competing with one's own work
>>
>> On 03/12/2010 10:57 AM, Prof. John C Nash wrote:
>>> No, this is not about Rcpp, but a comment in that overly long
>discussion
>> raised a question
>>> that has been in my mind for a while.
>>>
>>> This is that one may have work that is used in R in the base
>functionality
>> and there are
>>> improvements that should be incorporated.
>>>
>>> For me, this concerns the BFGS, Nelder-Mead and CG options of
>optim(),
>> which are based on
>>> the 1990 edition (Pascal codes) of my 1979 book "Compact numerical
>> methods...", which were
>>> themselves derived from other people's work. By the time Brian Ripley
>took
>> that work (with
>>> permission, even though not strictly required. Thanks!) there were
>already
>> some
>>> improvements to these same algorithms (mainly bounds and masks) in
>the
>> BASIC codes of the
>>> 1987 book by Mary Walker-Smith and I. However, BASIC to R is not
>something
>> I'd wish on
>>> anyone.
>>>
>>> Now there are some R packages, including some I've been working on,
>that
>> do offer
>>> improvements on the optim() offerings. I would not say mine are yet
>fully
>> ready for
>>> incorporation into the base, but they are pretty close. Equally I
>think
>> some of the tools
>>> in the base should be deprecated and users encouraged to try other
>> routines. It is also
>>> getting more and more important that novice users be provided with
>> sensible guidance and robust default settings and choices.

Re: [Rd] Terminology clarification (Re: GPL and R Community Policies (Rcpp)

2010-12-03 Thread Dominick Samperi
Dirk,

Please let me know whether or not you will comply with my request to remove
references to my name in Rcpp (except copyright notices).

Thanks,
Dominick

On Thu, Dec 2, 2010 at 6:28 PM, Dominick Samperi wrote:

>
>
>> On Thu, Dec 2, 2010 at 5:58 PM, Dirk Eddelbuettel wrote:
>
>>
>> On 2 December 2010 at 17:23, Dominick Samperi wrote:
>> | OK, since you are so accommodating, then please remove all reference to
>> | my name from Rcpp as I do not want to be subject to arbitrary revisions
>> of
>> | my status. I may not have the right to say how my prior work will be
>> used,
>> | but I think I have the right to ask that my name not be used in the way
>> | it is used in the recent update.
>>
>> As I pointed out, you change your mind on this every 12 months, limiting
>> my
>> patience and willingness for these dances.  It has also been suggested by
>> others that attribution is clearer if you are listed as the maintainer of the
>> 2005/2006 code that we started from in 2008.
>>
>
> The change that this thread is a reaction to happened a few days ago, not
> 12 months ago. If I wavered in the past it was because I was being
> forced to compete with my own work, not a pleasant place to be.
>
> Are you telling me that you refuse to stop using my name
> in Rcpp (except in copyright notices)?
>
> Are you telling me that you will continue to use my name and
> update the associated status as you see fit, whether or not I
> approve or consent to those changes?
>
> Please answer yes or no.
>
> Thanks,
> Dominick
>
>
>




Re: [Rd] Competing with one's own work

2010-12-03 Thread Ben Bolker
Ravi Varadhan writes:

> 
> "The decision about whether it belongs in a package or in base R is 
> about who should maintain the code."  
> 
> Ok.  I understand it now.
> 
> Thanks,
> Ravi.
> 

   A point that may not have been made (sorry if it was and I missed it):

A better question might be how packages get added to the *recommended*
package list (rather than how code gets added to "base R").  Of the
16 recommended packages, 2 are maintained by R-core itself, 12 by various
R-core members acting as individuals (I assume), and 2 by non-R-core
people. It seems that if a contributed package sticks around long enough
and proves itself sufficiently useful and of sufficiently high quality
(and well enough maintained), it could then be suggested as
a recommended package.

i1 <- installed.packages()
i2 <- i1[!is.na(i1[, "Priority"]), ]
ff <- function(x) table(sapply(x[, "Package"], maintainer))
ff(i2[i2[, "Priority"] == "base", ])

    R Core Team
             12

ff(i2[i2[, "Priority"] == "recommended", ])

   Brian Ripley    7
Deepayan Sarkar    2
Doug and Martin    1
   Luke Tierney    1
Martin Maechler    1
         R-core    1
         R-core    1
     Simon Wood    1
 Terry Therneau    1



[Rd] Problem installing RCurl

2010-12-03 Thread Zhang,Jun
I have 64-bit R 2.12.0 installed on Solaris 10 on Sun Sparc. When I tried to
install RCurl, it failed with the following lines:

...
Version has CURLOPT_SSL_SESSIONID_CACHE
libcurl version: libcurl 7.19.6
configure: creating ./config.status
config.status: creating src/Makevars
** libs
cc -xc99 -m64 -xarch=sparcvis2 -I/apps/sparcv9/R-2.12.0/lib/R/include 
-I/opt/csw/include -DHAVE_LIBIDN_FIELD=1 -DHAVE_CURLOPT_URL=1 
-DHAVE_CURLINFO_EFFECTIVE_URL=1 .(omitted here is very long, all upper 
case) -DHAVE_CURLOPT_SSL_SESSIONID_CACHE=1 -I/opt/csw/include -KPIC
-xcode=abs64 -xlibmieee -xtarget=ultra3 -xarch=sparcvis2 -c base64.c -o base64.o
"/opt/csw/include/curl/curlrules.h", line 144: zero or negative subscript
"base64.c", line 25: warning: assignment type mismatch:
pointer to const char "=" pointer to unsigned char
"base64.c", line 39: warning: argument #1 is incompatible with prototype:
prototype: pointer to const char : 
"/apps/sparcv9/R-2.12.0/lib/R/include/Rinternals.h", line 1042
argument : pointer to unsigned char
"base64.c", line 60: warning: assignment type mismatch:
pointer to const char "=" pointer to unsigned char
cc: acomp failed for base64.c
make: *** [base64.o] Error 2
ERROR: compilation failed for package 'RCurl'
* removing '/apps/sparcv9/R-2.12.0/lib/R/library/RCurl'

The downloaded packages are in
'/tmp/Rtmpo67mNX/downloaded_packages'
Updating HTML index of packages in '.Library'
Warning message:
In install.packages("RCurl") :
  installation of package 'RCurl' had non-zero exit status
>

What is the problem?

Jun




Re: [Rd] Problem installing RCurl

2010-12-03 Thread Duncan Temple Lang

Hi Jun

On 12/3/10 2:15 PM, Zhang,Jun wrote:
> I have 64-bit R 2.12.0 installed on Solaris 10 on Sun Sparc. When I tried to
> install RCurl, it failed with the following lines:
> 
> ...
> Version has CURLOPT_SSL_SESSIONID_CACHE
> libcurl version: libcurl 7.19.6
> configure: creating ./config.status
> config.status: creating src/Makevars
> ** libs
> cc -xc99 -m64 -xarch=sparcvis2 -I/apps/sparcv9/R-2.12.0/lib/R/include 
> -I/opt/csw/include -DHAVE_LIBIDN_FIELD=1 -DHAVE_CURLOPT_URL=1 
> -DHAVE_CURLINFO_EFFECTIVE_URL=1 .(omitted here is very long, all 
> upper case) -DHAVE_CURLOPT_SSL_SESSIONID_CACHE=1 -I/opt/csw/include -KPIC
> -xcode=abs64 -xlibmieee -xtarget=ultra3 -xarch=sparcvis2 -c base64.c -o 
> base64.o
> "/opt/csw/include/curl/curlrules.h", line 144: zero or negative subscript

This error indicates that the compiler (cc with flags -xc99 -m64, etc.) sees
that the size of the 'long' data type in C is different from what was seen
when libcurl was configured, built and installed.

So basically the compiler and/or the compiler flags were different.

How was libcurl installed - from source or from a pre-built binary?
What compiler and flags were used?

  D.


> "base64.c", line 25: warning: assignment type mismatch:
> pointer to const char "=" pointer to unsigned char
> "base64.c", line 39: warning: argument #1 is incompatible with prototype:
> prototype: pointer to const char : 
> "/apps/sparcv9/R-2.12.0/lib/R/include/Rinternals.h", line 1042
> argument : pointer to unsigned char
> "base64.c", line 60: warning: assignment type mismatch:
> pointer to const char "=" pointer to unsigned char
> cc: acomp failed for base64.c
> make: *** [base64.o] Error 2
> ERROR: compilation failed for package 'RCurl'
> * removing '/apps/sparcv9/R-2.12.0/lib/R/library/RCurl'
> 
> The downloaded packages are in
> '/tmp/Rtmpo67mNX/downloaded_packages'
> Updating HTML index of packages in '.Library'
> Warning message:
> In install.packages("RCurl") :
>   installation of package 'RCurl' had non-zero exit status
>>
> 
> What is the problem?
> 
> Jun
> 
> 



Re: [Rd] Problem installing RCurl

2010-12-03 Thread Prof Brian Ripley

On Fri, 3 Dec 2010, Duncan Temple Lang wrote:



Hi Jun

On 12/3/10 2:15 PM, Zhang,Jun wrote:

I have 64-bit R 2.12.0 installed on Solaris 10 on Sun Sparc. When I tried to
install RCurl, it failed with the following lines:

...
Version has CURLOPT_SSL_SESSIONID_CACHE
libcurl version: libcurl 7.19.6
configure: creating ./config.status
config.status: creating src/Makevars
** libs
cc -xc99 -m64 -xarch=sparcvis2 -I/apps/sparcv9/R-2.12.0/lib/R/include 
-I/opt/csw/include -DHAVE_LIBIDN_FIELD=1 -DHAVE_CURLOPT_URL=1 
-DHAVE_CURLINFO_EFFECTIVE_URL=1 .(omitted here is very long, all upper 
case) -DHAVE_CURLOPT_SSL_SESSIONID_CACHE=1 -I/opt/csw/include -KPIC
-xcode=abs64 -xlibmieee -xtarget=ultra3 -xarch=sparcvis2 -c base64.c -o base64.o
"/opt/csw/include/curl/curlrules.h", line 144: zero or negative subscript


This error indicates that the compiler (cc with flags -xc99 -m64,
etc.) sees that the size of the 'long' data type in C is different from
what was seen when libcurl was configured, built and installed.


So basically the compiler and/or the compiler flags were different.

How was libcurl installed - from source or from a pre-built binary?
What compiler and flags were used?


The header is from a prebuilt binary (from OpenCSW).  That is built 
with gcc and not the Sun compiler.  And curlbuild.h says


/* Allow 32 and 64 bit headers to coexist */
#if defined __amd64 || defined __x86_64 || defined __sparcv9
#include "curlbuild-64.h"
#else
#include "curlbuild-32.h"
#endif

which AFAIK are gcc and not Sun defines.  You could try adding 
-D__sparcv9 to the CPPFLAGS, or compile RCurl with OpenCSW's gcc 
build (but 64-bit gcc is another can of worms).
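
A minimal sketch of the first suggestion, assuming the flags are supplied
via a per-user ~/.R/Makevars (read when packages are compiled):

  # ~/.R/Makevars
  CPPFLAGS = -I/opt/csw/include -D__sparcv9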


I've pointed out to Jun Zhang several times that 64-bit Sparc Solaris 
is really pushing it, and 32-bit R on Sparc Solaris has been much more 
successful.  Given that x86_64 boxes (Solaris or Linux) are so much 
faster at computation than Sparc ones, I don't see the point of 
building 64-bit Sparc Solaris R -- if 32-bit R is not enough you need 
a faster machine.




 D.



"base64.c", line 25: warning: assignment type mismatch:
pointer to const char "=" pointer to unsigned char
"base64.c", line 39: warning: argument #1 is incompatible with prototype:
prototype: pointer to const char : 
"/apps/sparcv9/R-2.12.0/lib/R/include/Rinternals.h", line 1042
argument : pointer to unsigned char
"base64.c", line 60: warning: assignment type mismatch:
pointer to const char "=" pointer to unsigned char
cc: acomp failed for base64.c
make: *** [base64.o] Error 2
ERROR: compilation failed for package 'RCurl'
* removing '/apps/sparcv9/R-2.12.0/lib/R/library/RCurl'

The downloaded packages are in
'/tmp/Rtmpo67mNX/downloaded_packages'
Updating HTML index of packages in '.Library'
Warning message:
In install.packages("RCurl") :
  installation of package 'RCurl' had non-zero exit status




What is the problem?

Jun







--
Brian D. Ripley,  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford, Tel:  +44 1865 272861 (self)
1 South Parks Road, +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
