Re: [Rd] IDE for R C++ package writing ?

2007-02-24 Thread AJ Rossini

> On Fri, 2007-02-23 at 11:49, mel wrote:

> > I will be grateful for any advice on this tool topic.
> > Is Emacs the better choice? Or Code::Blocks?
> > Or another idea?
> > Does somebody have an idea of the most used IDEs for
> > R C++ package writing?

Emacs offers IDE capabilities through extensions.  See the CEDET library
and the ECB extension for code browsers, UML tools, etc.  JDEE (the Java
Development Environment for Emacs) is an excellent IDE for Java, and SLIME is
excellent for Common Lisp, but there isn't a truly excellent tool for C++ or
R at this time.  (ESS, IMHO, is as good as it gets at present, but let's
reserve the term "excellent" for things that deserve it, having stood the
long test of time and design standards, as Emacs and Common Lisp have.)

best,
-tony

[EMAIL PROTECTED]
Muttenz, Switzerland.
"Commit early,commit often, and commit in a repository from which we can 
easily
roll-back your mistakes" (AJR, 4Jan05).


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] IDE for R C++ package writing ?

2007-02-24 Thread Grant Izmirlian
On Windows the currently maintained version is XEmacs... the projects split
some time ago.
-- 
Grant Izmirlian
NCI
Հրանդ Իզմիրլյան

On Friday 23 February 2007 06:00, [EMAIL PROTECTED] wrote:
> [Rd] IDE for R C++ package writing ?
> Dear all,
> 
> I have to develop a (hopefully) small package for R in C++.
> I haven't coded in C++ for some years, and I'm now searching
> for an adequate IDE for this task.
> 
> Some of my criteria: not proprietary, not too heavy,
> available on Linux, not a Java gasworks, still maintained, etc.
> 
> After looking in several places
> http://en.wikipedia.org/wiki/List_of_C%2B%2B_compilers_and_integrated_development_environments
> http://www.freeprogrammingresources.com/cppide.html
> + the R docs
> I was thinking of Code::Blocks and Emacs (and perhaps Vim).
> 
> Emacs seems to be used by some R developers as an R editor.
> So I thought of Emacs, because it could be convenient
> to have the same editor for R code and C++ code.
> 
> However, the last Emacs Windows release seems to
> date from January 2004 ... (dead end ?)
> ftp://ftp.gnu.org/pub/gnu/emacs/windows/
> 
> I will be grateful for any advice on this tool topic.
> Is Emacs the better choice? Or Code::Blocks?
> Or another idea?
> Does somebody have an idea of the most used IDEs for
> R C++ package writing?
> 
> Thanks
> Vincent



Re: [Rd] IDE for R C++ package writing ?

2007-02-24 Thread Daniel Nordlund

> -Original Message-
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf
> Of Hin-Tak Leung
> Sent: Friday, February 23, 2007 5:40 AM
> To: mel
> Cc: r-devel@r-project.org
> Subject: Re: [Rd] IDE for R C++ package writing ?
> 
> I don't know if ess runs under xemacs, but historically,

I have used ESS under XEmacs on Windows.  I think John Fox still has some
documents and files available on the web for setting up XEmacs with ESS.  You
can try searching the R-help archives, or probably even just google Fox and
XEmacs.

Hope this is helpful,

Dan Nordlund
Bothell, WA  USA 

> xemacs (a fork of the emacs code) had Windows support earlier than
> GNU Emacs did, and obviously it is still being worked on,
> as the last version is from December 2006.
> 
> http://www.xemacs.org/Download/win32/
> 
> HTH



Re: [Rd] Depending on many packages: another best practice question

2007-02-24 Thread Dirk Eddelbuettel

On 23 February 2007 at 19:38, hadley wickham wrote:
| ggplot currently requires 13 packages (grid, reshape, RColorBrewer,
| proto, splines, MASS, Hmisc, boot, butler, hexbin, mapproj, quantreg,
| sm).  Some of these are absolutely necessary (eg. proto), but most are
| used for one or two specific tasks (eg. boot is only used to get
| plogis, used for logit scales).
| 
| Do you think I should make them all "depends" packages, or "suggests"
| packages, and then manually test for package presence before using a
| certain function?  What is easier for users?

Rcmdr used to have hard Depends.  Given that I maintained Rcmdr in Debian, I
had to add a lot of additional packages to Debian only to cover Rcmdr's build
requirements.

John later changed that to Suggests:

   Depends: R (>= 2.1.0), tcltk, grDevices, utils
   Suggests: abind, car (>= 1.2-1), effects (>= 1.0-7), foreign, grid,
   lattice, lmtest, MASS, mgcv, multcomp (>= 0.991-2), nlme, nnet, relimp,
   rgl, RODBC 

which he then tests for in Startup.R.  I think Graham's rattle does something
similar.
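
For readers unfamiliar with the pattern, a minimal sketch of such a run-time
check might look like this (illustrative only -- this is not Rcmdr's or
rattle's actual code, and `need_package` is a made-up helper name):

```r
# Sketch of the Suggests-plus-runtime-check pattern: load a suggested
# package on demand, failing with a clear message if it is missing.
# (Illustrative helper, not actual Rcmdr/rattle code.)
need_package <- function(pkg) {
  if (!require(pkg, character.only = TRUE, quietly = TRUE))
    stop("package '", pkg, "' is needed for this feature; please install it")
  invisible(TRUE)
}

need_package("stats")   # succeeds: 'stats' ships with every R installation
```

A feature that needs, say, hexbin would call `need_package("hexbin")` at the
top of the relevant function, so only users of that feature pay the cost.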

I think you will get confused users either way, as it is impossible to please
all the people all the time.  Foolproof methods only attract smarter fools.

Hope this helps, Dirk

-- 
Hell, there are no rules here - we're trying to accomplish something. 
  -- Thomas A. Edison



Re: [Rd] Functions that write functions in R packages

2007-02-24 Thread hadley wickham
I'm trying to write wrappers for proto functions (e.g. GeomPoint$new())
so that users don't notice that they're using a proto function (i.e. they
use geom_point() instead).  I'm hoping I can wrap proto up sufficiently
that only developers need to worry that ggplot uses a completely
different OO system.
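
As a stand-in sketch of the idea (illustrative only -- not ggplot's actual
code; a plain list plays the role of the proto object here):

```r
# Hiding an OO system behind an ordinary-looking function.
# GeomPoint is a stand-in "object" with a constructor; users only ever
# see geom_point() and never touch GeomPoint$new() directly.
GeomPoint <- list(
  objname = "point",
  new = function(...) list(geom = "point", params = list(...))
)

# The public wrapper: a plain function users can call and document.
geom_point <- function(...) GeomPoint$new(...)

p <- geom_point(size = 2)
```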

Hadley

On 2/23/07, Gabor Grothendieck <[EMAIL PROTECTED]> wrote:
> Not sure what the setup is here but if the objects are
> intended to be proto objects then the accessor functions
> could be placed in the object itself (or in an ancestor object)
> rather than in the global environment.  For example, this inserts
> a function get.v(.) into proto object p for each variable v in p.
>
> library(proto)
>
> make.accessors <- function(p, ...) {
>    lapply(ls(p, ...), f. <- function(v) {
>       nm <- paste("get", v, sep = ".")
>       p[[nm]] <- function(.) {}
>       body(p[[nm]]) <- substitute(.$v, list(v = v))
>       environment(p[[nm]]) <- p
>    })
>    invisible(p)
> }
> p <- proto(x = 1, y = 2)
> make.accessors(p)
> p$get.x()
> p$get.y()
>
> # or the constructor of objects like p could build it right in
> # at object construction time
> make.p <- function(...) make.accessors(proto(...))
> q <- make.p(x = 1, y = 2)
> q$get.x()
> q$get.y()
>
>
> On 2/23/07, hadley wickham <[EMAIL PROTECTED]> wrote:
> > Dear all,
> >
> > Another question related to my ggplot package:  I have made some
> > substantial changes to the backend of my package so that plot objects
> > can now describe themselves much better.  A consequence of this is
> > that a number of convenience functions that previously I wrote by
> > hand, can now be written automatically.  What is the best practice for
> > creating these functions for bundling in a package?  I see three
> > possible solutions:
> >
> >  * dump function specifications out to a .r file
> >  * dynamically create at package build time so they are included in
> >    the package rdata file
> >  * dynamically create at package load time
> >
> > Can anyone offer any advice as to which is preferable? (or if there's
> > a better way I haven't thought of)
> >
> > My code currently looks like this (experimenting with two ways of
> > creating the functions)
> >
> > create_accessors <- function(objects, name, short=NULL) {
> >    lapply(objects, function(x) {
> >       assign(paste(name, x$objname, sep="_"), x$new, pos=globalenv())
> >       if (!is.null(short)) {
> >          eval(
> >             substitute(
> >                f <- function(plot, ...) plot + add(...),
> >                list(
> >                   add = as.name(paste(name, x$objname, sep="_")),
> >                   f = as.name(paste(short, x$objname, sep=""))
> >                )
> >             ), envir = globalenv()
> >          )
> >       }
> >    })
> > }
> >
> > Thanks,
> >
> > Hadley
> >
> > __
> > R-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel
> >
>



Re: [Rd] Functions that write functions in R packages

2007-02-24 Thread Gabor Grothendieck
How about something like this: here we put the accessors in
.GlobalEnv at object construction time, but you could
alternatively place them into package:ggplot or elsewhere on
the search path:

library(proto)

make.accessors <- function(p, e = p, ...) {
  lapply(ls(p, ...), function(v) {
    if (is.function(get(v, p))) e[[v]] <- do.call("$.proto", list(p, v))
  })
  invisible(p)
}
p <- proto(x = function(.) 1, y = function(.) 2)
make.accessors(p, .GlobalEnv)
x()
print(x)
y()
print(y)
rm(x, y)

# or the constructor of objects like p could build it right in
# at object construction time
make.p <- function(..., e = .GlobalEnv) make.accessors(proto(...), e = e)
q <- make.p(x = function(.) 1, y = function(.) 2)
x()
print(x)
y()
print(y)

On 2/24/07, hadley wickham <[EMAIL PROTECTED]> wrote:
> I'm trying to make wrappers to proto functions (eg. GeomPoint$new())
> so that user don't notice that they're using a proto function (ie. use
> geom_point()) instead.  I'm hoping I can wrap proto up sufficiently
> that only developers need to worry that ggplot uses a completely
> different oo system.
>
> Hadley



Re: [Rd] R crashes in Mac OS

2007-02-24 Thread Simon Urbanek

On Feb 18, 2007, at 2:08 PM, Giusi Moffa wrote:

> I am running R under Mac OS X (Tiger, v10.4). Unfortunately it keeps
> crashing while using the editor. One thing that seems to make things
> worse is having more than one script open at the same time. Can
> anyone help?
>

Please send me a full crash report.  Also please try to find a
reproducible example if you can.

Thanks,
Simon



Re: [Rd] Depending on many packages: another best practice question

2007-02-24 Thread Prof Brian Ripley
On Fri, 23 Feb 2007, hadley wickham wrote:

> Dear all,
>
> ggplot currently requires 13 packages (grid, reshape, RColorBrewer,
> proto, splines, MASS, Hmisc, boot, butler, hexbin, mapproj, quantreg,
> sm).  Some of these are absolutely necessary (eg. proto), but most are
> used for one or two specific tasks (eg. boot is only used to get
> plogis, used for logit scales).

Hmm, there is no plogis in boot, but there is in stats.

> Do you think I should make them all "depends" packages, or "suggests"
> packages, and then manually test for package presence before using a
> certain function?  What is easier for users?

The second, especially as from 2.5.0 only the 'Depends' and 'Imports' are
installed by default.

What you have not mentioned is that those packages also have dependencies.

Using 'Depends' on a non-CRAN package (e.g. hexbin) is definitely awkward 
for the user/sysadmin and I would try to avoid it.

I've been here with ggplot for my GGobi class, and a smaller set of
dependencies would have been helpful.  Do you really need reshape, for example?

-- 
Brian D. Ripley,                  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



Re: [Rd] Depending on many packages: another best practice question

2007-02-24 Thread hadley wickham
> > ggplot currently requires 13 packages (grid, reshape, RColorBrewer,
> > proto, splines, MASS, Hmisc, boot, butler, hexbin, mapproj, quantreg,
> > sm).  Some of these are absolutely necessary (eg. proto), but most are
> > used for one or two specific tasks (eg. boot is only used to get
> > plogis, used for logit scales).
>
> Hmm, there is no plogis in boot, but there is in stats.

Oops, I had originally included it to use logit and inv.logit, but
then realised I could use plogis etc instead.  That's one dependency
down.
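
For the record, stats::qlogis() is the logit function and stats::plogis() is
its inverse (the logistic CDF), so the replacement is exact:

```r
# qlogis()/plogis() from the always-available stats package reproduce
# the logit / inverse-logit pair that boot provides as logit()/inv.logit().
x <- 0.8
stopifnot(isTRUE(all.equal(plogis(qlogis(x)), x)))  # round trip

qlogis(0.5)   # logit(0.5)      = 0
plogis(0)     # inverse logit(0) = 0.5
```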

>
> > Do you think I should make them all "depends" packages, or "suggests"
> > packages, and then manually test for package presence before using a
> > certain function?  What is easier for users?
>
> The second, especially as from 2.5.0 the 'Depends' and 'Imports' are
> installed by default.

Ok, that makes sense then.

> What you have not mentioned is that those packages also have dependencies.
>
> Using 'Depends' on a non-CRAN package (e.g. hexbin) is definitely awkward
> for the user/sysadmin and I would try to avoid it.

It's frustrating enough for the Windows user, as Suggests packages
seem to get installed by default as well.

> I've been here with ggplot for my GGobi class, and a smaller set of
> dependencies would have been helpful.  Do you really need reshape, for example?

Reshape powers the facetting, so it is essential.  It also reflects my
philosophy of separating data manipulation from display, as opposed to
lattice, which often combines the two (there are advantages and
disadvantages to both, of course).

I think the minimum I need is grid, reshape, and proto, none of which
have any further dependencies.

Hadley



Re: [Rd] Depending on many packages: another best practice question

2007-02-24 Thread Prof Brian Ripley
On Sat, 24 Feb 2007, hadley wickham wrote:

>> What you have not mentioned is that those packages also have dependencies.
>> 
>> Using 'Depends' on a non-CRAN package (e.g. hexbin) is definitely awkward
>> for the user/sysadmin and I would try to avoid it.
>
> It's frustrating enough for the windows user, as suggests packages
> seem to get installed by default as well.

Prior to 2.5.0, yes, but not in future.


-- 
Brian D. Ripley,                  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
