Hi,
I'd like to know how to estimate the memory actually used by some
function call.
The function essentially contains a for loop, which stops after a
variable number of iterations, depending on the input data.
I used Rprof with memory.profiling=TRUE, but the memory results seem to
increase with the
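A minimal sketch of the Rprof approach described above; `my.function` and `input.data` here are made-up stand-ins for the poster's actual function and data, and the output file name "mem.out" is arbitrary:

```r
## Stand-in for the poster's function: a loop that allocates on each pass
my.function <- function(x) {
  for (i in seq_len(100)) y <- sort(c(x, runif(1e5)))
  invisible(NULL)
}
input.data <- runif(1e4)

Rprof("mem.out", memory.profiling = TRUE)  # start sampling with memory data
my.function(input.data)
Rprof(NULL)                                # stop profiling
s <- summaryRprof("mem.out", memory = "both")  # timings plus memory per call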
Hi Sean,
I know I'll have to optimize the memory management (maybe using the
proto or R.oo packages), but for the moment I'd like to estimate the
amount of memory actually used by the call.
I got an estimate by doing:
g1 <- gc(reset=TRUE)         # reset the "max used" counters
my.function(input.data)
g2 <- gc()
sum(g2[,6] - g1[,2])         # peak Mb used by the call (Ncells + Vcells)
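As a sanity check of the gc()-based approach, the estimate can be compared against an allocation of known size. `big.alloc` below is a made-up example; columns 2 and 6 of the matrix returned by gc() are the current and maximum usage in Mb:

```r
big.alloc <- function() {
  x <- numeric(1e7)  # 1e7 doubles, about 76 Mb
  sum(x)
}

g1 <- gc(reset = TRUE)          # reset the "max used" counters
big.alloc()
g2 <- gc()
est <- sum(g2[, 6] - g1[, 2])   # peak Mb used by the call
est                              # roughly 76 Mb here
```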
Hi.
I'm quite close to a release of OCaml-R. Before that, however, I want to
make the build system rely on autoconf / automake. So I'm asking myself
and this mailing list whether there exists a set of "blessed by
the community" autoconf macros.
For OCaml, there are these:
http://rw
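Failing a community-blessed set, a minimal configure.ac along the lines below is a common starting point. It assumes the ocaml.m4 macro file (which defines AC_PROG_OCAML and AC_PROG_FINDLIB) has been copied into an m4/ directory; the package name and version are placeholders:

```m4
dnl configure.ac sketch -- assumes m4/ocaml.m4 is present
AC_INIT([ocaml-r], [0.1])
AC_CONFIG_MACRO_DIR([m4])
AM_INIT_AUTOMAKE([foreign])

AC_PROG_OCAML                 dnl finds ocamlc/ocamlopt, sets OCAMLVERSION
AS_IF([test "$OCAMLC" = "no"],
      [AC_MSG_ERROR([OCaml compiler not found])])
AC_PROG_FINDLIB               dnl finds ocamlfind

AC_CONFIG_FILES([Makefile])
AC_OUTPUT
```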
L.S.
In the current version of p.adjust.Rd, one needs
to scroll down to the examples section to find
confirmation of one's guess that "fdr" is an
alias of "BH".
Please find attached a patch which mentions
this explicitly.
Best,
Tobias
32c32
< Hochberg (1995) (\code{"BH"}), and Benjamin
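The alias can be verified directly in R: "fdr" and "BH" give identical adjusted p-values.

```r
p <- c(0.001, 0.01, 0.02, 0.04, 0.2, 0.5)  # arbitrary example p-values
identical(p.adjust(p, method = "fdr"),
          p.adjust(p, method = "BH"))       # TRUE: "fdr" is an alias of "BH"
```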
csiro.au> writes:
>
> That is interesting. The first of these, namely
>
> sum(|x_i - y_i|) / sum(x_i + y_i)
>
> is now better known in ecology as the Bray-Curtis distance. Even more
> interesting is the typo in Henry &
> Stevens "A Primer of Ecology in R" where the Bray Curtis distance formul
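For reference, the quoted formula is straightforward to compute in R for non-negative abundance vectors; the vectors below are arbitrary example data:

```r
## Bray-Curtis distance: sum(|x_i - y_i|) / sum(x_i + y_i)
bray_curtis <- function(x, y) sum(abs(x - y)) / sum(x + y)

x <- c(6, 7, 4)
y <- c(10, 0, 6)
bray_curtis(x, y)   # (4 + 7 + 2) / 33 = 13/33, about 0.394
```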