On 27/11/2009 3:36 PM, Alexander Søndergaard wrote:
Hi all,

I'm new to R. Having a functional background, I was wondering what's
the idiomatic way to iterate. It seems that for loops are the default
given there's no tail-call optimization.

I'm curious to know whether there is a way to transform the following
toy snippet into something that doesn't eat up gigabytes of memory
(like its for-loop counterpart) using laziness:

Reduce('+', seq(1,1e6))

I believe the iterators and foreach packages give ways to iterate without creating the whole vector, so they might do what you want.

But is the allocation really a problem on modern computers? The for-loop version of your example, i.e.

total <- 0
for (i in seq(1, 1e6)) total <- total + i
total

uses about 4 megabytes of memory, not "gigabytes". If you increase the limit from 1e6 to 1e9, you'll get gigabytes, but it will probably take tens of minutes to finish on a system that can run it at all: the big problem is the interpreted looping, not the memory use.
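For what it's worth, the iterators/foreach route might look like the sketch below (assuming the foreach and iterators packages from CRAN are installed; the exact call is mine, not something from the original question):

```r
library(foreach)
library(iterators)

# icount(n) is a lazy counter: it hands out 1, 2, ..., n one value
# at a time, so the full sequence is never materialized in memory.
total <- foreach(i = icount(1e4), .combine = `+`) %do% i
total  # 50005000, i.e. the sum of 1:1e4
```

Note that %do% still pays the interpreted-loop cost on every element, so this saves memory, not time.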

R has always followed the strategy of making it easy to jump out to C code when speed matters, so the natural idiom for your problem is to use sum(), not to program it yourself. In more complicated examples, or if you really do want to sum the integers from 1 to a billion on a system with limited memory, you'll have to write the C yourself, but it's not hard.
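To make that concrete, the idiomatic version is a one-liner (the as.numeric() is a small wrinkle worth knowing: sum() over an integer vector whose total exceeds .Machine$integer.max returns NA with an overflow warning):

```r
# sum() runs its loop in compiled C code: fast, and the only cost
# is the vector itself (about 8 MB of doubles for 1e6 elements).
sum(as.numeric(1:1e6))   # 500000500000

# For 1 to 1e9 there's no need for a vector (or for C) at all --
# the closed form n*(n+1)/2 uses constant memory:
n <- 1e9
n * (n + 1) / 2
```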

Duncan Murdoch


Thanks!

Best regards,
A.S.

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
