Dear R users,

At the moment I am extracting my data dynamically from the raw files,
but this takes a very long time.
To avoid repeating that work, I plan to generate the data set once and
store it: it is 1500 x 20000, with each data point a 9-digit decimal
number.
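Something like the following is what I have in mind (a minimal sketch
only; the real values would of course come from my raw files, and I am
treating a "9-digit decimal" as a double with 9 significant digits):

  n_rows <- 1500
  n_cols <- 20000
  # placeholder values standing in for the extracted data
  m <- matrix(signif(runif(n_rows * n_cols, 0, 1e9), 9), nrow = n_rows)
  # write once, then reload quickly in later sessions
  save(m, file = "dataset.RData")
  # load("dataset.RData")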
I know a single vector in R is limited to 2^31 - 1 elements, and my
data set (30 million values) will not come close to that limit. But my
laptop has only 2 GB of RAM and runs 32-bit Windows (XP or Vista).
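My back-of-the-envelope arithmetic, assuming the data are stored as a
numeric (double) matrix at 8 bytes per element:

  1500 * 20000             # 30,000,000 elements, well under 2^31 - 1
  1500 * 20000 * 8 / 2^20  # about 229 MB for the matrix itself

  memory.limit()           # on 32-bit Windows, reports R's cap in MB

So the object alone should take around 229 MB, though I realise that
copies made during computation could easily multiply that.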

I have run into R memory problems before, so I would appreciate your
opinion, based on your experience, on whether this will work.
Thanks a lot!

- John
