On Tue, Mar 4, 2008 at 7:18 PM, Prof Brian Ripley <[EMAIL PROTECTED]>
wrote:
> A 64-bit version of R would be able to handle this (preferably with more
> RAM), but you don't appear to have one. Check what .Machine$sizeof.pointer
> says: I expect 4.
>
> On Tue, 4 Mar 2008, Randy Griffiths wrote:
>
>
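The check Prof Ripley suggests is a one-liner; a minimal sketch of it (run in any R session):

```r
## Pointer size in bytes: 4 on a 32-bit build of R, 8 on a 64-bit build.
## Only a 64-bit build can address more than ~4GB of memory.
.Machine$sizeof.pointer
```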
> Hello All,
>
> I have a very large data set (1.1GB) that I am trying to read into R.
What type of data do you have? Will it be numeric or factors? If it
is all numeric, then you will need over 4GB just to hold one copy of
the object (700,000 * 800 * 8). That is to hold the final object; I
don't know how much additional space is required during the
processing.
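The size estimate above is simple arithmetic: each numeric (double) value takes 8 bytes, so a single copy of the full matrix needs roughly:

```r
## 700,000 rows x 800 columns of doubles, 8 bytes each,
## expressed in gigabytes (1 GB = 1024^3 bytes):
700000 * 800 * 8 / 1024^3   # about 4.2 GB for one copy
```

and R typically needs additional working space beyond that while reading and converting the data.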
What are you going [message truncated]
Hello All,
I have a very large data set (1.1GB) that I am trying to read into R. The
file is tab delimited and contains headers; there are over 800 columns and
almost 700,000 rows. I am running R on Ubuntu 7.10 (Gutsy Gibbon), with
Linux kernel 2.6.22-14-generic and 3.1GB of RAM.
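For a tab-delimited file this size, read performance also improves markedly if R is told the column types and row count up front, so it can skip type-guessing and repeated reallocation. A hedged sketch, assuming the file is named "big.txt" and all 800 columns are numeric (both are placeholders, not details from the original post):

```r
## Hypothetical filename and column types; adjust to the real data.
## colClasses avoids per-column type detection; nrows pre-sizes the
## result; comment.char = "" disables comment scanning for extra speed.
dat <- read.delim("big.txt",
                  header = TRUE,
                  colClasses = rep("numeric", 800),
                  nrows = 700000,
                  comment.char = "")
```

Even so, on a 32-bit build with 3.1GB of RAM the full object will not fit, so this only helps once a 64-bit R (or a subset of the columns) is in play.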