Re: [R] Memory usage in read.csv()

2010-01-20 Thread nabble . 30 . miller_2555
Hi Jim & Gabor - It was apparently a hardware issue (shortly after sending my last e-mail, the computer died). After buying a new system and restoring, the script runs fine. Thanks for your help! On Tue, Jan 19, 2010 at 2:02 PM, jim holtman - jholt...@gmail.com

Re: [R] Memory usage in read.csv()

2010-01-19 Thread Gabor Grothendieck
You could also try read.csv.sql in sqldf. See the examples on the sqldf home page: http://code.google.com/p/sqldf/#Example_13._read.csv.sql_and_read.csv2.sql On Tue, Jan 19, 2010 at 9:25 AM, wrote: > I'm sure this has gotten some attention before, but I have two CSV > files generated from vmstat and f
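[Editorial aside, not part of the original message: a minimal sketch of the read.csv.sql approach. The file name vmstat.csv and the "free" column used in the filter are assumptions for illustration; adjust to the actual file.]

# read.csv.sql loads the file into a temporary SQLite database and returns
# only the rows selected by the SQL, so the full file never has to be
# materialised as R vectors all at once.
library(sqldf)
vm <- read.csv.sql("vmstat.csv",
                   sql = "select * from file where free > 0",
                   header = TRUE, sep = ",")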

Re: [R] Memory usage in read.csv()

2010-01-19 Thread jim holtman
I read vmstat data in just fine without any problems.  Here is an example of how I do it:

VMstat <- read.table('vmstat.txt', header=TRUE, as.is=TRUE)

vmstat.txt looks like this:

date time r b w swap free re mf pi po fr de sr intr syscalls cs user sys id
07/27/05 00:13:06 0 0 0 27755440 13051648
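[Editorial aside, not part of the original message: given the memory question in this thread, a quick way to check what the call above actually returns and how much memory it occupies, assuming the same vmstat.txt layout.]

# Read the whitespace-delimited vmstat log, keeping character columns
# as character rather than converting them to factors (as.is = TRUE).
VMstat <- read.table('vmstat.txt', header = TRUE, as.is = TRUE)
print(object.size(VMstat), units = "Mb")  # memory footprint of the result
str(VMstat)                               # confirm the column types came in as expected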

[R] Memory usage in read.csv()

2010-01-19 Thread nabble . 30 . miller_2555
I'm sure this has gotten some attention before, but I have two CSV files generated from vmstat and free that are roughly 6-8 Mb (about 80,000 lines) each. When I try to use read.csv(), R allocates all available memory (about 4.9 Gb) when loading the files, which is over 300 times the size of the ra
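[Editorial aside, not a reply from this thread: a common first mitigation for this kind of blow-up is to tell read.csv the column types up front and give it an upper bound on the row count, so it does not have to guess and over-allocate while scanning. A minimal sketch; the file name, column layout, and row count are assumptions for illustration.]

# Hypothetical layout: two text columns followed by 18 numeric columns.
vm <- read.csv("vmstat.csv",
               header = TRUE,
               colClasses = c("character", "character", rep("numeric", 18)),
               nrows = 80000,        # rough upper bound; helps read.csv preallocate
               comment.char = "")    # disable comment scanning
print(object.size(vm), units = "Mb")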