I'm trying to import a table into R the file is about 700MB. Here's my first try:
> DD<-read.table("01uklicsam-20070301.dat",header=TRUE) Error: cannot allocate vector of size 15.6 Mb In addition: Warning messages: 1: In scan(file, what, nmax, sep, dec, quote, skip, nlines, na.strings, : Reached total allocation of 1535Mb: see help(memory.size) 2: In scan(file, what, nmax, sep, dec, quote, skip, nlines, na.strings, : Reached total allocation of 1535Mb: see help(memory.size) 3: In scan(file, what, nmax, sep, dec, quote, skip, nlines, na.strings, : Reached total allocation of 1535Mb: see help(memory.size) 4: In scan(file, what, nmax, sep, dec, quote, skip, nlines, na.strings, : Reached total allocation of 1535Mb: see help(memory.size) Then I tried > memory.limit(size=4095) and got > DD<-read.table("01uklicsam-20070301.dat",header=TRUE) Error: cannot allocate vector of size 11.3 Mb but no additional errors. Then optimistically to clear up the workspace: > rm() > DD<-read.table("01uklicsam-20070301.dat",header=TRUE) Error: cannot allocate vector of size 15.6 Mb Can anyone help? I'm confused by the values even: 15.6Mb, 1535Mb, 11.3Mb? I'm working on WinXP with 2 GB of RAM. Help says the maximum obtainable memory is usually 2Gb. Surely they mean GB? The file I'm importing has about 3 million cases with 100 variables that I want to crosstabulate each with each. Is this completely unrealistic? Thanks! Maja -- View this message in context: http://old.nabble.com/Error%3A-cannot-allocate-vector-of-size...-tp26282348p26282348.html Sent from the R help mailing list archive at Nabble.com. ______________________________________________ R-help@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-help PLEASE do read the posting guide http://www.R-project.org/posting-guide.html and provide commented, minimal, self-contained, reproducible code.