I see no bug here. R is telling you that you have insufficient memory available: see also ?"Memory-limits".

See also rw-FAQ Q2.9.

Note that we don't have a reproducible example and in particular have no idea of how many columns this data frame has.

The 'R Data Import/Export' manual gives you many hints on how to do this more efficiently, and it is referenced from the help page. I see no sign that you have consulted it, so doing your homework is your next (belated) step.
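One concrete point worth noting about the transcript below: sample() applied to a data frame samples its elements, which are its columns, not its rows. Drawing 5000 columns with replacement from a 300,000-row frame builds an enormous object, which is exactly what produces the make.unique(cols) warnings and the exhausted allocation. A minimal sketch of both the efficient-import advice and row sampling, using a small made-up file in place of the poster's data300000.csv:

```r
## Hypothetical stand-in for the poster's single-column CSV
## (file name and sizes here are invented for illustration).
tmp <- tempfile(fileext = ".csv")
write.csv(data.frame(x = rnorm(1000)), tmp, row.names = FALSE)

## Declaring colClasses (and nrows, when known) lets read.csv skip
## type-guessing and over-allocation, as the Import/Export manual advises.
data <- read.csv(tmp, header = TRUE, colClasses = "numeric", nrows = 1000)

## sample(data, 5000, TRUE) would sample COLUMNS, creating a 5000-column
## frame (hence the make.unique(cols) warnings).  To sample rows instead,
## index the rows:
partial <- data[sample(nrow(data), 50, replace = TRUE), , drop = FALSE]

## ...or, for a single numeric column, sample the vector directly:
partial_vec <- sample(data$x, 50, replace = TRUE)
```

Either form keeps the result proportional to the requested sample size rather than to the size of the whole frame.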


On Sun, 17 Aug 2008, [EMAIL PROTECTED] wrote:

Hello

I am running Windows Vista 32 with 4 GB (installed, though Windows of course only recognizes 3326 MB, as reported by Windows "My Computer")

And only 2 GB of that is available to a user process.

I am running R 2.7.1

I was trying to read in a comma-delimited single-column CSV file, assign that file to a variable ("data"), and then extract a sample (assigned to "partial").  I was getting memory allocation errors, and from the log below I started to see a pattern which indicates that the "error" is related to how R views memory allocation, and perhaps how R is using or reading memory on my specific system.  The error, I would guess, is probably related to how R is reporting the memory available.

My source files have the same number of rows (or cases) as the name of the file.  Thus, data10000 has 10,000 observations, and data300000 has 300,000 observations.

Here is my history, which shows the problem and the inconsistency in memory allocation reporting:

 > data=read.csv("data10000.csv",header=FALSE)
 > partial=sample(data,5000,T)
 > data=read.csv("data100000.csv",header=FALSE)
 Error in file(file, "r") : cannot open the connection
 In addition: Warning message:
 In file(file, "r") :
   cannot open file 'data100000.csv': No such file or directory
 > data=read.csv("data300000.csv",header=FALSE)
 > partial=sample(data,5000,T)
 Error: cannot allocate vector of size 2.3 Mb
 In addition: Warning messages:
 1: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 2: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 3: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 4: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 > partial=sample(data,5,T)
 Error: cannot allocate vector of size 2.3 Mb
 In addition: Warning messages:
 1: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 2: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 3: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 4: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 > partial=sample(data,1,T)
 Warning messages:
 1: In attributes(.Data) <- c(attributes(.Data), attrib) :
   Reached total allocation of 1535Mb: see help(memory.size)
 2: In attributes(.Data) <- c(attributes(.Data), attrib) :
   Reached total allocation of 1535Mb: see help(memory.size)
 > partial=sample(data,1,T)
 > partial=sample(data,5000,T)
 Error: cannot allocate vector of size 2.3 Mb
 In addition: Warning messages:
 1: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 2: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 3: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 4: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 > partial[1:5,]
 [1]  0.3204279  1.6583593 -0.3456585  1.2951363 -1.1096974
 > rm(partial)
 > partial=sample(data,5000,T)
 Error: cannot allocate vector of size 2.3 Mb
 In addition: Warning messages:
 1: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 2: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 3: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 4: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 5: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 6: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 > partial[1:5,]
 Error: object "partial" not found
 > partial=sample(data,1,T)
 Error: cannot allocate vector of size 2.3 Mb
 In addition: Warning messages:
 1: In attributes(.Data) <- c(attributes(.Data), attrib) :
   Reached total allocation of 1535Mb: see help(memory.size)
 2: In attributes(.Data) <- c(attributes(.Data), attrib) :
   Reached total allocation of 1535Mb: see help(memory.size)
 3: In attributes(.Data) <- c(attributes(.Data), attrib) :
   Reached total allocation of 1535Mb: see help(memory.size)
 4: In attributes(.Data) <- c(attributes(.Data), attrib) :
   Reached total allocation of 1535Mb: see help(memory.size)
 > partial=sample(data,1,T)
 Warning messages:
 1: In attributes(.Data) <- c(attributes(.Data), attrib) :
   Reached total allocation of 1535Mb: see help(memory.size)
 2: In attributes(.Data) <- c(attributes(.Data), attrib) :
   Reached total allocation of 1535Mb: see help(memory.size)
 > partial=sample(data,1,T)
 > partial=sample(data,5000,T)
 Error: cannot allocate vector of size 2.3 Mb
 In addition: Warning messages:
 1: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 2: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 3: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 4: In names(y) <- make.unique(cols) :
   Reached total allocation of 1535Mb: see help(memory.size)
 >


I have been able to reproduce the situation several times.  For example, what is curious is that a command like partial=sample(data,1,T) may or may not produce a warning message.  In the history, obviously 2.3 Mb is well below the allocated 1535 Mb.  I had been getting this error on my original larger source data (1,000,000 observations), but since I am a new R user, it was not clear to me what these errors mean.

If you need the original source files, or have additional questions, please let me know.  I am a certified .NET software developer, and although I do not know everything about Windows development, I believe my abilities are above average.



Mark Tabladillo
Atlanta, GA
[EMAIL PROTECTED]
Alternate:  [EMAIL PROTECTED]



______________________________________________
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


--
Brian D. Ripley,                  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
