Dear R users
I am trying to create some new variables for a 4401 x 30 dataframe using
ddply and transform. The "id" variable I am using is a factor with 1330
levels, e.g.:
library(plyr)  # ddply lives in plyr

bb <- function(df) {
  transform(df,
            years   = study.year - min(study.year) + 1,
            periods = length(study.year))
}

test <- ddply(x, .(id), bb)
I haven't copied the real data here, to avoid clogging the list.
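A minimal self-contained version of what I'm running (the values are made up, but the structure mirrors my dataframe):

```r
library(plyr)

# synthetic stand-in for my real 4401 x 30 dataframe
x <- data.frame(
  id = factor(rep(c("a", "b"), each = 3)),
  study.year = c(2001, 2002, 2003, 1999, 2001, 2002)
)

bb <- function(df) {
  transform(df,
            years   = study.year - min(study.year) + 1,  # years since first study year, per id
            periods = length(study.year))                # number of observations, per id
}

test <- ddply(x, .(id), bb)
test
```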
The problem is that I get an error: "cannot allocate vector of size 128.0
Mb". I wouldn't have thought any of these were particularly large objects. Is
there a limit on how many splits ddply can handle (i.e. 1330 here)? Or, more
likely, am I doing something dumb?
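For what it's worth, I believe the same two variables can be computed without plyr using base R's ave(), which may sidestep the memory issue; a sketch on synthetic data (since I haven't posted the real dataframe):

```r
# synthetic stand-in for my real dataframe
x <- data.frame(
  id = factor(rep(c("a", "b"), each = 3)),
  study.year = c(2001, 2002, 2003, 1999, 2001, 2002)
)

# same derived variables, grouped by id, via base R's ave()
x$years   <- x$study.year - ave(x$study.year, x$id, FUN = min) + 1
x$periods <- ave(x$study.year, x$id, FUN = length)
```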
Grateful for any help.
best
simeon
I am using XP with 3 GB of RAM, R 2.9.1, and the most recent ggplot2.
> memory.size()
[1] 883.49
> memory.limit()
[1] 2047
> object.size(x) # size of dataframe
1410136 bytes
> gc()
           used (Mb) gc trigger  (Mb)  max used   (Mb)
Ncells   218229  5.9   11384284 304.0  15229393  406.7
Vcells  4404372 33.7   77750774 593.2 214556462 1637.0
______________________________________________
[email protected] mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.