Dear all,

I am working with large matrices (19.6 million elements * 1000 simulations) and am trying to get around memory problems and vector length issues. I have split the inputs so that the length of each output vector will not exceed 2^31. Even on a 64-bit machine with 80 GB of RAM, I still get close to the memory limits when the output is allowed to accumulate in memory (as would be expected).
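For scale: 19.6 million elements * 1000 simulations is roughly 1.96e10 values in total, while 2^31 is about 2.15e9, so the split works out to about ten pieces of just under 2e9 values (roughly 100 simulations' worth) each.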
Originally, I planned to write the output to a file after it had been produced in memory. It took 15 minutes for the output to be produced, but it has now been writing to the file for almost an hour (and it is still going).

Is the recommended way to manage large output like this to write it directly to a file? Can that be done as the output is produced, so that memory usage does not build up (i.e., so the full result is never stored in memory)? Is that what sink() is designed to do? I have been looking for information on this in the help archive and in the R Data Import/Export manual (http://stat.ethz.ch/R-manual/R-devel/doc/manual/R-data.html), but it is still not clear to me.

Many thanks for your guidance.

Some more info:

> dim(stPte801)
NULL                                    # it's a vector, not a matrix
> length(stPte801)
[1] 1965705000
> write(stPte801, "stPte801.txt", sep = "\n")
# this has been writing for almost an hour...
# eventually I will need to pull it back into R for the next step
# (but only after the other variables have been created)
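To make the question concrete, the kind of thing I am imagining is sketched below. This is only an untested sketch: generateChunk() is a stand-in for the step that currently builds stPte801 all at once (say, one call per simulation), and I don't know whether this is the right or idiomatic way to do it.

con <- file("stPte801.txt", open = "w")   # open the output file once
for (i in 1:1000) {                       # e.g. one chunk per simulation
    chunk <- generateChunk(i)             # numeric vector for piece i (hypothetical)
    writeLines(format(chunk), con)        # one value per line, like write(..., sep = "\n")
    rm(chunk)                             # drop the chunk before producing the next one
}
close(con)

Or, since the file only ever needs to be read back into R later, would it be better (and faster) to use a binary connection with writeBin()/readBin() instead of a text file?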