Yongchao Ge <[EMAIL PROTECTED]> writes:

> Why am I storing a large dataset in R? My program consists of two
> parts. The first part computes the intermediate results, which takes
> a lot of time. The second part contains many different functions to
> manipulate the intermediate results.
>
> My current solution is to save the intermediate results in a
> temporary file, but my final goal is to save them as an R object.
> The "memory leak" in .Call stops me from doing this, and I'd like to
> know if there is a clean solution for the R package I am writing.
Hi Yongchao,

There are many examples of packages that use .Call to create large
objects, and I don't think there is a "memory leak". One thing that may
be tripping you up is that, because of R's pass-by-value semantics, you
may end up with multiple copies of the object on the R side during some
of your operations.

I would recommend recompiling R with --enable-memory-profiling and using
tracemem() to see if you can identify places where copies of your large
object are being made. You can also take a look at
Rprof(memory.profiling = TRUE).

+ seth

-- 
Seth Falcon | Computational Biology | Fred Hutchinson Cancer Research Center
BioC: http://bioconductor.org/ Blog: http://userprimary.net/user/
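P.S. In case it helps, here is a minimal sketch of the tracemem()
workflow I have in mind; make_intermediate() and transform() are
made-up placeholders for your own functions:

    ## tracemem() only reports duplications if R was configured with
    ## --enable-memory-profiling.
    x <- make_intermediate()   # hypothetical; your large .Call result
    tracemem(x)                # start watching x for copies

    y <- x                     # no copy yet, just a second reference
    y[[1]] <- 0                # modifying y forces a duplication,
                               # which tracemem() reports on the console
    untracemem(x)

    ## To profile memory use across a whole computation:
    Rprof("prof.out", memory.profiling = TRUE)
    result <- transform(x)     # hypothetical second-stage function
    Rprof(NULL)
    summaryRprof("prof.out", memory = "both")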