On 07/28/2010 06:13 AM, Edwin Husni Sutanudjaja wrote:
Dear all,

I have a memory problem making a scatter plot of my dataset of 17.5 million
pairs. My intention is to use the "ggplot2" package with "geom_bin2d". Please
find the attached script for more details.

Could somebody please give me any clues or tips to solve this problem?
Just for additional information: I'm running my R script on my 32-bit machine:
Ubuntu 9.10, hardware: AMD Athlon Dual Core Processor 5200B, memory: 1.7GB.

Many thanks in advance.
Kind Regards,

You should try to get access to a fairly robust 64-bit machine, say with >= 8 GiB of real memory, and see what you can do. There is no chance on a 32-bit machine, and no chance on a 64-bit machine without sufficient real memory (you will be doomed to die by swap). Does your institution have a virtualization lab with the ability to allocate machines with large memory footprints? There is always Amazon EC2; you could experiment with sizing before buying that new workstation you've had your eye on.
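A quick back-of-envelope calculation shows why 1.7 GB is tight. This is only a rough sketch; the exact multiplier depends on your script, since ggplot makes several copies of the data while building the plot:

```r
## 17.5 million pairs of numeric (double) values, 8 bytes each
n     <- 17.5e6
bytes <- n * 2 * 8       # two numeric columns
bytes / 2^30             # roughly 0.26 GiB for the raw data alone
```

Several working copies of that, plus R itself and the rest of your session, will exhaust a 32-bit address space well before the plot is finished.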

Alternatively, you might take a much smaller sample of your data and massively decrease the size of the working set. I assume this is not what you want, though.
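As a sketch of the sampling approach: assuming your data is in a data frame `df` with columns `x` and `y` (names here are hypothetical; adjust to your attached script), something like this keeps the binned structure visible at a fraction of the memory cost:

```r
library(ggplot2)

## Hypothetical data frame `df` with numeric columns x and y.
## Draw a random subsample of 100,000 of the 17.5 million rows.
idx <- sample(nrow(df), 1e5)

ggplot(df[idx, ], aes(x, y)) +
  geom_bin2d(bins = 100)   # 2-D binned density, as in your original intent
```

For a bin2d plot a uniform random subsample usually preserves the overall density pattern, so the picture should look much the same as with the full data.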

Mark

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.