Hello,

I'm trying to track down the cause of some extreme memory usage, and I've been
using the lsos() function that Dirk Eddelbuettel posted on Stack Overflow.
There is a large discrepancy between R's RAM usage as reported by top:

PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND
6637 darstr    20   0 30.0g  29g 4712 S    0 63.2  10:34.43 R 

and what objects I have loaded in memory:

> lsos()
           Type      Size PrettySize     Rows Columns
A          list 552387720   526.8 Mb        2      NA
B   GRangesList 552376408   526.8 Mb        4      NA
C SimpleRleList 353421896     337 Mb       24      NA
D       GRanges 236410608   225.5 Mb 15272853      NA
E    data.frame   6981952     6.7 Mb    24966      14
F    data.frame   6782136     6.5 Mb    24966      13
G          list   4393704     4.2 Mb    24964      NA
H        matrix   3195760       3 Mb    24964      16
I          list   1798752     1.7 Mb    24964      NA
J       GRanges    312656   305.3 Kb    24964      NA

(The total looks like about 1.5 GB)
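That total is just the Size column summed; the same figure can be reproduced with base R alone (a sketch, assuming all of the objects of interest live in the global environment):

```r
## Sum the sizes of all objects in the global environment and
## report the result in gigabytes (object.size() returns bytes).
total_bytes <- sum(sapply(ls(envir = .GlobalEnv),
                          function(x) object.size(get(x, envir = .GlobalEnv))))
total_bytes / 1024^3
```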

I don't make any calls to external C code in my own R script, although the
Bioconductor packages I am using do. How can I reclaim those missing gigabytes?
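For what it's worth, gc() is one way to see what R itself thinks it is holding, independently of top (a sketch; gc() is base R, and the comments reflect my understanding rather than anything authoritative):

```r
## Force a full garbage collection and report R's own accounting.
## The "used" columns are live allocations; "max used" is the
## high-water mark since the last gc(reset = TRUE).
gc(reset = TRUE)

## Note: even after gc() frees memory inside R, the RES figure shown
## by top may not shrink, because malloc on Linux does not always
## return freed pages to the operating system.
```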

--------------------------------------
Dario Strbenac
Research Assistant
Cancer Epigenetics
Garvan Institute of Medical Research
Darlinghurst NSW 2010
Australia

______________________________________________
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
