I monitored the memory usage of a script that I ran. The script runs 30K
regressions and stores the p-value for one of the coefficients from each
fit. It reads in a file with 3,000 rows and about 30K columns; the file
is about 170 MB on disk.

My understanding of the accounting lines below is that memory usage started out at 2.2 GB and grew to 23 GB:


cpu=00:03:08, mem=172.75822 GBs, io=0.00000, vmem=2.224G, maxvmem=2.224G
cpu=00:42:35, mem=29517.64894 GBs, io=0.00000, vmem=23.612G, maxvmem=23.612G

I know very little about how memory works, but I thought the hardest
part would be reading the file in. Could someone explain why there is
such a substantial increase over the course of the script?
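
In case it helps, here is a minimal sketch of the kind of loop the script runs. The file name, variable names, and model formula are placeholders, not my exact code:

```r
## Read the 3000 x ~30K table (file name is made up for illustration)
dat <- read.table("data.txt", header = TRUE)

## Regress a response on each column in turn, keeping one p-value per fit
y <- dat[[1]]
pvals <- numeric(ncol(dat) - 1)
for (i in 2:ncol(dat)) {
  fit <- lm(y ~ dat[[i]])
  ## p-value for the slope coefficient
  pvals[i - 1] <- summary(fit)$coefficients[2, 4]
}
```

The stored result (one numeric per regression) is tiny, which is why the growth to 23 GB surprises me.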

Thanks,

Juliet

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
