All of the memory allocations for the predictions use R's allocVector(); see
gbm_pred in gbmentry.cpp. That is, R's memory manager is doing all the
work. gbm_pred does not allocate memory separately from R; it just
creates R objects within R that can be deleted or garbage collected.
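
For what it's worth, here is a small R-level sketch of what that means in
practice (not from the package sources; `fit` and `newX` are placeholder
objects, and 500 trees is just an example): the vector that gbm_pred hands
back is an ordinary R object, so removing it and garbage collecting frees
the allocation.

p <- predict(fit, newdata = newX, n.trees = 500)
is.numeric(p)                        # a plain numeric vector, nothing external
print(object.size(p), units = "Mb")  # memory held by the prediction vector
rm(p)
gc()                                 # R's garbage collector reclaims it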

 

Make sure your gbm object does not have a "Terms" component. I don't
think it should if you called gbm.fit() directly. Without a Terms
component, predict.gbm() will not run model.frame() and shouldn't really
take so much memory.
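
As a quick check (just a sketch; `fit` stands for your fitted gbm object):

"Terms" %in% names(fit)   # should be FALSE for an object from gbm.fit()
str(fit$Terms)            # NULL when no Terms component is present

If that comes back TRUE, the model was most likely built through the formula
interface gbm(), and predict.gbm() will then push newdata through
model.frame().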

 

If you want to experiment with the latest development version, you can
find it at https://r-forge.r-project.org/projects/gbm/

 

Greg

 

________________________________

From: Markus Loecher [mailto:markus.loec...@gmail.com] 
Sent: Friday, October 30, 2009 6:17 AM
To: r-help@r-project.org
Cc: Ridgeway, Greg
Subject: possible memory leak in predict.gbm(), package gbm ?



Dear gbm users,
When running predict.gbm() on a "large" dataset (150,000 rows, 300
columns, 500 trees), I notice that the memory used by R grows beyond
reasonable limits. My 14 GB of RAM is often not sufficient. I interpret
this as a memory leak, since there should be no reason for memory use to
grow once the data are loaded and passed to predict.gbm()?
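
In case it helps, a scaled-down sketch of the kind of call I mean (sizes
reduced and data randomly generated here, unlike my real case):

library(gbm)
set.seed(1)
n <- 15000; p <- 30                      # scaled down from 150,000 x 300
x <- as.data.frame(matrix(rnorm(n * p), n, p))
y <- rnorm(n)
fit <- gbm.fit(x, y, distribution = "gaussian", n.trees = 500, verbose = FALSE)
gc(reset = TRUE)                         # baseline memory use
pred <- predict(fit, newdata = x, n.trees = 500)
gc()                                     # compare "max used" with the baseline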


Running R version 2.9.2 on Linux, gbm package 1.6-3.

Thanks,

Markus



