Dear gbm users,
When running predict.gbm() on a "large" dataset (150,000 rows, 300 columns,
500 trees), I notice that R's memory usage grows beyond reasonable limits;
my 14 GB of RAM is often insufficient. I interpret this as a memory leak,
since there should be no reason for memory requirements to grow once the
data are loaded and passed to predict.gbm(). Is that a fair interpretation?
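
In case it helps with diagnosis, here is a sketch of the kind of chunked
workaround I have been considering, predicting on blocks of rows instead of
all 150,000 at once. The names fit and newdata are placeholders for my
actual gbm object and data frame, not code from my session:

    library(gbm)

    ## Predict in row chunks to cap peak memory use; gc() between
    ## chunks encourages R to return freed memory to the OS.
    predict_in_chunks <- function(fit, newdata, n.trees,
                                  chunk.size = 10000) {
        n <- nrow(newdata)
        preds <- numeric(n)
        for (s in seq(1, n, by = chunk.size)) {
            e <- min(s + chunk.size - 1, n)
            preds[s:e] <- predict(fit, newdata[s:e, , drop = FALSE],
                                  n.trees = n.trees)
            gc()
        }
        preds
    }

    p <- predict_in_chunks(fit, newdata, n.trees = 500)

This keeps each individual predict() call small, but of course it only
works around the symptom rather than explaining the growth.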


I am running R version 2.9.2 on Linux, with gbm package version 1.6-3.

Thanks,

Markus
