On Sun, Nov 4, 2012 at 1:06 PM, nima82 <nima.mehrafs...@gmail.com> wrote:
> Dear all,
>
> I am puzzled by R's memory usage when calling the blackboost function from
> package mboost to estimate a gradient boosting model on a simulated dataset
> with 20 correlated variables and 100,000 observations. The blackboost object
> created by the function is only 15.3 Mb, but R's memory usage increases by
> about 3.9 Gb during the estimation of the model, and the memory is not
> released even after calling the garbage collector with gc() or saving and
> reloading the workspace into a new R session. I wonder what is causing this
> behavior and whether there is a way to free up the extra memory. I would
> appreciate any thoughts, since I would really like to use this function.
>
> I already posted a similar question on Stack Overflow
> <http://stackoverflow.com/questions/13195733/how-can-i-remove-invisible-objects-form-an-r-workspace-that-are-not-removed-by-g>,
> but haven't received any solutions yet.
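(For context, a minimal sketch of the kind of setup described above; the correlation structure, the response model, and the default blackboost() settings are assumptions, not the original poster's code.)

    ## Sketch only: simulate 100,000 observations with 20 correlated
    ## predictors and fit a blackboost model. All data-generation
    ## details below are assumed.
    library(mboost)
    library(MASS)   # mvrnorm() for correlated predictors

    set.seed(1)
    n <- 100000
    p <- 20
    Sigma <- 0.5 ^ abs(outer(1:p, 1:p, "-"))  # assumed AR(1)-style correlation
    X <- mvrnorm(n, mu = rep(0, p), Sigma = Sigma)
    colnames(X) <- paste0("x", 1:p)
    dat <- data.frame(y = drop(X %*% rnorm(p)) + rnorm(n), X)

    fit <- blackboost(y ~ ., data = dat)      # boosting with tree base learners

    print(object.size(fit), units = "Mb")     # the returned object itself is small
    gc()                                      # reported memory use stays high after fitting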
Hi Nima,

Looking briefly over this, it seems that there is possibly a memory allocation error somewhere in the C routines called by mboost. I'd suggest you contact the package maintainer directly with a reproducible example. To get contact info, use the maintainer() function.

Cheers,
Michael

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
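For example, the maintainer's name and email address can be read straight from the installed package's DESCRIPTION file (a quick illustration of the maintainer() call suggested above):

    maintainer("mboost")
    ## or, equivalently:
    packageDescription("mboost")$Maintainer

Both maintainer() and packageDescription() live in the utils package that ships with base R, so no extra installation is needed.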