You can use rm() to remove objects. Are you remembering to detach() after each attach()? For that matter, why use attach() at all? (I personally avoid it.) Consider with(), or the data= argument to lm(), instead. Look at the size of the objects you are working with (object.size()) to see where the memory is going, and use search() to see what is still attached. As I understand it, as long as a data frame is attached, its memory is not freed after rm() until you detach() it.
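Concretely, something like the following (a sketch only; the file names and column names are placeholders, not from your data):

```r
## Read, combine, then drop the per-file copies before modelling.
files <- c("part1.txt", "part2.txt")        # hypothetical input files
tabs  <- lapply(files, read.table, header = TRUE)
all_data <- do.call(rbind, tabs)

rm(tabs)   # the individual tables are no longer needed
gc()       # prompt R to release the freed memory

## Preferred: skip attach() entirely -- lm() looks up the columns
## in all_data via its data= argument.
fit <- lm(y ~ x1 + x2, data = all_data)     # hypothetical columns

## If you do attach(), remember it puts a *copy* of the data frame on
## the search path, so rm(all_data) alone will not free that copy:
attach(all_data)
fit2 <- lm(y ~ x1)
detach(all_data)   # without this, the attached copy stays in memory

object.size(all_data)   # where is the memory going?
search()                # is anything still attached?
```

The point of the detach() line is exactly the caveat above: rm() removes the binding in your workspace, but an attached copy on the search path keeps the memory alive until it is detached.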
On Tue, Jun 30, 2009 at 5:15 AM, gug <guygr...@netvigator.com> wrote:
>
> Hello,
>
> Is there a command for freeing up the memory used by R in holding data
> tables?
>
> The structure of the procedure I have is as follows:
>
> 1) Read multiple txt files in using read.table(...).
> 2) Combine the read tables using rbind(...).
> 3) Attach the data using attach(...) and then do a multiple regression
> using lm(...).
>
> So far so good, but when I then perform a further regression by taking
> out factors, I run into memory issues, getting warnings such as:
>
> "1: In as.list.data.frame(X) :
>   Reached total allocation of 1535Mb: see help(memory.size)"
>
> As it stands, I have to close and then restart R, read in the same data
> again and run with the new reduced number of factors.
>
> My thinking was that, if I could reclaim the memory held by the
> already-read data files, keeping only the result of the rbind process, I
> could avoid the duplication. I have therefore tried (very amateurishly)
> to reset the read data to zero using:
>
> Read_data_1=(0)
> Read_data_2=(0)... etc
>
> Followed by:
>
> gc()
>
> However, this doesn't solve the problem. Is there a better way of
> getting R to "forget" the data tables it was holding and free up the
> memory?
>
> For info: I am also specifying colClasses when first reading the data
> in, to try to make it more memory-efficient (following:
> http://www.biostat.jhsph.edu/~rpeng/docs/R-large-tables.html).
>
> Other alternatives are trying the 3GB switch (XP Home, with 4GB RAM).
> Another alternative is trying to use the sqldf package to bring the data
> in, which one poster very helpfully suggested in response to an earlier
> question. I may end up trying that, but as I have not used SQL, I am a
> little daunted by the prospect.
>
> I would really appreciate any suggestions. Thanks.
>
> Guy Green
> --
> View this message in context:
> http://www.nabble.com/Clearing-out-or-reclaiming-memory-tp24268680p24268680.html
> Sent from the R help mailing list archive at Nabble.com.

--
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem that you are trying to solve?

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.