Hello, this one has been bugging me for a long time, and I have never found a solution. I am using R version 2.15.1, but the problem has come up in older versions of R I have used over the past 2-3 years.
Q: Am I wrong to expect that R should handle hundreds of iterations of base model-fitting or statistical functions, embedded within for loops, in one script run? I have found that scripts that do this sometimes have a tendency to crash, seemingly unpredictably. For example, one problem script of mine calls glm and gls about a hundred times each, writing output files at the end of each iteration. I have used the output files to determine that the crash point is not consistent (R never fails at the same iteration). Note that the data are fixed here (no data generation or randomization steps), so that is not the issue. But it is clear that scripts with larger numbers of iterations are more likely to crash.

A year or two ago, I had a seemingly stable R script, again with for-looped model fits, but discovered it was prone to crashing when I ran it on a newer PC. Because the new PC also seemed to blaze through R code absurdly fast, I tried adding a short "fluff" procedure at the end of each iteration that required a few seconds of processing time. Lo and behold, when I added that, the script stopped crashing (and each iteration of course took longer). I still don't understand why that fixed things.

What is going on? Solutions? Thanks.

---steve

--
Steve Powers
power...@nd.edu
University of Notre Dame
Environmental Change Initiative
website (http://www.nd.edu/~spowers2/index.htm)

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
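Since the posting guide asks for reproducible code, here is a minimal sketch of the kind of loop described above. The data, formula, and file names are hypothetical placeholders (the original script and data are not shown), and the defensive additions are assumptions: tryCatch around each fit, an explicitly closed file connection, a gc() call, and an optional Sys.sleep() standing in for the "fluff" pause. These are common workarounds for loops that crash unpredictably, not a confirmed fix.

```r
## Hypothetical stand-in for the real data; the actual script fits
## glm and gls to a fixed data set about a hundred times.
set.seed(1)
dat <- data.frame(y = rpois(100, 2), x = rnorm(100))

for (i in seq_len(100)) {
  ## Catch fit errors so one bad iteration does not kill the script,
  ## and record which iteration failed.
  fit <- tryCatch(
    glm(y ~ x, family = poisson, data = dat),
    error = function(e) {
      message("iteration ", i, " failed: ", conditionMessage(e))
      NULL
    }
  )
  if (is.null(fit)) next

  ## Write results through an explicit connection and close it, so the
  ## file handle is released before the next iteration starts.
  out <- file.path(tempdir(), sprintf("out_%03d.txt", i))
  con <- file(out, open = "w")
  writeLines(capture.output(summary(fit)), con)
  close(con)

  gc()             # free memory held by the discarded fit
  # Sys.sleep(1)   # equivalent of the "fluff" pause, if still needed
}
```

If the crashes stop with tryCatch in place but errors are now logged at varying iterations, that would point to the fits themselves; if they continue with no logged error, that would point instead to memory or I/O pressure outside R's error handling.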