Sometimes these intermittent "crashes" come from memory misuse, e.g., not 
allocating enough scratch space.  You can sometimes make such coding errors 
fail more consistently by calling gctorture(TRUE) before running your code.
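
The general pattern is just to bracket the suspect code with gctorture() 
calls.  In this sketch, suspectCode() is only a hypothetical stand-in for 
whatever code you think is misbehaving:

  gctorture(TRUE)           # garbage-collect at (nearly) every allocation
  result <- suspectCode()   # hypothetical stand-in for the suspect code
  gctorture(FALSE)          # turn torture mode off again; it is very slow
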
Here is an example in which it looks like package gam's lo() misuses memory 
when its degree argument is 2.  Sometimes the loop below completes all 10 
iterations, sometimes 3, sometimes only 1:
 
  > library(gam)
  Loading required package: splines
  Loaded gam 1.06.2
  
  >  v <- lapply(1:10,function(i){cat(i,""); gam(mpg ~ lo(hp,degree=2), data=mtcars)})
  1 2 3 4 5 6 7 8 9 10 >
  >  v <- lapply(1:10,function(i){cat(i,""); gam(mpg ~ lo(hp,degree=2), data=mtcars)})
  1 2 3 Error in sys.call() : invalid 'which' argument
  >  v <- lapply(1:10,function(i){cat(i,""); gam(mpg ~ lo(hp,degree=2), data=mtcars)})
  1 Error in sys.call() : invalid 'which' argument

If I call gctorture(TRUE) before calling lo() with degree=2, it hangs on the 
very first call.  One could attach a debugger at that point to get a clue 
about where it is failing; a sketch of one way to do that follows the 
transcript below.

  > library(gam)
  Loading required package: splines
  Loaded gam 1.06.2
  
  >
  > gctorture(TRUE)
  > v <- lapply(1:10,function(i){cat(i,""); gam(mpg ~ lo(hp,degree=2), data=mtcars)})
  1
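
One way to attach a debugger on a Unix-alike is to start R under gdb via R's 
-d flag, reproduce the hang, and then interrupt it to get a C-level 
backtrace.  The session below is an illustrative sketch, not output from an 
actual run:

  $ R -d gdb        # start R under gdb; 'R --debugger=gdb' is equivalent
  (gdb) run         # launch R, then reproduce the hang at the R prompt
                    # once it hangs, press Ctrl-C to drop back to the gdb prompt
  (gdb) bt          # print the C-level backtrace to see where it is stuck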

Running R under valgrind is also helpful for tracking down this kind of 
memory misuse.
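
For example, following the pattern given in the "Writing R Extensions" 
manual (gam-test.R here is a hypothetical script containing the failing 
lo() calls):

  ## gam-test.R is a hypothetical script reproducing the failing calls
  $ R -d "valgrind --tool=memcheck --leak-check=full" --vanilla < gam-test.R

valgrind's memcheck reports invalid reads and writes along with the C-level 
call stack, which usually points at the offending allocation.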

Bill Dunlap
Spotfire, TIBCO Software
wdunlap tibco.com


> -----Original Message-----
> From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org] On Behalf Of Steve Powers
> Sent: Thursday, December 27, 2012 7:01 PM
> To: r-help@r-project.org
> Subject: [R] R crashing inconsistently within for loops
> 
> Hello,
> 
> This one has been bugging me for a long time and I have never found a
> solution. I am using R version 2.15.1 but it has come up in older versions of 
> R I have used
> over the past 2-3 years.
> 
> Q: Am I wrong to expect that R should handle hundreds of iterations of the
> base model or statistical functions, embedded within for loops, in one
> script run? I have found that when I write scripts that do this, sometimes
> they have a tendency to crash, seemingly unpredictably.
> 
> For example, one problem script of mine employs glm and gls about a hundred
> different times, and output files are being written at the end of each
> iteration. I have used my output files to determine that the crash cause is
> not consistent (R never fails at the same iteration). Note that the data are
> fixed here (no data generation or randomization steps, so that is not the
> issue). But it is clear that scripts with larger numbers of iterations are
> more likely to produce a crash.
> 
> And a year or two ago, I had a seemingly stable R script again with for
> looped model fits, but discovered this script was prone to crashing when I
> ran it on a newer PC. Because the new PC also seemed to be blazing through R
> code absurdly fast, I tried adding a short "fluff" procedure at the end of
> each iteration that required a few seconds of processing time. Lo and
> behold, when I added that, the script stopped crashing (and each iteration
> of course took longer). I still don't understand why that fixed things.
> 
> What is going on? Solutions? Thanks.---steve
> 
> --
> Steve Powers
> power...@nd.edu
> University of Notre Dame
> Environmental Change Initiative
> website (http://www.nd.edu/~spowers2/index.htm)
> 

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
