Dear all,

does anyone of you know how to make R more sensitive to errors? I just 
migrated back from Matlab, where I really enjoyed that Matlab pops up 
with (really helpful!) error messages as soon as anything is even slightly 
wrong with the code. This is certainly annoying on the first run, but it really 
helps to uncover hidden bugs. I tried to create errors in R artificially 
in order to understand the try() function, and surprisingly I hardly 
managed to produce a single one. It would help if at least things like the 
following raised errors:

- division by zero
- calculating mean/stdev/max etc. of vectors containing NAs
- indexing with vectors that contain NAs or Infs
- ...
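
For illustration, a few of these in a plain R session; none of them raises 
an error or even a warning:

1/0                    # Inf
0/0                    # NaN
mean(c(1, 2, NA))      # NA
max(c(1, 2, NA))       # NA
sd(c(1, 2, NA))        # NA
(1:5)[c(2, NA)]        # 2 NA, the NA index silently becomes NA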

I tried hard:

x <- 1:100
x[20] <- NA    # plant an NA in the middle of x
y <- 100:1

y[NA]          # indexing with a (logical) NA
y[200]         # out-of-bounds index
y[0.5]         # fractional index
1/x[20]        # division involving an NA
y[x]           # indexing with a vector that contains an NA

None of these five lines produced an error, just NAs or empty vectors.


Is there any way to change this?
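
The closest thing I can think of is options(warn = 2), which turns every 
warning into an error, but none of the cases above even raises a warning. 
Below is a minimal sketch of one possible strict wrapper; strict_mean() is 
a made-up helper, not anything from base R, and relies on stats::na.fail(), 
which errors on NAs:

options(warn = 2)               # promote every warning to an error

strict_mean <- function(x) {
  na.fail(x)                    # stop with an error if x contains NAs
  stopifnot(all(is.finite(x)))  # also refuse Inf and NaN
  mean(x)
}

strict_mean(c(1, 2, NA))        # now an error instead of a silent NA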

My problem is that I am running large loops over a huge set of timeseries that 
differ so much in length and in the number of NAs that it is hard to anticipate 
all possible errors beforehand. (If I could do that, I could most probably 
publish a paper about my series straight away :-) )
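
To make such a loop at least fail loudly per series instead of silently 
filling the results with NAs, something like the following might work; 
series_list and process_series() are made-up placeholders for the real data 
and analysis:

results <- vector("list", length(series_list))
for (i in seq_along(series_list)) {
  results[[i]] <- tryCatch(
    process_series(series_list[[i]]),          # placeholder analysis
    error = function(e) {
      message("series ", i, " failed: ", conditionMessage(e))
      NULL                                     # record the failure, keep going
    }
  )
}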



Thanks for your help!
