Hi,

I have created a loop to obtain data from several webpages, but the loop keeps crashing with the error:

"Error in function (type, msg, asError = TRUE)  : 
  Operation timed out after 5000 milliseconds with 9196 bytes received"
  
  Page <- getURLContent(page[i], followlocation = TRUE, curl = curl,
                        .opts = list(verbose = TRUE, timeout = 5))
  
I am not sure how to keep the loop running after that error occurs, so any help
would be appreciated.

PS: I have played with the timeout option, but the loop eventually crashes anyway.
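
I suspect I need something like tryCatch() so that a timed-out request is
caught and the loop simply moves on to the next URL, but I am not sure how to
set it up. Here is a rough, untested sketch of what I have in mind (page is my
vector of URLs, curl is my existing curl handle, and results is just a list I
made up to collect whatever comes back):

  library(RCurl)

  results <- vector("list", length(page))

  for (i in seq_along(page)) {
    results[[i]] <- tryCatch(
      getURLContent(page[i], followlocation = TRUE, curl = curl,
                    .opts = list(verbose = TRUE, timeout = 5)),
      error = function(e) {
        # report which URL failed and return NA so the loop keeps going
        message("Failed on ", page[i], ": ", conditionMessage(e))
        NA
      }
    )
  }

Is that roughly the right approach, or is there a better way to handle
timeouts with RCurl?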
Thanks



