Dear List,

I am running a loop that downloads web pages, saves the HTML to a temporary file (using download.file()), then reads it back in with readLines() for processing; finally, it writes the useful information from each processed page to a unique file.
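In outline, the loop looks like the sketch below (file names and the processing step are placeholders, not my actual code; in the real loop the temporary file is filled by download.file() rather than written locally):

```r
## Self-contained sketch of the loop pattern. The writeLines() call stands
## in for: download.file(urls[i], destfile = tmp, quiet = TRUE)
tmp <- tempfile(fileext = ".html")
results <- character(0)
for (i in 1:3) {
  ## stand-in for the real download step
  writeLines(sprintf("<html><body>page %d</body></html>", i), tmp)
  con  <- file(tmp, "r")   # explicit connection ...
  page <- readLines(con)
  close(con)               # ... closed in the same iteration
  ## stand-in for the real extraction step; the real loop writes each
  ## page's extracted info to its own output file
  results[i] <- page
}
```

(Note that calling readLines(tmp) on the file *path* rather than on a connection opens and closes its own connection automatically, so no explicit close() is needed in that form.)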
The problem is that once the loop runs to somewhere near iteration 5000, it throws the error below and will not go further:

----------------------------------------------------------------
Error in file(file, ifelse(append, "a", "w")) :
  cannot open the connection
----------------------------------------------------------------

In the meantime, any request for a new connection also fails; for example, a request for the help page of "file" triggers the error below:

-----------------------------------------------------------------------
?file
Error in gzfile(file, "rb") : cannot open the connection
In addition: Warning message:
In gzfile(file, "rb") :
  cannot open compressed file 'C:/PROGRA~1/R/R-211~1.1/library/stats/help/aliases.rds',
  probable reason 'Too many open files'
-----------------------------------------------------------------------

I am not sure whether the problem is too many unclosed connections, since I close the file connection after each readLines() call. Checking with showConnections(all = TRUE) does not show excessive connections, and closeAllConnections() does not help.

Can anyone help me with this? Any answer is highly appreciated.

yong

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.