Sorry, I listed the wrong package in the header of my previous post!

 


 

Dear List,

 

have any of you experienced a significant increase in the time it takes to
parse a URL via "htmlTreeParse()" when this function is called repeatedly,
once every minute, over a couple of hours?

 

Initially, a single parse takes about 0.5 seconds on my machine (Quad Core,
2.67 GHz, 8 GB RAM, Windows 7 64-bit). After some time, this can go up to
15 seconds or more.
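
For reference, here is a minimal sketch of the kind of loop I am running
(the URL is a placeholder, and the sketch assumes parsing with
"useInternalNodes = TRUE"):

library(XML)

url <- "http://www.example.com"  # placeholder for the real page

## Time each parse to watch for the slowdown over repeated calls.
timings <- numeric(0)
for (i in 1:10) {                      # the real job runs for hours
  elapsed <- system.time(
    doc <- htmlTreeParse(url, useInternalNodes = TRUE)
  )["elapsed"]
  timings <- c(timings, elapsed)
  free(doc)                            # release the C-level document
  Sys.sleep(60)                        # one call per minute
}
print(timings)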

 

I've tried garbage collection ("gc()") and "catalogClearTable()" (though I
don't think the latter has anything to do with the issue), and lately I have
wondered whether it might be caused by the accumulation of parsing errors
over time ("xmlErrorCumulator()"). Are parsing errors accumulated globally
in the workspace across distinct calls to this function? If so, is there a
way to clear that buffer?
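
If errors are indeed accumulated somewhere, would something along these
lines be a sensible workaround? This is only a sketch: I am assuming that a
no-op handler can be passed via the "error" argument in place of the default
cumulator, and that "free()" releases the underlying libxml2 document when
"useInternalNodes = TRUE" is used.

library(XML)

## No-op handler: discard parser errors instead of collecting them.
quietErrors <- function(...) invisible(NULL)

doc <- htmlTreeParse("http://www.example.com",  # placeholder URL
                     useInternalNodes = TRUE,
                     error = quietErrors)
## ... extract whatever is needed from 'doc' here ...
free(doc)          # explicitly free the C-level document
rm(doc)
invisible(gc())    # encourage R to reclaim the proxy object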

 

I would greatly appreciate it if anyone had an idea about how to keep the
request/parsing time fairly constant at the initial level of about 0.5
seconds.

 

Thanks a lot,

Janko


