Hi

It looks like we have hit a bug in RCurl: the getURL() function seems to be
leaking memory. A minimal test case that reproduces the problem:

> library(RCurl)
> handle <- getCurlHandle()
> for (r in 1:100) x <- getURL(url = "news.google.com.au", curl = handle)

If we run this code, the memory allocated to the R session is never
recovered, even after the loop finishes.
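
For anyone who wants to check the numbers, something like this shows the
growth from inside R (Linux only; rss_kb is just an ad-hoc helper that
reads the process's resident set size via ps):

> rss_kb <- function()
+     as.numeric(system(paste("ps -o rss= -p", Sys.getpid()), intern = TRUE))
> rss_kb()   # RSS of the R process before the loop, in KB
> for (r in 1:100) x <- getURL(url = "news.google.com.au", curl = handle)
> gc()       # force a garbage collection
> rss_kb()   # still far above the pre-loop figure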

We use RCurl in some long-running experiments, and this leak is causing us
to run out of memory on the test system.

The specs of our test system are as follows:
OS: Ubuntu 14.04 (64 bit)
Memory: 24 GB
RCurl version: 1.95-4.3
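
The only workaround we have thought of so far is to push each batch of
requests into a short-lived forked child process, so that whatever leaks
is returned to the OS when the child exits. A rough sketch using
parallel::mcparallel (Unix-only; fetch_batch is just an illustrative
helper):

> library(parallel)
> fetch_batch <- function(urls, h = getCurlHandle())
+     sapply(urls, getURL, curl = h)
> job <- mcparallel(fetch_batch(rep("news.google.com.au", 100)))
> res <- mccollect(job)[[1]]   # leaked memory goes away with the child

That should keep the parent session's footprint flat, but forking a child
per batch adds overhead we would rather avoid.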

Any ideas about how to get around this issue?

Thanks
