On Tue, Jan 20, 2009 at 7:27 PM, Tim Arnold wrote:
> I had the same problem you did, but then I changed the code to create a new
> soup object for each file. That drastically increased the speed. I don't
> know why, but it looks like the soup object just keeps getting bigger with
> each feed.
>
>
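A minimal sketch of the one-soup-per-file approach described above (not the original poster's code; it assumes BeautifulSoup 4 and a placeholder directory name, where code from the 2009 thread would have used BeautifulSoup 3's "from BeautifulSoup import BeautifulSoup"):

    import glob

    # BeautifulSoup 4 import; BeautifulSoup 3 used a different module name.
    from bs4 import BeautifulSoup


    def parse_all(html_dir):
        """Parse every HTML file in html_dir, building one soup object per file."""
        results = []
        for path in glob.glob(html_dir + "/*.html"):
            with open(path, encoding="utf-8", errors="replace") as f:
                # A fresh parse tree for each file: nothing accumulates across files.
                soup = BeautifulSoup(f.read(), "html.parser")
            title_tag = soup.title
            # Convert to a plain str so no reference back into the tree survives.
            title = str(title_tag.string) if title_tag and title_tag.string else None
            results.append((path, title))
            # "soup" is rebound on the next pass, so the previous tree becomes
            # garbage-collectable as soon as nothing else points into it.
        return results


    if __name__ == "__main__":
        for path, title in parse_all("html_files"):  # placeholder directory name
            print(path, title)

The point is the same one Tim makes: each iteration gets its own parse tree and only plain strings are kept from it, so the previous tree can be collected instead of one soup object growing with every feed.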
"Philip Semanchuk" wrote in message
news:[email protected]...
>
> On Jan 19, 2009, at 3:12 AM, S.Selvam Siva wrote:
>
>> Hi all,
>>
>> I am running a python script which parses nearly 22,000 html files
>> stored locally, using BeautifulSoup.
>> The problem is that the memory usage increases linearly as the files
>> are being parsed. When the script has crossed parsing 200 files or
>> so, it consumes all the available memory.