You should be using something like

open(FILE, "<", $file) or die "Can't open $file: $!\n";
while (<FILE>) {
    ## do something with $_, one line at a time
}
close FILE;
__END__
If instead you use something like

local $/;            # undef $/ turns the next read into a slurp
my $contents = <FILE>;
__END__

then you are slurping the entire file into one scalar, and that is exactly
why the process swells.
My Perl scripts sometimes grow to almost a gig of memory (foolish, yes), but
they're quick to write! ;)
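
If the file has no convenient line boundaries, a rough sketch like this (the
64 KB buffer size is an arbitrary choice) keeps memory bounded by reading
fixed-size chunks instead:

open(FILE, "<", $file) or die "Can't open $file: $!\n";
while (read(FILE, my $buffer, 64 * 1024)) {
    ## work on $buffer, at most 64 KB at a time
}
close FILE;
__END__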
-----Original Message-----
From: Brett W. McCoy [mailto:[EMAIL PROTECTED]]
Sent: Thursday, February 07, 2002 3:49 PM
To: Brian Hayes
Cc: [EMAIL PROTECTED]
Subject: Re: memory issues reading large files
On Thu, 7 Feb 2002, Brian Hayes wrote:
> Hello all. I need to read through a large (150 MB) text file line by
> line. Does anyone know how to do this without my process swelling to
> 300 megs?
As long as you aren't reading that file into an array (which would be a
foolish thing to do, IMHO), I don't see why the process would swell to 300
megs.
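In other words, the difference is between something like (FILE here is just
a placeholder filehandle)

my @lines = <FILE>;        # pulls the whole 150 MB file into an array

and

while (my $line = <FILE>) {
    ## only one line is held in memory at a time
}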
-- Brett
http://www.chapelperilous.net/
------------------------------------------------------------------------
- long f_ffree; /* free file nodes in fs */
+ long f_ffree; /* freie Dateiknoten im Dateisystem */
-- Seen in a translation
--
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]