If I'm reading in many-megabyte files, is it considered more efficient
to read them into an array and then loop over the array? Or is reading a
line at a time okay?
e.g.
**************************************
while (<>) {
# do some process with each line
}
**************************************
or...
**************************************
my @lines = <>;
foreach (@lines) {
# do some process with each line
}
**************************************
I realize the second will use more memory, but what's a few megabytes on
today's computers? I'm more worried about the OS having to go back to the
disk a couple hundred thousand times -- seems like that'd be hard on the disk.
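In case it helps frame the question, here's a rough benchmark sketch I could run
(not claiming it's definitive): it builds a throwaway temp file -- the 100,000-line
size is just an assumption -- and times both styles with the core Benchmark and
File::Temp modules.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Benchmark qw(timethese);
use File::Temp qw(tempfile);

# Build a throwaway test file (size is an arbitrary assumption).
my ($out, $file) = tempfile(UNLINK => 1);
print {$out} "line $_\n" for 1 .. 100_000;
close $out or die $!;

# Each sub counts lines so the loop bodies do comparable work.
my $line_at_a_time = sub {
    open my $in, '<', $file or die $!;
    my $n = 0;
    $n++ while <$in>;    # one line per read (perlio buffers underneath)
    close $in;
    return $n;
};
my $slurp_to_array = sub {
    open my $in, '<', $file or die $!;
    my @lines = <$in>;   # whole file into memory at once
    close $in;
    return scalar @lines;
};

# Sanity check: both approaches must see the same number of lines.
die "counts differ" unless $line_at_a_time->() == $slurp_to_array->();

timethese(20, {
    line_at_a_time => $line_at_a_time,
    slurp_to_array => $slurp_to_array,
});
```

Note that each read in the while loop goes through Perl's I/O buffering,
not straight to the disk, so the seek count may not be what I'm imagining.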
TIA.
- Bryan