IMHO it's a bad idea to use a web script to process log files of this size (please ignore this comment if you are using the command-line version).
There are several good open source tools for parsing Apache log files (Analog, Webalizer and AWStats, to name a few). These are very fast and designed to handle the large files generated by heavy-traffic sites. You might want to look into them. Some of these tools can produce 'machine readable' output as well.
Finally, 100MB chunks wouldn't be a problem. Even 1.2GB wouldn't be a problem if you had RAID and at least 512MB of memory.
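To illustrate the point, here is a minimal sketch of streaming a large log line by line (the path and the per-line work are just placeholders, adapt them to your setup). Because fgets() only ever holds one line in memory, the total file size hardly matters:

<?php
// Stream the log one line at a time; memory use stays flat
// no matter how big the file is.
$handle = fopen('/var/log/httpd/access_log', 'r'); // placeholder path
if (!$handle) {
    die("Cannot open log file\n");
}

$count = 0;
while (!feof($handle)) {
    $line = fgets($handle, 4096);
    if ($line === false) {
        break;
    }
    // parse $line here (explode(), preg_match(), etc.)
    $count++;
}

fclose($handle);
echo "Processed $count lines\n";
?>

Run from the CLI, something like this should chew through a 1.2GB file without any chunking step at all.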
What kind of a log file are we talking here? Regardless of what processing you need to do, working on a 1.2GB file without RAID and/or lots of memory is generally going to be slow.

Pablo Gosse wrote:
<snip>
Hi folks. Has anyone encountered any problems parsing large log files with PHP?
I've got a log file that's about 1.2GB that I need to parse. Can PHP handle this or am I better off breaking this down into 12 100MB chunks and processing it?
It's an Apache log file.
I'm going to have to parse this file outside of the web server, probably on my desktop machine. It's a Dell Precision with 1GB RAM running RH9 with Apache and PHP 4.2.2.
If I can get the log file broken down into 100MB chunks, I assume this would not be a problem?
I've not attempted to deal with the file yet, as I didn't know how PHP would react to a 1.2GB file, and I'm in the final stages of a very important project and cannot afford any downtime.
I assume PHP can handle 100MB chunks without choking.
Cheers and TIA.
Pablo
</snip>
--
Raditha Dissanayake.
------------------------------------------------------------------------
http://www.radinks.com/sftp/          | http://www.raditha.com/megaupload
Lean and mean Secure FTP applet with  | Mega Upload - PHP file uploader
Graphical User Interface. Just 150 KB | with progress bar.