Hi,
I would like to use an exec cgi call on the homepage of a Web site. We
parse all of our html documents to accommodate server side includes. The
exec cgi script would decide, based on file modification dates, which
files to display in the center of the page. At most, the script would
look in one directory and display the two or three small files that had
been modified within the past few hours. But will my server take a
performance hit compared to simply pulling those files in with include
virtual statements, foregoing the ability to select by modification
time? Is there a way to calculate the performance hit? Say our Web
server received up to 500,000 page views daily, would the display of the
homepage be noticeably slowed down? I am assuming an alarm(30) call in
my Perl script would help a lot, but I do not know by how much. Any
thoughts or advice appreciated.
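For concreteness, here is roughly the sort of script I have in mind. It
is only a sketch: the directory path, the four-hour cutoff and the
script name are placeholders, not what we actually run.

#!/usr/bin/perl -w
#
# Rough sketch only -- /www/htdocs/center, the four-hour cutoff and
# recent.cgi are placeholders.  The homepage would call it with
# something like:
#
#   <!--#exec cgi="/cgi-bin/recent.cgi" -->

use strict;

alarm(30);   # safety valve so a hung script cannot stall the page forever

my $dir    = '/www/htdocs/center';   # placeholder directory
my $cutoff = time() - 4 * 3600;      # "modified within the past few hours"

print "Content-type: text/html\n\n";

opendir my $dh, $dir or exit;
my @recent = grep { -f "$dir/$_" && (stat(_))[9] > $cutoff } readdir $dh;
closedir $dh;

# newest first, show at most three
foreach my $file ( (sort { -M "$dir/$a" <=> -M "$dir/$b" } @recent)[0 .. 2] ) {
    next unless defined $file;
    open my $fh, '<', "$dir/$file" or next;
    print while <$fh>;
    close $fh;
}

That fork-and-exec on every homepage hit is what I am worried about at
500,000 page views a day.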
--
Gary Nielson
[EMAIL PROTECTED]