Gary, you will definitely take a big hit if it's a perl script. Why can't
you run a cron job that updates symlinks as necessary (or something)?
If you explain the problem you are trying to solve, I'm sure we can come up
with something more efficient. Got php available?
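Here is a minimal sketch of the cron-driven symlink idea. All paths, file names, and the two-file limit are made up for illustration; in production the two directories would be real docroot paths, and this would run from crontab every few minutes instead of once against temp dirs:

```shell
#!/bin/sh
# Sketch: a cron job refreshes symlinks to the newest files, so the
# homepage can pull them in with plain "include virtual" statements
# instead of running an exec cgi on every page view.
# Demo uses temporary directories so the script is self-contained.
SRC=$(mktemp -d)    # stands in for the directory the CGI would scan
DEST=$(mktemp -d)   # stands in for the SSI include directory

# create three sample story files, newest written last
for name in old mid new; do
    echo "<p>$name story</p>" > "$SRC/$name.html"
    sleep 1                       # ensure distinct mtimes
done

# refresh the links: drop the stale ones, link the 2 newest files
rm -f "$DEST"/recent-*.html
i=1
ls -t "$SRC"/*.html | head -2 | while read -r f; do
    ln -s "$f" "$DEST/recent-$i.html"
    i=$((i+1))
done

ls -l "$DEST"
```

The homepage would then just contain something like `<!--#include virtual="/includes/recent-1.html" -->` for each slot, which mod_include serves with no fork/exec per hit; the only process cost is the cron run.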
charles
On Sat, 6 Jan 2001, Gary Nielson wrote:
> Hi,
>
> I would like to use an exec cgi call on the homepage of a Web site. We
> parse all of our html documents to accommodate server side includes. This
> exec cgi script would be needed to make some choices, based on file
> modification date, as to which files to display in the center of the
> page. At most the script would look in a directory and then display two or
> three small files that had been modified within the past few hours. But
> will my server take a performance hit over if these files were just called
> up with include virtual statements, foregoing the ability to display by
> modification time? Is there a way to calculate the performance hit? Say
> our Web server received up to 500,000 page views daily, would the display
> of the homepage be noticeably slowed down? I am assuming an
> alarm(30); statement in my perl script would help a lot, but I do not know how
> much. Any thoughts, advice appreciated.
>
> --
> Gary Nielson
> [EMAIL PROTECTED]
_______________________________________________
Redhat-list mailing list
[EMAIL PROTECTED]
https://listman.redhat.com/mailman/listinfo/redhat-list