Hello Greg,

Monday, December 13, 2004, 9:42:30 PM, you wrote:

GD> Use set_time_limit(0); to prevent the timeout.  ignore_user_abort() is
GD> pretty handy too.

Yeah, I have the time-out limit in there already (the client end will
detect a time-out from the server as well).
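
Something along these lines, anyway - a cut-down sketch rather than
the actual script, and the path/filename below are just placeholders:

<?php
// keep PHP running for however long the transfer takes
set_time_limit(0);
// and carry on even if the client disconnects mid-download
ignore_user_abort(true);

$file = '/path/to/large_file.zip';   // placeholder path

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="large_file.zip"');
header('Content-Length: ' . filesize($file));

readfile($file);
?>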

GD> If that doesn't work you might give them authenticated http access
GD> with temporary passwords. You can have the usernames and passwords
GD> in a db and pass the proper auth headers with PHP.

I did think of this, and it gets around a few issues, but if they are
not running any software to manage the download and are not using a
decent browser (a la Firefox), we still hit the same "cannot resume"
problem should the download abort.
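
Just to check I follow you, something like this? A minimal sketch of
the Basic-auth-from-a-db idea - the table and column names below are
invented:

<?php
// no credentials yet? challenge the browser
if (!isset($_SERVER['PHP_AUTH_USER'])) {
    header('WWW-Authenticate: Basic realm="Downloads"');
    header('HTTP/1.0 401 Unauthorized');
    echo 'Authorisation required';
    exit;
}

// look the temporary account up (assumes a db connection is open;
// the table/column names here are made up)
$sql = sprintf("SELECT 1 FROM temp_downloads
                WHERE username = '%s' AND password = '%s'",
               mysql_real_escape_string($_SERVER['PHP_AUTH_USER']),
               mysql_real_escape_string($_SERVER['PHP_AUTH_PW']));
$result = mysql_query($sql);

if (!$result || mysql_num_rows($result) == 0) {
    header('WWW-Authenticate: Basic realm="Downloads"');
    header('HTTP/1.0 401 Unauthorized');
    exit;
}

// credentials check out - serve the file as normal from here
?>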

>>    I'm also aware that it's not such a hot idea to lock-up Apache for
>>    the time it takes to download the whole file, especially with a
>>    large number of users doing this.

GD> Apache 2 is pretty good with multiple threads from what I hear.  I use
GD> it but not in a production environment.

Most of our servers run 1.3 - which is perfectly good, no complaints
there. It's just that HTTP itself was never really designed for
extremely large file downloads, so I am wary of any single
server+browser solution.
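
Resume over HTTP is doable when both ends play along - the script
honouring the Range header and the client actually sending one on a
retry - but that second part is exactly what a basic browser won't do
for you. For what it's worth, a rough (untested) sketch of the server
side, with the same placeholder path as before:

<?php
$file = '/path/to/large_file.zip';   // placeholder
$size = filesize($file);             // (hits the 2GB wall on 32-bit builds)
$start = 0;

// if the client sends "Range: bytes=N-", pick up from byte N
if (isset($_SERVER['HTTP_RANGE']) &&
    preg_match('/bytes=(\d+)-/', $_SERVER['HTTP_RANGE'], $m)) {
    $start = (int) $m[1];
    header('HTTP/1.1 206 Partial Content');
    header('Content-Range: bytes ' . $start . '-' .
           ($size - 1) . '/' . $size);
}

header('Content-Type: application/octet-stream');
header('Content-Length: ' . ($size - $start));

// stream from the offset rather than readfile()ing the whole thing
$fp = fopen($file, 'rb');
fseek($fp, $start);
while (!feof($fp)) {
    echo fread($fp, 8192);
}
fclose($fp);
?>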


Best regards,

Richard Davey
-- 
 http://www.launchcode.co.uk - PHP Development Services
 "I am not young enough to know everything." - Oscar Wilde
