Hi all,
Just thought I would pick the collective brain on this one. I have
a requirement to deliver a large EXE file to customers after they
order. The file is just under 400 MB in size and, because they have
just purchased it, I obviously cannot have this file in a "public"
location on the web server that someone could browse to.
I can push the file out easily enough using modified headers and a
simple script that checks whether they are allowed to download it,
but with a file this large a significant number of web browsers
fail to obtain the entire EXE before closing - or any number of
other factors kick in (their PC resets, their ISP disconnects,
Windows crashes, etc.).
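For reference, the straightforward push is roughly this (a
trimmed-down sketch - the access check is omitted and the path is
just a placeholder):

<?php
// Simple "push the whole file" approach - access check omitted.
$file_path = '/files/outside/webroot/product.exe'; // placeholder path

header('Content-Type: application/force-download');
header('Content-Transfer-Encoding: Binary');
header('Content-Length: ' . filesize($file_path));
header('Content-Disposition: attachment; filename="product.exe"');

$fp = fopen($file_path, 'rb');
fpassthru($fp);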
Some browsers support resuming downloads, but not when the file has
been sent via the headers I use. FTP is not an option either, as I
cannot easily create and destroy FTP users on the server (or, for
that matter, assume the customer knows how to perform FTP
operations).
I'm also aware that it's not such a hot idea to tie up an Apache
process for the time it takes to download the whole file,
especially with a large number of users doing this at once.
So I came up with an idea that I'd like your opinions on: I built a
small but friendly Windows application (<50KB in size) that
connects to the web server via HTTPS and checks the download
credentials; if all is OK, it then downloads the file via HTTP in
1MB chunks. The file is a single EXE sitting outside my web root,
and the PHP script that serves it uses fopen() to open the file,
fseek()s to the required section, reads in 1MB of data, closes the
file and then echoes it out (after suitable headers, of course,
shown below):
header('Content-Type: application/force-download');
header('Content-Transfer-Encoding: Binary');
header("Content-Length: $total_chunksize");
header("Content-Disposition: attachment; filename=\"$chunkname\"");
The Windows app performs various checks on the file segments as
they download and eventually stitches the whole thing back together
at the end (there is a "resume download" feature, so you can come
back to it later if you need to, or if your ISP disconnects).
A quick MD5 integrity check against the server confirms the file
has downloaded fully.
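The server-side half of that check is tiny - something like this
(again just a sketch; on a file this size you would compute the
hash once and cache it rather than call md5_file() on every
request):

<?php
// Return the MD5 of the full file so the app can verify the reassembled copy.
$file_path = '/files/outside/webroot/product.exe'; // placeholder path

header('Content-Type: text/plain');
echo md5_file($file_path);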
I have tested this out on some massive files across a range of PCs
and Windows installations and it works perfectly, so I'm happy that
the Windows side of things is correct. But I would be interested to
hear people's views on the PHP side of the equation - would it be
better for Apache to be running PHP scripts that push out smaller
1MB chunks, as opposed to doing an fpassthru() on a 300MB+ file? Or
do you think there is another, more elegant solution?
I'm aware my app is for Windows only (although I could easily port
it to OS X), but the files they are downloading are PC games
anyway, so it's no bad thing in this case.
Best regards,
Richard Davey
--
http://www.launchcode.co.uk - PHP Development Services
"I am not young enough to know everything." - Oscar Wilde