readfile() reads 8K blocks at a time and dumps them straight to the output. It does not read the entire file into RAM, so it isn't what is causing you to hit the memory limit. You must have done something else wrong.
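Roughly, readfile() boils down to a loop like this (a rough, untested sketch; $file and $file_fullpath follow the variable names in your script), which never holds more than one 8K chunk in memory no matter how large the file is:

  header("Cache-control: private");
  header("Content-Disposition: attachment; filename=\"$file\"");
  header("Content-Length: " . filesize($file_fullpath));

  $fp = fopen($file_fullpath, "rb");
  while (!feof($fp)) {
      echo fread($fp, 8192);   // read and send 8K at a time, like readfile() does
      flush();                 // push each chunk to the client instead of buffering it
  }
  fclose($fp);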
-Rasmus

On Fri, 4 Oct 2002, christian haines wrote:

> thanks rasmus,
>
> i have tried readfile but it gave me the same issues as fpassthru.. both cap
> on the memory_limit directive within the php.ini file
>
> any other suggestions maybe?
>
> cheers
> christian
>
> Rasmus Lerdorf wrote:
>
> > readfile()
> >
> > On Fri, 4 Oct 2002, christian haines wrote:
> >
> > > hi all,
> > >
> > > i have successfully created a download script to force a user to
> > > download, however attempting to download large files causes an error
> > > saying that the file cannot be found.
> > >
> > > my code >
> > > header("Cache-control: private");
> > > header("Content-Type: application/force-download; name=\"$file\"");
> > > header("Content-Disposition: attachment; filename=\"$file\"");
> > > header("Content-Transfer-Encoding: binary");
> > > header("Content-Length: $content_length");
> > > $fp = fopen($file_fullpath,"r");
> > > fpassthru($fp);
> > > fclose($fp);
> > > < my code
> > >
> > > this is a memory issue in php.ini, i.e. if memory_limit = 8M then the
> > > largest file i can download is 8M
> > >
> > > is there any way to "force" a download without having to use the
> > > system-hungry fpassthru function?
> > >
> > > this is driving me nuts so any help would be greatly appreciated
> > >
> > > cheers
> > > christian
> > >
> > > ps i read the following at the php.net fpassthru man page but could not
> > > make sense of it (it appears to be some kind of solution)
> > >
> > > "fpassthru() works best for small files. In download manager scripts,
> > > it's best to determine the URL of the file to download (you may generate
> > > it locally in your session data if you need so), and then use HTTP
> > > __temporary__ redirects (302 status code, with a "Location:" header
> > > specifying the effective download URL).
> > > This saves your web server from maintaining PHP scripts running for long
> > > times during the file download, and instead the download will be managed
> > > directly by the web server without scripting support (consequence: less
> > > memory resources used by parallel downloads)..."

--
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php