Download large files avoiding out-of-memory errors


Recently, I was working on a site for one of my customers and was asked to fix an issue they had uploading and downloading large image files (9 MB+).

The issue

The customer has a shopping cart that lets users download ZIP files of the items they purchased. Users can add as many products as they want, so file sizes can get very large. We used PHP to generate the ZIP file and to secure the file downloads.

One solution, but not the right one.

At the beginning the script was reading the whole file into memory with file_get_contents( ). This is a really bad solution because a single download can eat up all of the server's memory.
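To make the problem concrete, here is a sketch of that anti-pattern; the temp file and 50 KB payload are stand-ins for the real ZIP, and the header/echo lines are commented out so the sketch runs anywhere:

```php
<?php
// Anti-pattern: file_get_contents() pulls the ENTIRE file into a
// PHP string before a single byte reaches the client. For a large
// ZIP this can exhaust memory_limit and kill the request.
$filepath = tempnam(sys_get_temp_dir(), 'zip');       // stand-in file
file_put_contents($filepath, str_repeat('x', 50000)); // 50 KB payload

$data = file_get_contents($filepath); // the whole file now lives in RAM
// header('Content-Type: application/zip');
// echo $data;                        // ...and is only now sent out
unlink($filepath);
```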

Right solution, but badly executed.

The idea is to read the file in chunks rather than loading all the data into RAM. A Google search turns up many solutions, but most of them have problems. Here is the list.

  • fread( ): the chunk size is increased on every pass instead of simply reading the next chunk, so by the end you are still pulling the whole file into RAM.
  • fread( ): the chunk size is recalculated on every pass using the whole file size as a reference.
  • ob_clean( ): not clearing the output buffer before starting to send the file data; the browser crashes or the downloaded file arrives corrupt.
  • header: not sending the right header information, which produces corrupt files.
  • cURL: a better solution, but not every server has the cURL extension installed and the implementation is a bit more complicated. You can also run into the problems mentioned above.

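To show the correct pattern in isolation before the full script: read a fixed-size chunk on every pass, with no chunk-size arithmetic at all. The temp file and the 8192-byte chunk size here are just for demonstration; in a real download each chunk would be echoed and flushed instead of counted:

```php
<?php
// Sketch: stream a file in fixed-size chunks so memory use stays flat
// no matter how large the file is.
$src = tempnam(sys_get_temp_dir(), 'demo');        // stand-in file
file_put_contents($src, str_repeat('x', 100000));  // 100 KB payload

$bytesSent = 0;
$fh = fopen($src, 'rb');
while (!feof($fh)) {
    $chunk = fread($fh, 8192);    // fixed size: no per-pass calculation
    $bytesSent += strlen($chunk); // real download: echo $chunk; flush();
}
fclose($fh);
unlink($src);
```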
The way to go.

Here is the code that addresses all the issues mentioned above; the comments in the code explain each step. Also be sure to turn off all caching for these files, which is another issue we ran into while working on a client's site.

$filepath = "/lightbox-3.zip";

//stop if file does not exist
if( !file_exists( $filepath ) )
	die( );

//open file resource
$fh = fopen( $filepath, 'rb' );

//clear all previous buffers start a new one
ob_end_clean( ); ob_start( );

//fix ZIP file corruption; if you are serving a file type
// other than ZIP, set the appropriate Content-Type
header( 'Content-Type:' );
header( 'Pragma: public' );

//use X-Sendfile if the web server supports it
header( 'X-Sendfile: ' . $filepath );
header( 'Content-Description: File Transfer' );
header( 'Content-Transfer-Encoding: binary' );
header( 'Content-Disposition: attachment; filename=' . basename( $filepath ) );

//send header information
ob_clean( ); flush( ); 

//avoid timeout
set_time_limit(  0 );

//read the file in fixed-size chunks / no chunk-size calculation needed
while ( !feof( $fh ) ) {
   echo fread( $fh, 8192 );
   ob_flush();
   flush();
}

//close file resource
@fclose( $fh );
exit( );
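Note that the X-Sendfile header above only has an effect when the web server supports it; other servers simply ignore it and PHP streams the file itself. For example, assuming Apache with mod_xsendfile installed, the vhost would need something like this (the path is hypothetical):

```apache
# Hypothetical Apache vhost fragment: X-Sendfile only works if
# mod_xsendfile is installed and enabled.
XSendFile On
XSendFilePath /var/www/downloads
```

On nginx the equivalent mechanism is the X-Accel-Redirect header with an internal location.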


2 Comments on “Download large files avoiding out-of-memory errors”

  1. Hax says:

    You will have to modify the script to calculate the size of the file first, and you have to set the range header to know how much data has been transferred to resume the download.
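    A minimal sketch of the resume support Hax describes, assuming a simple single-range `bytes=N-` request; the temp file stands in for the real ZIP, the range header is hard-coded instead of read from `$_SERVER['HTTP_RANGE']`, and the real header/echo calls are commented out so the sketch runs standalone:

    ```php
    <?php
    // Sketch: honor "Range: bytes=N-" by seeking before streaming,
    // so interrupted downloads can resume. Multi-range requests and
    // range validation are deliberately omitted.
    $filepath = tempnam(sys_get_temp_dir(), 'zip');  // stand-in file
    file_put_contents($filepath, str_repeat('x', 1000));

    $rangeHeader = 'bytes=400-'; // real code: $_SERVER['HTTP_RANGE']
    $size  = filesize($filepath);
    $start = 0;

    if (preg_match('/bytes=(\d+)-/', $rangeHeader, $m)) {
        $start = (int) $m[1];
        // header('HTTP/1.1 206 Partial Content');
        // header("Content-Range: bytes $start-" . ($size - 1) . "/$size");
    }
    // header('Accept-Ranges: bytes');
    // header('Content-Length: ' . ($size - $start));

    $sent = 0;
    $fh = fopen($filepath, 'rb');
    fseek($fh, $start); // skip the part the client already has
    while (!feof($fh)) {
        $sent += strlen(fread($fh, 8192)); // real code: echo + flush()
    }
    fclose($fh);
    unlink($filepath);
    ```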

