Grigio Grigio - 4 months ago
PHP Question

How to Stream files into a Zip from AWS S3

I'm using the PHP Flysystem package to stream content from my AWS S3 bucket. In particular, I'm using


My Question

When I stream a file, the downloaded zip has the correct size, but when I unzip it, it is not a valid archive. Here is my prototype:

header('Pragma: no-cache');
header('Content-Description: File Download');
header('Content-disposition: attachment; filename=""');
header('Content-Type: application/octet-stream');
header('Content-Transfer-Encoding: binary');
$s3 = Storage::disk('s3'); // Laravel Syntax
echo $s3->readStream('directory/file.jpg');

What am I doing wrong?

Side Question

When I stream a file like this, does it:

  1. get fully downloaded into my server's RAM, then get transferred to the client, or

  2. does it get saved - in chunks - in the buffer, and then get transferred to the client?

Basically, is my server being burdened if I have dozens of GBs of data being streamed?


You are currently dumping the raw contents of directory/file.jpg as the zip download, but a jpg is not a zip. You need to create a zip archive containing that file.
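As a self-contained illustration of why the unzip fails (this sketch is not part of the original answer): a zip archive begins with the 4-byte signature "PK\x03\x04", while a JPEG begins with the bytes 0xFF 0xD8, so a tool that unpacks the download rejects raw jpg bytes served under a zip name:

```php
<?php
// Build a tiny real zip and confirm it carries the zip magic bytes.
$zip = new ZipArchive();
$path = tempnam(sys_get_temp_dir(), 'zip');
$zip->open($path, ZipArchive::CREATE | ZipArchive::OVERWRITE);
$zip->addFromString('hello.txt', 'hello');
$zip->close();

// Read only the first 4 bytes of the file.
$magic = file_get_contents($path, false, null, 0, 4);
var_dump($magic === "PK\x03\x04"); // bool(true): a genuine zip signature
unlink($path);
```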

Instead of

echo $s3->readStream('directory/file.jpg');

Try the following in its place using the Zip extension:

// use temporary files on disk for the zip and the downloaded jpg
$zipFile = tmpfile();
$zipPath = stream_get_meta_data($zipFile)['uri'];
$jpgFile = tmpfile();
$jpgPath = stream_get_meta_data($jpgFile)['uri'];

// download the S3 object to disk in chunks
stream_copy_to_stream($s3->readStream('directory/file.jpg'), $jpgFile);

// create the zip file and add the downloaded file to it
$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::OVERWRITE);
$zip->addFile($jpgPath, 'file.jpg');
$zip->close();

// export the contents of the zip to the client
readfile($zipPath);
Using tmpfile and stream_copy_to_stream, the object is downloaded in chunks to a temporary file on disk, not into RAM, so your server's memory use stays small regardless of how much data you stream.
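The chunked behaviour above can be sketched in isolation. Here php://temp stands in for the S3 read stream (an assumption for the demo); stream_copy_to_stream() moves the bytes through a small internal buffer rather than loading the whole payload into a PHP string:

```php
<?php
// Simulate a large remote object with a stream that spills to disk.
$src = fopen('php://temp/maxmemory:0', 'r+');
fwrite($src, str_repeat('x', 1024 * 1024)); // pretend this is a 1 MiB S3 object
rewind($src);

// Copy it to a temporary file, chunk by chunk.
$dst = tmpfile();
$copied = stream_copy_to_stream($src, $dst);

var_dump($copied === 1024 * 1024); // bool(true): every byte arrived
fclose($src);
fclose($dst); // the tmpfile() is deleted automatically on close
```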