CairoCoder - 1 month ago
MySQL Question

Handling fetchAll from a huge MySQL query without a memory limit error

I am trying to fetch a huge amount of data from one MySQL table to export it as an XLSX file.

I used the fetchAll() function, but I got:

Fatal error: Out of memory

Here's my code:

require_once 'classes/Spout/Autoloader/autoload.php';
use Box\Spout\Writer\WriterFactory;
use Box\Spout\Common\Type;

$query = "SELECT *
FROM full_report";


$result = $mDb->fetchAll(); // Here where I get the error!

$fileName = "fullReport-" . date('m-d-Y-H-i-s') . ".xlsx";
$path = "_uploads/" . $fileName;

$writer = WriterFactory::create(Type::XLSX); // for XLSX files
$writer->openToFile($path); // write data to a file or to a PHP stream
$writer->openToBrowser($path); // stream data directly to the browser (use either this or openToFile, not both)

foreach ($result as $value)
$valuex[] = array_values($value);

Any suggestions?


fetchAll is the problem. It retrieves all the matching rows from the table and loads everything into memory at once. That works when you don't have too many rows to fetch, but it causes Out of Memory errors when the number of rows to be stored exceeds the available memory.

To fix this problem, you should fetch your data in multiple chunks. You can use the fetch method instead, with a cursor, so only one row is held in memory at a time. It is well documented in the manual. You can also take a look at this repo: it gives you an example of using MySQL and Spout together in a scalable way.
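As a rough sketch of that approach, assuming a plain PDO connection (your `$mDb` wrapper isn't shown, and the connection credentials below are placeholders): turn off buffered queries so MySQL streams rows to the client, then fetch and write one row at a time with Spout instead of materializing the whole result set.

```php
<?php
require_once 'classes/Spout/Autoloader/autoload.php';

use Box\Spout\Writer\WriterFactory;
use Box\Spout\Common\Type;

// Placeholder connection details -- adjust to your environment.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'password');

// Disable buffered queries: rows are streamed from MySQL as you fetch them,
// instead of the whole result set being copied into PHP memory up front.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$fileName = 'fullReport-' . date('m-d-Y-H-i-s') . '.xlsx';

$writer = WriterFactory::create(Type::XLSX);
$writer->openToFile('_uploads/' . $fileName);

$stmt = $pdo->query('SELECT * FROM full_report');

// fetch() pulls one row at a time, so memory use stays flat
// no matter how large the table is.
while ($row = $stmt->fetch(PDO::FETCH_NUM)) {
    $writer->addRow($row);
}

$writer->close();
```

Note that with an unbuffered query the connection stays busy until every row has been consumed, so run any other queries on a separate connection.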