I am trying to fetch a huge amount of data from a MySQL table and export it as an XLSX file. I used the fetchAll() function, but I get:

Fatal error: Out of memory
$query = "SELECT * ..."; // rest of the query truncated in the original
$header = array('DATA','STORE','FROM','TO','DATE','YEAR','MONTH','ITEM','SIZE','DEPT','SUBDEPT','DESC1','DESC2','GENDER','ATTR','VEND','SEASON','INVO#','TRANS#','QTY','MSRP','RTP','COST','T.RTP','T.COST','PAYMENT','STATUS');
$result = $mDb->fetchAll(); // Here is where I get the error!
$fileName = "fullReport-" . date('m-d-Y-H-i-s') . ".xlsx";
$path = "_uploads/" . $fileName;
$writer = WriterFactory::create(Type::XLSX); // for XLSX files
$writer->openToFile($path); // write data to a file or to a PHP stream
$writer->openToBrowser($path); // stream data directly to the browser
$writer->addRow($header);
foreach ($result as $value) {
    $valuex = array_values($value);
    $writer->addRow($valuex);
}
$writer->close();
fetchAll is the problem. It retrieves all the matching rows from the table and loads everything into memory. That works when you don't have too many rows to fetch, but it causes Out of Memory errors when the number of rows to be stored exceeds the available memory.
To fix this problem, you should fetch your data in multiple chunks. Use the fetch method with a cursor instead; it is well documented in the PHP manual. You can also take a look at this repo: https://github.com/adrilo/spout-pdo-example. It gives you an example of using MySQL and Spout together in a scalable way.
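Here is a minimal sketch of the row-by-row approach with PDO. The `report` table and its columns are placeholders, and an in-memory SQLite database stands in for MySQL so the example is self-contained; the same fetch loop applies to a MySQL connection, where you would additionally disable client-side buffering as shown in the comment:

```php
<?php
// Stream rows one at a time instead of loading everything with fetchAll().
// With a real MySQL connection you would also turn off buffered queries,
// so that PDO does not pull the whole result set into memory at once:
//   new PDO($dsn, $user, $pass, [PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false]);
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Hypothetical table standing in for the real report table.
$pdo->exec('CREATE TABLE report (item TEXT, qty INTEGER)');
$pdo->exec("INSERT INTO report (item, qty) VALUES ('shirt', 3), ('shoes', 1)");

$stmt = $pdo->query('SELECT * FROM report');

$rows = 0;
// fetch() returns one row per call, so only a single row is held in
// memory at a time. This loop body is where $writer->addRow(...) would
// go when writing the XLSX file with Spout.
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    $rows++;
}
echo $rows; // number of rows streamed
```

Because each row is written out (or counted, here) before the next one is fetched, memory use stays flat no matter how large the table is.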