
PHP Getting out of memory

I am trying to insert data from a PostgreSQL database into a MySQL database. There is a large number of records that I need to import, but I always run into an out-of-memory error:

Out of memory (allocated 1705508864) (tried to allocate 222764 bytes)

I am using Laravel 5 to do this; here is the code:

// to avoid memory limit or time out issues
ini_set('memory_limit', '-1');
ini_set('max_input_time', '-1');
ini_set('max_execution_time', '0');

// this speeds up things a bit
DB::connection()->disableQueryLog();

$importableModels = [
    // array of Postgres model => MySQL model pairs
];

$failedChunks = 0;

foreach ($importableModels as $postGresModel => $mysqlModel) {

    $total = $postGresModel::count();
    $chunkSize = getChunkSize($total);

    // customize chunk size for certain tables to avoid the "too many placeholders" error
    if ($postGresModel === 'ApplicationFormsPostgres') {
        $chunkSize = 300;
    }

    $class = 'App\\Models\\' . $mysqlModel;
    $object = new $class;

    // truncate previous data
    DB::statement('SET FOREIGN_KEY_CHECKS=0;');
    $object->truncate();
    DB::statement('SET FOREIGN_KEY_CHECKS=1;');

    // $failedChunks is passed by reference so increments persist outside the closure
    $postGresModel::chunk($chunkSize, function ($chunk) use ($postGresModel, $mysqlModel, &$failedChunks, $object) {

        // make any adjustments
        $fixedChunk = $chunk->map(function ($item, $key) use ($postGresModel) {

            $appendableAttributes = $postGresModel::APPEND_FIELDS;
            $attributes = $item->getAttributes();

            // replace null/no values with empty string
            foreach ($attributes as $key => $attribute) {
                if ($attribute === null) {
                    $attributes[$key] = '';
                }
            }

            // add customized attributes and values
            foreach ($appendableAttributes as $appendField) {
                if ($appendField === 'ssn') {
                    $value = $attributes['number'];
                    $attributes[$appendField] = substr($value, 0, 4);
                } else {
                    $attributes[$appendField] = '';
                }
            }

            return $attributes;
        });

        // insert chunk of data in db now
        if (!$object->insert($fixedChunk->toArray())) {
            $failedChunks++;
        }
    });
}

The memory issue only appears after a large number of rows have been inserted, not before that.

I suspect something is wrong with the collection's map() function, or with the loops inside the map callback. I have even tried setting the memory and time limit settings to unlimited, but to no avail. Maybe I need to use reference variables or something, but I am not sure how.
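For what it's worth, this is the kind of reference-freeing pattern I had in mind (just a standalone sketch to illustrate the idea, not my actual import code):

```php
<?php
// Standalone sketch: drop references and force a GC cycle after
// each processed chunk, so PHP can actually reclaim the memory.
gc_enable();

$chunk = array_fill(0, 100000, str_repeat('x', 32)); // stand-in for a processed chunk
$before = memory_get_usage();

unset($chunk);        // release the only reference to the chunk
gc_collect_cycles();  // collect any leftover reference cycles

$after = memory_get_usage();
echo $after < $before ? "memory released\n" : "memory still held\n";
```

I am not sure whether doing this inside the chunk() callback would be enough, though.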

Can any optimizations be made in the above code to reduce memory usage?

Or, how do I efficiently import a large amount of data from a PostgreSQL database to MySQL through code?

Can anyone tell me what I am doing wrong here, or why the whole memory gets consumed?

PS: I am doing this on a local development machine with 4 GB RAM (Windows 8). PHP version: 5.6.16.
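In case it helps, this is the kind of per-chunk memory logging I could add to narrow down where the growth happens (formatBytes() is just a throwaway helper for readable output, not part of my import code):

```php
<?php
// Diagnostic sketch: log memory usage after each chunk to see
// where the growth accumulates. formatBytes() is a hypothetical
// helper, defined here only for readable log lines.

function formatBytes(int $bytes): string
{
    $units = ['B', 'KB', 'MB', 'GB'];
    $i = 0;
    while ($bytes >= 1024 && $i < count($units) - 1) {
        $bytes /= 1024;
        $i++;
    }
    return round($bytes, 2) . ' ' . $units[$i];
}

// Inside the chunk callback, one would log something like:
// error_log('after chunk: ' . formatBytes(memory_get_usage(true)));
echo formatBytes(memory_get_usage(true)), "\n";
```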


You definitely have a memory leak somewhere, most likely within $chunk->map() or $object->insert($fixedChunk->toArray()). We can only guess, because the implementations are hidden.

However, I would use generators as much as possible. The code might look something like the following:

function getAllItems() {
  $step = 2000;

  for ($offset = 0 ;; $offset += $step) {
    $q = "SELECT * FROM items_table LIMIT $offset, $step";

    // stop once the query returns no more rows
    if (! $items = Db::fetchAll($q)) {
      break;
    }

    foreach ($items as $i) {
      yield $i;
    }
  }
}

foreach (getAllItems() as $item) {
  // process/insert a single item here
}
I dare say that with generators you'll be able to import practically any amount of data from one database to another, since only one row is held in memory at a time.
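To keep the benefit of multi-row inserts on the MySQL side, the rows coming out of a generator can be re-grouped into fixed-size batches. A small sketch (chunkGenerator() is my own helper name, not a Laravel or framework API):

```php
<?php
// Group any iterable into fixed-size batches without materializing
// the whole input. chunkGenerator() is a hypothetical helper.
function chunkGenerator(iterable $items, int $size): \Generator
{
    $batch = [];
    foreach ($items as $item) {
        $batch[] = $item;
        if (count($batch) === $size) {
            yield $batch;
            $batch = [];
        }
    }
    if ($batch) {
        yield $batch; // final partial batch
    }
}

// Usage: each $batch would become one multi-row INSERT.
foreach (chunkGenerator(range(1, 10), 4) as $batch) {
    echo count($batch), "\n"; // 4, 4, 2
}
```

This way memory stays bounded by the batch size, while each INSERT still carries several hundred rows.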