dev02 - 3 months ago
PHP Question

Run PHP for longer time in separate processes

I have a directory that receives CSV files from a service, and I need to import them into a database. Each CSV file has 1000 rows, and there can be anywhere from 10 to 150 files at a time.

I want to insert the data from all of these CSV files into the database. The problem is that PHP dies with a timeout: even though I call set_time_limit(0), my host (siteground.com) imposes its own restrictions. Here is the code:

// just in case even though console script should not have problem
ini_set('memory_limit', '-1');
ini_set('max_input_time', '-1');
ini_set('max_execution_time', '0');
set_time_limit(0);
ignore_user_abort(1);
///////////////////////////////////////////////////////////////////

function getRow()
{
    $files = glob('someFolder/*.csv');

    foreach ($files as $csvFile) {
        $fh = fopen($csvFile, 'r');

        $count = 0;
        $headerRow = [];
        while ($row = fgetcsv($fh)) {
            $count++;

            // first row is the header: remember it, don't yield it
            if ($count === 1) {
                $headerRow = $row;
                continue;
            }

            // make sure the column count of header and actual row match
            if (count($headerRow) !== count($row)) {
                continue;
            }

            $rowWithHeader = array_combine($headerRow, $row);

            yield $rowWithHeader;
        }

        fclose($fh);
    }
}

foreach (getRow() as $row) {
    // fix row
    // now insert in database
}


This is actually a Command run through artisan (I am using Laravel). I know the CLI doesn't have time restrictions, but for some reason not all of the CSV files get imported and the process ends at a certain point in time.

So my question is: is there a way to invoke a separate PHP process for each CSV file present in the directory? Or some other way of doing this so that I am able to import all the CSV files without any issues, e.g. using PHP's generators?

Answer

You could use some bash magic. Refactor your script so that it processes one file only; the file to process is passed as an argument to the script, which you can access via $argv.

<?php
// just in case even though console script should not have problem
ini_set('memory_limit', '-1');
ini_set('max_input_time', '-1');
ini_set('max_execution_time', '0');
set_time_limit(0);
ignore_user_abort(1);
$file = $argv[1]; // file is the first and only argument to the script
///////////////////////////////////////////////////////////////////

function getRow($csvFile)
{
    $fh = fopen($csvFile, 'r');

    $count = 0;
    $headerRow = [];
    while ($row = fgetcsv($fh)) {
        $count++;

        // first row is the header: remember it, don't yield it
        if ($count === 1) {
            $headerRow = $row;
            continue;
        }

        // make sure the column count of header and actual row match
        if (count($headerRow) !== count($row)) {
            continue;
        }

        $rowWithHeader = array_combine($headerRow, $row);

        yield $rowWithHeader;
    }

    fclose($fh);
}

foreach (getRow($file) as $row) {
    // fix row
    // now insert in database
}

Now, call your script like this:

for file in /path/to/folder/*.csv; do php /path/to/your/script.php "$file"; done

This will execute your script once per .csv file in /path/to/folder, so each file runs in its own PHP process with a fresh execution-time budget.
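If running the files one after another is still too slow, the same idea can be parallelized. A minimal sketch using xargs -P (the script path and the concurrency level of 4 are assumptions; adjust them for your setup):

```shell
# One CSV path per line; xargs then forks a separate php process per file
# (-n 1 = one argument per invocation), running up to 4 imports at once (-P 4).
printf '%s\n' /path/to/folder/*.csv | xargs -n 1 -P 4 php /path/to/your/script.php
```

Note that a shared host like SiteGround may also limit the number of concurrent processes, so keep the -P value modest.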