Joel Joel Binks - 1 month ago
PHP Question

Running php script as cron job - timeout issues?

I am writing a PHP script that does some back end work and needs to run every 8 hours or so. The script takes a while to execute. For the hell of it, I tried it from my browser and the connection to the server gets reset well before the script finishes. My question is: if I run it directly as a cron job, i.e. php -f file.php, are there any internal time constraints on execution? This script may take 2-5 minutes to complete and cannot be interrupted. I've never done this before, so I am not sure whether PHP has quirks when running heavy scripts.

Answer

As mentioned above, CLI scripts have no time limit by default (max_execution_time is 0 when PHP runs from the command line).
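For reference, here is a minimal sketch of that setup; the script name and path are placeholders, not from the question. The schedule is just a crontab entry, and the script itself can confirm that no time limit applies:

<?php
// my-job.php - hypothetical long-running maintenance script
// crontab entry to run it every 8 hours (edit with: crontab -e):
//   0 */8 * * * /usr/bin/php /path/to/my-job.php

// On the CLI this typically prints 0, i.e. unlimited execution time:
echo 'max_execution_time: ' . ini_get('max_execution_time') . PHP_EOL;

set_time_limit(0); // redundant on a default CLI setup, but harmless if the ini value was overridden

// ... the 2-5 minute back end work goes here ...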

But I would also like to mention an alternative to your cron job approach:
You can fork a CLI PHP script from a PHP script running under webserver control. I have done this many times. It is especially useful when a script with a long execution time must be triggered by a website user action (e.g. building a very large archive file and sending a download link by email when the file is complete). I usually fork the CLI script from the webserver PHP script with the popen() function, which also makes it easy to pass parameters to the new script instance, like this:

// launch the CLI script and open a pipe to its stdin
$bgproc = popen('php "/my/path/my-bckgrnd-proc.php"', 'w');
if($bgproc === false){
  die('Could not open bgrnd process');
}else{
  // send params through the stdin pipe to the bgrnd process,
  // one serialized value per line:
  $p1 = serialize($param1);
  $p2 = serialize($param2);
  $p3 = serialize($param3);
  fwrite($bgproc, $p1 . "\n" . $p2 . "\n" . $p3 . "\n");
  pclose($bgproc);
}

In the CLI script you would receive these parameters like this...

$fp = fopen('php://stdin', 'r');
// read one serialized parameter per line, in the same order they were written
$param1 = unserialize(fgets($fp));
$param2 = unserialize(fgets($fp));
$param3 = unserialize(fgets($fp));
fclose($fp);

...and do anything with them that would take too long under webserver control.

This technique works equally well in *nix and Windows environments.
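One follow-up note, not part of the original answer: pclose() waits for the launched command to terminate, so as written the web request still blocks until the CLI script exits. If the request must return immediately, one possible *nix-only variant is to background the command and pass the parameters as escaped arguments instead of via stdin (the shell detaches a backgrounded command's stdin from the pipe). This is only a sketch; the path is the same placeholder as above.

// webserver side: fire-and-forget launch (hypothetical variant, *nix shells only)
$cmd = sprintf(
  'php %s %s %s %s > /dev/null 2>&1 &',
  escapeshellarg('/my/path/my-bckgrnd-proc.php'),
  escapeshellarg(serialize($param1)),
  escapeshellarg(serialize($param2)),
  escapeshellarg(serialize($param3))
);
exec($cmd); // the shell backgrounds php and exits, so this returns right away

// CLI side: read the arguments instead of stdin
$param1 = unserialize($argv[1]);
$param2 = unserialize($argv[2]);
$param3 = unserialize($argv[3]);

Keep in mind that command-line arguments are visible in the process list and subject to length limits, so the stdin approach above remains preferable for large or sensitive parameters.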
