I have a script in a folder with no relevant execution-time limits (I regularly run hour-long scripts from this directory). The script is called from an HTML form on a CodeIgniter view; it uploads a CSV file, reads it, and updates database entries with the information contained within.
With a smaller CSV file (1-1000 entries) it works fine. Anything over that processes about 1000 entries (including echoing debug text to the screen) and THEN throws a 404 error.
I've since attempted to circumvent the error by having the post page make an AJAX call instead, but it does the exact same thing.
"NetworkError: 404 Not Found - xxxx/xxxx/xxxxx.php"
I have confirmed that the script is running prior to the 404 by checking for changes in the database and seeing echoed debug output on the screen.
Code that is executing when it throws the 404...
(In short: update id (first column) to have value (second column) from the CSV file.)

$file = fopen($targetdir, 'r');
$count = 0;
while (($row = fgetcsv($file)) !== false) {
    $id = $row[0];
    $value = $row[1];
    if (is_numeric($id) && $id > 9999 && $value != '-') {
        // echo "UPDATE xxxx SET xxxxx ='".$value."' WHERE id=".$id."<br/>";
        $db->query("UPDATE xxxx SET xxxxx ='".$value."' WHERE id=".$id);
        $count++;
    }
}
fclose($file);
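As an aside, building SQL by string concatenation like this is injection-prone. CodeIgniter's query bindings handle the quoting for you (a sketch, assuming the same `$db` connection and the placeholder table/column names above):

```php
// '?' placeholders are escaped automatically by CodeIgniter's query bindings
$db->query("UPDATE xxxx SET xxxxx = ? WHERE id = ?", array($value, $id));
```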
I found the setting that was tripping. Rather than the standard PHP max_execution_time, the timeout being hit was in the FastCGI (mod_fcgid) settings, triggered either by the AJAX post or by the long read operation.
Either way, I increased FcgidIOTimeout in my Apache settings on the server and was then able to execute the full script.
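For reference, the directive lives in the mod_fcgid configuration (a sketch; the 300-second value is an assumption, tune it to your longest expected run):

```apache
# httpd.conf or the mod_fcgid conf file
<IfModule mod_fcgid.c>
    # Seconds mod_fcgid waits for I/O from the PHP process
    # before aborting the request (default is 40)
    FcgidIOTimeout 300
</IfModule>
```

Restart Apache after changing it for the new timeout to take effect.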