Bram Vanroy - 2 years ago
Javascript Question

Efficient and user-friendly way to present slow-loading results

I have read many similar questions concerning cancelling a POST request with jQuery, but none seem to be close to mine.

I have your everyday form that has a PHP-page as an action:

<form action="results.php" method="post">
    <input name="my-input" type="text">
    <input type="submit" value="submit">
</form>

The processing on the server side, based on the POST data from the form, takes a long time (30 seconds or even more, and we expect this to increase because our search space will grow in the coming weeks). We are accessing a BaseX server (version 7.9, not upgradable) that contains all the data. User-generated XPath code is submitted in the form, and the action URL then sends the XPath code to the BaseX server, which returns the results. From a usability perspective, I already show a "loading" screen so users at least know that the results are being generated:

$("form").submit(function() {
    $("#overlay").show();
});

where the overlay element is:

<div id="overlay"><p>Results are being generated</p></div>

However, I would also like to give users the option to press a button to cancel the request, and to cancel the request when a user closes the page. Note that in the former case (on button click), the user should stay on the same page, be able to edit their input, and immediately re-submit their request. It is paramount that when they cancel the request, they can also immediately resend it: the server should really abort, and not finish the old query before being able to process a new one.
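For the page-close case specifically, one possible sketch is shown below. It assumes a hypothetical abort.php endpoint on the server (not part of the code above), and uses navigator.sendBeacon, which browsers keep alive during unload when ordinary Ajax may be dropped. The sender function is injected so the wiring can be exercised outside a browser:

```javascript
// Sketch: signal the server when the user leaves the page while a query is
// still running. 'abortUrl' is a hypothetical abort endpoint; 'sendBeacon'
// would be navigator.sendBeacon.bind(navigator) in the browser.
function makeUnloadAbort(sendBeacon, abortUrl) {
  var active = false; // is a query currently in flight?
  return {
    setActive: function (state) { active = state; },
    onPageHide: function () {
      if (!active) { return false; }
      active = false;
      return sendBeacon(abortUrl); // fire-and-forget abort signal
    }
  };
}
```

In the browser this would be wired up roughly as: var unloadAbort = makeUnloadAbort(navigator.sendBeacon.bind(navigator), "abort.php"); window.addEventListener("pagehide", unloadAbort.onPageHide); and the submit handler would call unloadAbort.setActive(true).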

I figured something like this:

$("form").submit(function() {
    $("#overlay").show();
    $("#overlay button").click(abortRequest);
});

function abortRequest() {
    // abort correct request
}

with the overlay element:

<div id="overlay">
    <p>Results are being generated</p>
    <button>Cancel</button>
</div>

But as you can see, I am not entirely sure how to fill in abortRequest() to make sure the POST request is aborted and terminated, so that a new query can be sent. Please fill in the blanks! Or would I need to prevent the default form submission and instead do an ajax() call from jQuery?
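One way the ajax() route could look is sketched below. The ajax function is injected so the logic can be tested; in the browser you would pass $.ajax. The abort.php endpoint is an assumption here, not something the existing code provides:

```javascript
// Keeps a handle to the pending query so it can be aborted and re-sent.
function createQueryController(ajax) {
  var pending = null; // the jqXHR of the query in flight, if any
  return {
    submit: function (data) {
      if (pending) { this.abort(); }            // cancel any previous query first
      pending = ajax({ url: "results.php", type: "post", data: data });
      return pending;
    },
    abort: function () {
      if (!pending) { return false; }
      pending.abort();                          // drop the client-side request
      pending = null;
      ajax({ url: "abort.php", type: "post" }); // ask the server to stop too
      return true;
    }
  };
}
```

In the browser: var ctl = createQueryController($.ajax); $("form").submit(function (e) { e.preventDefault(); ctl.submit($(this).serialize()); }); $("#overlay button").click(function () { ctl.abort(); });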

As I said, I also want to stop the process server-side, and from what I read I would need some way of detecting the abort on the server for this. But how can I trigger that from another PHP function? For example, let's say that in results.php I have a processing script and I need to exit that script. Would I do something like this?

if (isset($_POST['my-input'])) {
    $input = $_POST['my-input'];
    processData($input);
}

function processData($input) {
    // A lot of processing
}

if (isset($_POST['terminate'])) {
    terminateProcess();
}

function terminateProcess() {
    // exit processData()
}

and then do a new ajax request when I need to terminate the process?

$("#overlay button").click(abortRequest);

function abortRequest() {
    $.ajax({
        url: 'results.php',
        data: {terminate: true},
        type: 'post',
        success: function() { alert("terminated"); }
    });
}

I did some more research and I found this answer. It mentions connection_aborted() and also session_write_close() and I'm not entirely sure which is useful for me. I do use SESSION variables, but I don't need to write away values when the process is cancelled (though I would like to keep the SESSION variables active).

Would this be the way? And if so, how do I make one PHP function terminate the other?

I have also read about WebSockets, and they seem like something that could work, but I don't like the hassle of setting up a WebSocket server, as this would require me to contact our IT guy, who requires extensive testing of new packages. I'd rather keep it to PHP and JS, without third-party libraries other than jQuery.

Considering most comments and answers suggest that what I want is not possible, I am also interested to hear alternatives. The first thing that comes to mind is paged Ajax calls (similar to many web pages that serve search results, images, what-have-you in an infinite scroll). A user is served a page with the first X results (e.g. 20), and when they click a "show next 20 results" button, the next 20 are appended. This process can continue until all results are shown. Because it is useful for users to get all results, I will also provide a "download all results" option. This will then take very long as well, but at least users should be able to go through the first results on the page itself. (The download button should thus not disrupt the paged Ajax loads.) It's just an idea, but I hope it gives some of you some inspiration.
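The paged idea above could be sketched client-side as a small offset tracker. This assumes results.php were extended to accept offset/limit parameters, which the current script does not:

```javascript
// Tracks which slice of results to request next.
function createPager(pageSize) {
  var offset = 0;
  return {
    nextParams: function () {           // parameters for the next page
      var params = { offset: offset, limit: pageSize };
      offset += pageSize;
      return params;
    },
    reset: function () { offset = 0; }  // e.g. when a new query is submitted
  };
}
```

Each "show next 20 results" click would then send pager.nextParams() along with the query and append the returned rows, while the separate "download all results" request ignores the pager entirely.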

Answer Source

Himel Nag Rana demonstrated how to cancel a pending Ajax request. Several factors may interfere and delay subsequent requests, as I have discussed earlier in another post.

TL;DR:

  1. It is very inconvenient to try to detect, from within the long-running task itself, that the request was cancelled.

  2. As a workaround, you should close the session (session_write_close()) as early as possible in your long-running task, so as not to block subsequent requests.

connection_aborted() cannot be used (yet, stay tuned). This function is supposed to be called periodically during a long task (typically, inside a loop). Unfortunately there is just one single significant, atomic operation in your case: the query to the data back end.

If you applied the procedures advised by Himel Nag Rana and myself, you should now be able to cancel the Ajax request and immediately allow a new request to proceed. The only remaining concern is that the previous (cancelled) request may keep running in the background for a while (not blocking the user, just wasting resources on the server).

The problem could be rephrased to "how to abort a specific process from the outside".

As Christian Bonato rightfully advised, here is a possible implementation. For the sake of the demonstration I will rely on Symfony's Process component, but you can devise a simpler custom solution if you prefer.

The basic approach is:

  1. Spawn a new process to run the query, save the PID in session. Wait for it to complete, then return the result to the client

  2. If the client aborts, it signals the server to just kill the process.

<?php // query.php

use Symfony\Component\Process\PhpProcess;

session_start();

if (isset($_SESSION['queryPID'])) {
    // A query is already running for this session.
    // As this should never happen, you may want to raise an error instead
    // of just silently killing the previous query.
    posix_kill($_SESSION['queryPID'], SIGKILL);
    unset($_SESSION['queryPID']);
}

$queryString = parseRequest($_POST);

$process = new PhpProcess(sprintf(
    '<?php $result = runQuery(%s); echo fetchResult($result);',
    var_export($queryString, true)
));
$process->start();

$_SESSION['queryPID'] = $process->getPid();
session_write_close(); // release the session so abort.php is not blocked

$process->wait();
$result = $process->getOutput();
echo formatResponse($result);


<?php // abort.php

session_start();

if (isset($_SESSION['queryPID'])) {

    $pid = $_SESSION['queryPID'];
    posix_kill($pid, SIGKILL);
    unset($_SESSION['queryPID']);
    echo "Query $pid has been aborted";

} else {

    // there is nothing to abort, send an HTTP error code
    header($_SERVER['SERVER_PROTOCOL'] . ' 599 No pending query', true, 599);
}



// javascript
function abortRequest(pendingXHRRequest) {
    pendingXHRRequest.abort(); // cancel the client-side request
    $.ajax({
        url: 'abort.php',
        success: function() { alert("terminated"); }
    });
}

Spawning a process and keeping track of it is genuinely tricky, which is why I advised using existing modules. Integrating just one Symfony component should be relatively easy via Composer: first install Composer, then the Process component (composer require symfony/process).

A manual implementation could look like this (beware, this is untested, incomplete and possibly unstable, but I trust you will get the idea):

<?php // query.php

session_start();

$queryString = parseRequest($_POST); // $queryString should be escaped via escapeshellarg()

$processHandler = popen("/path/to/php-cli/php asyncQuery.php $queryString", 'r');

// fetch the first line of output, PID expected
$pid = (int) fgets($processHandler);
$_SESSION['queryPID'] = $pid;
session_write_close(); // release the session so abort.php is not blocked

// fetch the rest of the output
while ($line = fgets($processHandler)) {
    echo $line; // or save this line for further processing, e.g. through json_encode()
}
pclose($processHandler);

<?php // asyncQuery.php

    // echo the current PID
    echo getmypid() . PHP_EOL;

    // then execute the query and echo the result
    $result = runQuery($argv[1]);
    echo fetchResult($result);
