26

For example, I have a very simple PHP script that updates some tables in a database, but this process takes a long time (maybe 10 minutes). I want the script to continue processing even if the user closes the browser, because sometimes users do not wait: they close the browser or go to another webpage.

1
  • So if I want to de-couple the browser request from the execution of the job, how can I request another PHP script while the user's current request continues normally? When the user requests any URL, that request should launch another script (which works independently of the user's browser and other requests), and the user should continue to surf the site normally. Commented Nov 1, 2009 at 22:09

8 Answers

22

If the task takes 10 minutes, do not use a browser to execute it directly. You have lots of other options:

  • Use a cronjob to execute the task periodically.
  • Have the browser request insert a new row into a database table, so that a regular cronjob can process the new row and execute the PHP script with the appropriate arguments.
  • Have the browser request write a message to a queue system, which has a subscriber listening for such events (and which then executes the script).

While some of these suggestions are probably overkill for your situation, the key unifying feature is to de-couple the browser request from the execution of the job, so that the job can be completed asynchronously.
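The second bullet point can be sketched without a database at all. Here is a minimal, file-based version of the same idea, in which the page handler only records the job and returns immediately; the spool directory, function name, and argument names are all hypothetical, so adapt them to your application.

```php
<?php
// Sketch: the browser request only records the job, it never runs it.
// A cron worker picks up the spooled files later.
function enqueue_job(string $spoolDir, array $args): string
{
    if (!is_dir($spoolDir)) {
        mkdir($spoolDir, 0775, true);
    }
    $id = uniqid('job_', true);
    // One file per job keeps this safe without any locking.
    file_put_contents("$spoolDir/$id.json", json_encode($args));
    return $id;
}

// In the page handler: record the work, then return to the user immediately.
$jobId = enqueue_job(sys_get_temp_dir() . '/job-spool',
                     ['table' => 'stats', 'action' => 'rebuild']);
echo "Queued as $jobId\n";
```

The same shape works with a database table instead of files: the insert replaces `file_put_contents`, and the cron worker selects unprocessed rows.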

If you need the browser window updated with progress, you will need to use a periodically-executed AJAX request to retrieve the job status.


4 Comments

So if I want to de-couple the browser request from the execution of the job, how can I request another PHP script while the user's current request continues normally? When the user requests any URL, that request should launch another script (which works independently of the user's browser and other requests), and the user should continue to surf the site normally. How can I do this?
See those bullet points? Those are all ways to do this. In most cases, setting up a little database table that holds inputs for the background process works fine. Then you can kick off some PHP script to execute the jobs, either via cron, or some other method.
You stick a job record on the queue in the database. That record can contain any data the user gave you. The cron job looks at the queue, grabs the next job, reads the arguments the user gave, and then uses those inputs to do its job. Read David's second bullet point carefully. It's exactly what you want to do.
How can I execute the PHP URL (not a script) that was recorded in the new database row, so the regular cron job can process it? I ask because I use a framework, so the job is a URL rather than a standalone script.
19

To answer your question directly, see ignore_user_abort

More broadly, you probably have an architecture problem here.

If many users can initiate this stuff, you'll want the web application to add jobs to some kind of queue, and have a set number of background processes that chew through all the work.
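One way to sketch those background processes, assuming the file-per-job spool convention from the other answer, is a worker that claims each job by renaming its file first, so several workers can run in parallel without processing the same job twice. All names and paths here are illustrative.

```php
<?php
// Sketch of a queue worker: drain spooled job files one at a time.
// Run it from cron or as a supervised daemon, never from the web tier.
function run_pending_jobs(string $spoolDir, callable $handler): int
{
    $done = 0;
    foreach (glob("$spoolDir/*.json") ?: [] as $file) {
        // Rename first so two workers never grab the same job.
        $claimed = $file . '.working';
        if (!@rename($file, $claimed)) {
            continue; // another worker got it
        }
        $args = json_decode(file_get_contents($claimed), true);
        $handler($args);   // the slow 10-minute work happens here
        unlink($claimed);  // job finished, remove it from the queue
        $done++;
    }
    return $done;
}
```

A "set number" of workers is then just a fixed number of cron entries or daemon processes all calling `run_pending_jobs` on the same spool directory.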

Comments

6

The PHP script will keep running after the client terminates the connection (not doing so would be a security risk), but only up to max_execution_time (set in php.ini or through a PHP script; generally 30 seconds by default).

For example:

<?php
    // Writes one value per second; watch the file to see the
    // script outlive the browser connection.
    $fh = fopen("bluh.txt", 'w');
    for ($i = 0; $i < 20; $i++) {
        echo $i . "<br/>";
        fwrite($fh, $i . "\n");
        sleep(1);
    }
    fclose($fh);
?>

Start running that in your browser and close the browser before it completes. You'll find that after 20 seconds the file contains all of the values of $i.

Change the upper bound of the for loop to 100 instead of 20, and you'll find it only runs from 0 to 29. Because of PHP's max_execution_time the script times out and dies.
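If you do need a single request to run past that limit (and your host allows overriding it, which cheap shared hosts often do not), you can lift both limits at the top of the script. This is a sketch of the two relevant calls, not a recommendation over the queue-based approaches in the other answers:

```php
<?php
// Lift both limits so a loop like the one above could finish all 100 writes.
// Assumes the host permits overriding these settings.
ignore_user_abort(true);   // keep executing after the client disconnects
set_time_limit(0);         // 0 disables max_execution_time entirely

echo "limits lifted\n";
```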

2 Comments

Nice answer. But, instead of writing to a file, what if the script tries to flush output to the browser - e.g., using echo "..."; flush(); ob_flush();? I think the script would then terminate as the output stream would no longer be available.
@ban-geoengineering I first thought you were correct, but I tested your suggestion (the above code already uses `echo` every second), and on my system the script did not abort when the tab was closed. Strange.
2

If the script is completely server-based (no feedback to the user), it will run to completion even if the client has closed the browser.

The general architecture of PHP is that a client sends a request to a script, which returns a reply to the user. If nothing is sent back to the user, the script still executes even if the user is no longer on the other side. Put more simply: there is no constant connection between server and client during a regular script.

Comments

1

You can make the PHP script run every 20 minutes using a crontab file, which specifies the schedule and the command to run; in this case, the command would be the PHP script.
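A crontab entry has five schedule fields followed by the command. A hypothetical entry for every 20 minutes might look like the following; the binary and script paths are examples only, so point them at your real PHP binary and worker script.

```
# m    h    dom  mon  dow  command
*/20   *    *    *    *    /usr/bin/php /var/www/app/run_jobs.php >> /var/log/run_jobs.log 2>&1
```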

Comments

0

Yes. The server doesn't know if the user closed the browser. At least it doesn't notice that immediately.

No: the server probably (depending on how it is configured) won't allow a PHP script to run for 10 minutes. On cheap shared hosting, I wouldn't rely on a script running for longer than a reasonable response time.

Comments

0

From the documentation:

The default behaviour is however for your script to be aborted when the remote client disconnects. This behaviour can be set via the ignore_user_abort php.ini directive as well as through the corresponding php_value ignore_user_abort Apache httpd.conf directive or with the ignore_user_abort() function.

So by default, PHP will not continue running after the user closes the browser.

To change this on a one-off basis, use:

ignore_user_abort(true);

To change the default behavior for the entire application, add to your php.ini file:

ignore_user_abort = On

The documentation page also mentions that a function registered with the register_shutdown_function() function will always be run, and a timeout can also cause a script to end early. See the documentation for more info on those scenarios.
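Putting those documentation pieces together, a sketch might look like this; the log message and its placement are illustrative, not prescribed by the documentation.

```php
<?php
// Sketch: keep running after disconnect, and register a hook that runs
// whether the script finishes normally, times out, or is aborted.
ignore_user_abort(true);

register_shutdown_function(function (): void {
    // Runs last in every case: a good place to release locks or
    // record the job's final status in your own bookkeeping.
    error_log('script finished or was terminated');
});

// ... long-running work goes here ...
echo "working\n";
```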

For long running tasks

In addition to timeouts, remember that other things can interrupt your script, such as hitting resource limits or the server or Apache/Nginx being restarted. For tasks that take hours, having to restart such a long-running task from scratch can be quite painful. If you have such a task, consider breaking it down into smaller tasks that can be queued up and then executed by code started from a cron job, by a service listening to events from a message broker, or by a background daemon. See best ways to manage long running PHP scripts.
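As one illustration of breaking the work down, here is a sketch of a restart-safe batch loop that persists a cursor between runs, so a crashed or restarted worker resumes where it left off. The cursor file, batch size, and callback are all assumptions for the sketch.

```php
<?php
// Sketch: process a large task in small batches, persisting progress
// so a restarted worker picks up where the last run stopped.
function process_in_batches(string $cursorFile, int $total, int $batchSize, callable $work): int
{
    // Resume from the saved cursor, or start at 0 on the first run.
    $offset = is_file($cursorFile) ? (int) file_get_contents($cursorFile) : 0;
    while ($offset < $total) {
        $end = min($offset + $batchSize, $total);
        $work($offset, $end);                              // handle rows [$offset, $end)
        $offset = $end;
        file_put_contents($cursorFile, (string) $offset);  // survive restarts
    }
    return $offset;
}
```

Each batch should be small enough to finish well within any timeout, so an interruption only ever costs you one batch of work.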

Comments

-1

A server-side script will carry on with what it is doing regardless of what the client is doing.

EDIT: By the way, are you sure that you want pages that take 10 minutes to open? I suggest you employ a task queue (whose items are executed by cron on a timely basis) and redirect the user to an "OK, I am on it" page.

4 Comments

... not totally true: usually, there are guards to "kill" long-running php scripts: these take the form of crontab entries.
@jldupont: I am no PHP guy, but do these scripts have anything to do with the client?
@shanyu - not sure what jldupont is talking about. There are configuration variables for php that can kill long-running scripts, but they have absolutely nothing to do with cron.
This is not true, a server-side script does not always carry on going regardless of client...
