What I need to do is process multiple requests that are pending in a database, using PHP.

What I'm currently trying to do is: when my cronjob runs, I want to call a file, "process_a.php", 10 times instantly, without waiting for it to finish processing (process_a.php could take a few minutes).

I tried to do this using cURL, but as soon as my cronjob calls process_a.php, it waits for it to finish processing and return before calling the next process_a.php.

I even tried putting code in process_a.php to close the connection immediately and then continue processing in the background, but the cronjob still waited for it to finish.
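(For reference, the usual close-the-connection trick looks something like the sketch below. Whether it actually detaches depends on the SAPI and any output buffering or compression in front of PHP; under FastCGI, fastcgi_finish_request() is the more reliable route.)

```php
<?php
// Sketch of the "close the connection, keep working" trick described above.
// Under mod_php this often doesn't fully detach; under PHP-FPM,
// fastcgi_finish_request() is the dependable alternative.
ignore_user_abort(true);                      // keep running after the client disconnects
ob_start();
echo 'OK';                                    // whatever the caller should see
header('Connection: close');
header('Content-Length: ' . ob_get_length());
ob_end_flush();                               // flush and close the buffer
flush();                                      // push the response out to the client

// ... long-running work continues here ...
```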

I just want the same file executed 10 times at once, you know, as if 10 different users were each requesting the index.php page of my website... Any ideas!?
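(For context, the effect being asked for, firing off ten executions without blocking on any of them, can also be sketched from the cron script itself, assuming a Unix host with the php CLI in the PATH:)

```php
<?php
// Fire-and-forget: launch process_a.php ten times without waiting.
// Redirecting output and backgrounding with & makes exec() return
// immediately instead of blocking until each script finishes.
for ($i = 0; $i < 10; $i++) {
    exec('php /path/to/process_a.php > /dev/null 2>&1 &');
}
```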

3 Answers

As @Brad said, curl_multi_exec should be an option:

http://php.net/manual/en/function.curl-multi-exec.php

<?php
// create the multi cURL handle
$mh = curl_multi_init();

// create the ten cURL resources
for ($i = 0; $i < 10; $i++) {
    $ch[$i] = curl_init();
    curl_setopt($ch[$i], CURLOPT_URL, "http://urhost/path/to/process_a.php");
    curl_setopt($ch[$i], CURLOPT_HEADER, 0);
    curl_multi_add_handle($mh, $ch[$i]); // add each handle to the multi handle
}

$active = null;

//execute the handles
do {
    $mrc = curl_multi_exec($mh, $active);
} while ($mrc == CURLM_CALL_MULTI_PERFORM);

while ($active && $mrc == CURLM_OK) {
    if (curl_multi_select($mh) != -1) {
        do {
            $mrc = curl_multi_exec($mh, $active);
        } while ($mrc == CURLM_CALL_MULTI_PERFORM);
    }
}

// remove and close the handles
for ($i = 0; $i < 10; $i++) {
    curl_multi_remove_handle($mh, $ch[$i]);
    curl_close($ch[$i]);
}
curl_multi_close($mh);

?>

I tested this script by having it call the script below:

<?php 
print microtime();//Return current Unix timestamp with microseconds 
print '<br>';
?>

and here are the results; the handles differ by only microseconds in execution time:

0.27085300 1340214659
0.44853600 1340214659
0.46611800 1340214659
0.48201000 1340214659
0.50209400 1340214659
0.48233900 1340214659
0.52274300 1340214659
0.54757800 1340214659
0.57316900 1340214659
0.59475800 1340214659

10 Comments

What is the path that you gave? If you gave the full URL, localhost/path/to/script.php, how will it know whether this is the same server or not? Also, the way to test this is by printing the time in milliseconds at the top of process_a.php; then you can check whether the pages are called one at a time or not.
I think my server is detecting multiple requests from the same user and doesn't allow it. Like if I load the index page 5 times in my browser, they load one by one. It's like a session thing? idk D: I'll check to make sure it's not some other problem though...
An easy test would be to create a time.php with contents <?php print microtime(); //Return current Unix timestamp with microseconds ?> and call this script with the cURL multi handle. Check the time difference between the different handles...
@StephenSarcsamKamenar: Check out my edited answer with test results.
Interesting. When I try this from my cronjob, calling my process_a.php file, they execute one after the other with about a 10-second delay. When I run a simple test like you showed, it works perfectly... I'll test some more and let you know when I have more information as to why my cronjob version isn't working right...
Fork into 10 processes:

<?php
for ($i = 0; $i < 10; $i++) {
    $pid = pcntl_fork();
    if ($pid == -1) {
        trigger_error('could not fork');
    } else if (!$pid) {
        // we are the child now
        require 'process_a.php';
        exit; // make sure the child doesn't continue the loop and fork again
    } else {
        // we are the parent; you could do something here if need be
    }
}

... but process_a.php could do anything your website does, so rather than calling a page, why not do the actual work the page request would result in? And let the webserver just continue to be, you know, a webserver instead of a bloated script repo.
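The suggestion in the last paragraph might look something like this (a sketch only: the requests table, its columns, and process_request() are hypothetical stand-ins, and an in-memory SQLite database stands in for whatever database holds the pending requests):

```php
<?php
// Do the work directly in the cron script instead of making HTTP calls.
// Table/column names are hypothetical; SQLite stands in for the real DB.
$db = new PDO('sqlite::memory:');
$db->exec('CREATE TABLE requests (id INTEGER PRIMARY KEY, status TEXT)');
$db->exec("INSERT INTO requests (status) VALUES ('pending'), ('pending')");

function process_request(array $row): void {
    // ... the work process_a.php would have done goes here ...
}

// fetch the pending batch first, then process and mark each row done
$pending = $db->query("SELECT * FROM requests WHERE status = 'pending'")->fetchAll();
foreach ($pending as $row) {
    process_request($row);
    $db->exec("UPDATE requests SET status = 'done' WHERE id = " . (int)$row['id']);
}
```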

7 Comments

pcntl_fork isn't available by default. Do you think it's worth installing? And would this work if I have 7 different files that I want to each call 10 times?
I'm using cURL because the processing logic might be on a different server at some point.
I have never seen a server which DID have cronjobs but didn't have the capability of pcntl_fork. If you need to install this separately, then yes, I'd say it's worth it. If you want 10 x 7 different files, sure, this would work too (but fork 70 times then).
If you are preparing for a really distributed application, I would rather look at Gearman (which can even pool servers) than at cURL solutions (but that entails more installs).
Process Control support in PHP is not enabled by default. You have to compile the CGI or CLI version of PHP with the --enable-pcntl configuration option to enable it. Note: currently, this module will not function on non-Unix platforms (Windows). php.net/manual/en/ref.pcntl.php

Do you have full cron available, or are you only able to specify php files?

You might be able to use xargs with the -P argument to fork 10 processes:

seq 1 10 | xargs -n1 -P10 php /path/to/file.php

(seq prints the numbers 1 through 10; xargs, with -n1, starts one php /path/to/file.php invocation per number, keeping up to 10 of them running in parallel.)

1 Comment

What does this mean/do? Are you saying I should put this instead of calling my cronjob.php? I need cronjob.php to run to decide which requests should be processed at this time.
