
I have a piece of PHP code that executes multiple curl requests using the exec() function. The goal is to download a lot of files simultaneously while keeping track of the PIDs as well as the progress. A simplified version looks like this:

<?php
// some code here
unset($pid);
for ($i = 1; $i <= 100; $i++) {
    exec("nohup curl '{$url[$i]}' -o output-$i.dat > log/$i.log & echo $!", $pid[$i]);
}
// some code here
?>

where

$url[] is an array containing a lot of URLs

$pid[] is an array containing the PID of each curl process. I need this to check whether a process is finished, then perform other tasks.

output-i.dat is the downloaded file

log/i.log is a text file containing the progress output generated by curl on the CLI. I need this to make sure the file is 100% downloaded and the connection was not lost midway

The reason I need to use nohup is to obtain the PID; without nohup I cannot get the PID from echo $!.

This script works and achieves what I need; however, when I run it from the CLI with php download.php, the screen gets flooded with

nohup: redirecting stderr to stdout
nohup: redirecting stderr to stdout
......
nohup: redirecting stderr to stdout
nohup: redirecting stderr to stdout

I want to know if there is any way to pipe this output to /dev/null.

I tried to include > /dev/null 2>&1 in the PHP command like this:

exec("nohup curl '".$url[$i]."' -o output-".$i.".dat > log/".$i.".log > /dev/null 2>&1 & echo $!",$pid[$i]);

but it does not work. Neither does this:

exec("nohup curl '".$url[$i]."' -o output-".$i.".dat > log/".$i.".log 2>/dev/null & echo $!",$pid[$i]);

I was hoping nohup had a quiet switch, but it does not seem to have one.

  • Just to suggest an alternative, you have curl_multi_init(). Commented Sep 14, 2023 at 10:26
  • Thanks for the suggestion. It may work, but from the PHP documentation curl_multi_exec will wait until all individual curl handles have completed before moving on. In my case I want a new instance to start once an existing one completes. Imagine having to download 10000 files: curl_multi_exec will have a problem if one link is very slow, because all the other 99 instances will be idle. By calling them in exec() I can immediately start a new instance when one completes. Commented Sep 14, 2023 at 10:59
  • I don't have personal experience with it but, according to the docs, that is not how it works. The only possible drawback may be that your script cannot end or abort until it's all done, while nohup commands will run in the background no matter what. Commented Sep 14, 2023 at 11:45

1 Answer


This gets the PID and the output, and hides the nohup: redirecting stderr to stdout message:


$o = array();
$output = exec("nohup curl -h 2> /dev/null & echo $!", $o);

$pid = $o[0];

This gets the PID, writes the status to a log file, and hides the nohup message:


$o = array();
exec("nohup curl -h > log.log 2> /dev/null & echo $!", $o);

$pid = $o[0];

This gets the PID, writes the output to a file and the status to a log, and hides the nohup message:


$o = array();
exec("nohup curl example.com -o output --stderr log.log 2> /dev/null & echo $!", $o);

$pid = $o[0];

And finally, to integrate it with your code:

$i = 0;
$url = array('example.com');
$pid = array();

$o = array();
exec("nohup curl '$url[$i]' -o output-$i.dat --stderr log-$i.log 2> /dev/null & echo $!", $o);

$pid[$i] = $o[0];

3 Comments

Thanks. Let me try it and see how it behaves.
I've updated the answer, I think the last option might be most suited to what you want.
Just tried your last method. It runs without the nohup messages, BUT the curl commands were executed sequentially (one only starts after the previous download is completed) instead of having 100 instances running simultaneously, even though the & is present. Perhaps stderr cannot be redirected to multiple files at the same time, which is why the next command needs to wait until stderr is free again before it can execute? Unfortunately it does not work for me.
