I run PHP from bash, looping over each line in a text file.
This is my code:
#!/bin/bash
while IFS= read -r line
do
    echo "scrape: $line"
    php index.php "$line" >> output.csv
done < data.txt
How can I run these jobs simultaneously, say 10 at a time?
You can easily do this with sem from GNU parallel:
#!/bin/bash
while IFS= read -r line
do
    echo "scrape: $line"
    sem -j 10 php index.php "$line" >> output.csv
done < data.txt
However, it's up to you to ensure that the output makes sense when written in parallel to the same file. You may want to write to different files and join them afterwards.
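A minimal sketch of that split-and-join approach: each job writes to its own numbered file, and the files are concatenated at the end. Here `process` is a stand-in for `php index.php`, and data.txt is generated inline so the sketch runs as-is.

```shell
#!/bin/bash
# Each job writes to its own numbered file; join them once all jobs finish.
# 'process' stands in for 'php index.php' in this sketch.
printf 'foo\nbar\n' > data.txt
process() { echo "result for $1"; }

i=0
while IFS= read -r line
do
    out="output$((i++)).csv"   # pick the name in the parent shell
    process "$line" > "$out" &
done < data.txt
wait                           # let every background job finish
cat output[0-9]*.csv > output.csv
```

Note that the file name is computed before the `&`: arithmetic expansion inside a backgrounded command would increment `i` only in the subshell, so every job would get the same name.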
Change output.csv to something like output$((i++)).csv.

While that other guy's answer is correct, sem is quite slow (about 300 ms per job), and in this case it can be replaced by a single call to GNU Parallel (which takes around 300 ms in startup and 10 ms per job):
parallel -j10 "echo scrape: {}; php index.php {}" :::: data.txt > output.csv
For one file per job:
parallel -j10 "(echo scrape: {}; php index.php {}) > output{#}.csv" :::: data.txt
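If you later want those per-job files back in one CSV, a version sort joins them in job order. The file names below mimic parallel's output{#}.csv naming; the contents are placeholders for this sketch.

```shell
#!/bin/bash
# Recreate a few per-job files as the command above would leave them
# (contents are placeholders for this sketch).
echo one > output1.csv
echo two > output2.csv
echo ten > output10.csv
# A plain glob would sort output10.csv before output2.csv;
# sort -V (version sort) joins the files in numeric job order instead.
ls output*.csv | sort -V | xargs cat > combined.csv
```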
You could put & at the end of the line to run each command in the background, or you could look at a tool like GNU parallel.
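For completeness, a plain-bash version of that suggestion, throttled with `wait -n` (bash 4.3+). A `process` function stands in for `php index.php`, and data.txt is generated inline so the sketch is self-contained.

```shell
#!/bin/bash
# Background each job with '&' and cap the number of concurrent jobs.
# 'process' is a stand-in for 'php index.php'; data.txt is generated here.
printf 'a\nb\nc\nd\n' > data.txt
process() { sleep 0.1; echo "done: $1"; }

max_jobs=10
while IFS= read -r line
do
    # jobs -rp lists running background PIDs; block while at the cap
    while (( $(jobs -rp | wc -l) >= max_jobs )); do
        wait -n            # bash 4.3+: wait for any one job to exit
    done
    echo "scrape: $line"
    process "$line" >> output.csv &
done < data.txt
wait                       # let the remaining jobs finish
```

The same caveat as above applies: concurrent appends to one file can interleave, so for real output you may still prefer one file per job.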