
I run PHP from bash, looping over each line of a text file.

This is my code:

#!/bin/bash
cat data.txt | while read line
do
   echo 'scrape: '$line
   php index.php $line >> output.csv
done

How can I run these simultaneously, say 10 jobs at a time?

  • bash doesn't really have threads, but you can run processes in the background by putting & at the end of the line, or you could look at a tool like GNU parallel (a plain-bash sketch follows these comments).
  • You're really pushing the limits of bash. You would have a much easier time using a proper scripting language like Python or Ruby for this.
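To illustrate the first comment: a minimal plain-bash sketch that keeps roughly 10 php processes running using only & and wait. It assumes bash 4.3+ (for wait -n) and writes each job to its own file so parallel output does not interleave.

#!/bin/bash
i=0
while read -r line
do
   echo "scrape: $line"
   php index.php "$line" > "output$i.csv" &   # run the job in the background
   ((i++))
   # Once 10 jobs are running, wait for any one of them to finish.
   if (( $(jobs -rp | wc -l) >= 10 )); then
      wait -n
   fi
done < data.txt
wait                                 # wait for the remaining background jobs
cat output[0-9]*.csv > output.csv    # join the per-job files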

2 Answers


You can easily do this with sem from GNU parallel:

#!/bin/bash
cat data.txt | while read -r line
do
   echo "scrape: $line"
   # sem queues the job and runs at most 10 at a time
   sem -j 10 php index.php "$line" >> output.csv
done
sem --wait   # wait for the jobs that are still running

However, it's up to you to ensure that the output makes sense when written in parallel to the same file. You may want to write to different files and join them afterwards.


1 Comment

Change output.csv to something like output$((i++)).csv
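
Expanding on that comment, a minimal sketch of the write-then-join approach with sem (assuming index.php takes the line as its only argument, the lines contain no quote characters, and the final row order does not matter):

#!/bin/bash
i=0
cat data.txt | while read -r line
do
   echo "scrape: $line"
   # Quote the whole command so the redirection runs inside the job,
   # giving every job its own file instead of all appending to output.csv.
   sem -j 10 "php index.php '$line' > output$((i++)).csv"
done
sem --wait                           # wait for the queued jobs to finish
cat output[0-9]*.csv > output.csv    # join the per-job files afterwards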

While the other answer is correct, sem is quite slow (around 300 ms per job); in this case it can be replaced by a single call to GNU Parallel, which takes around 300 ms to start up and about 10 ms per job:

parallel -j10 "echo scrape: {}; php index.php {}" :::: data.txt > output.csv

For one file per job:

parallel -j10 "(echo scrape: {}; php index.php {}) > output{#}.csv" :::: data.txt

