So I have a little bash script:

for store in {4..80}
do
  for page in {1..200} 
  do
    curl 'https://website.com/shop.php?store_id='$store'&page='$page'&PHPSESSID=sessionID' -O
  done
done

The script works, but it downloads all 200 pages of every store from 4 to 80 one after another, which takes a lot of time. (Note the bash variables store and page interpolated into the curl URL.)

My goal is to run the curl requests for the store/page combinations simultaneously instead of one after another, to save time.

Is this possible?

  • Append one space and & to your curl command? Commented Nov 2, 2021 at 20:49
  • Keep in mind that "just" adding ` &` to the end of the curl command will spawn upwards of 15K+ concurrent curl calls; in reality your OS will probably choke or periodically hang (though some calls will complete before you finish spawning the rest). Even if your OS can juggle 15K+ concurrent curl calls, your network and/or disk are likely to become bottlenecks, and the thrashing will degrade overall performance. So you'll want to put a limit on the number of concurrent curl calls you have outstanding; search for "bash limit number of jobs" (a sketch follows below). Commented Nov 2, 2021 at 21:05
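
One common way to cap the number of outstanding jobs is bash's own job control. A minimal sketch, assuming bash 4.3+ for wait -n; the URL and session ID are the placeholders from the question, and the output filenames are illustrative:

max_jobs=10
for store in {4..80}; do
  for page in {1..200}; do
    # Block until a slot frees up when max_jobs curls are already running
    while (( $(jobs -rp | wc -l) >= max_jobs )); do
      wait -n
    done
    curl 'https://website.com/shop.php?store_id='$store'&page='$page'&PHPSESSID=sessionID' \
      -o "store${store}_page${page}.html" &
  done
done
wait  # wait for the last batch to finish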

2 Answers

curl can run the loops itself using URL globbing. The following limits curl to 100 simultaneous transfers (--parallel was added in curl 7.66.0):

curl 'https://website.com/shop.php?store_id=[4-80]&page=[1-200]&PHPSESSID=sessionID' -O --parallel --parallel-max 100
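
A note on filenames: with -O, each file is named after the remote part of the URL, which here typically includes the whole query string (e.g. shop.php?store_id=4&page=1&...). curl's globbing also lets you reference the glob ranges as #1 and #2 in -o, so a variant with cleaner, illustrative names would be:

curl 'https://website.com/shop.php?store_id=[4-80]&page=[1-200]&PHPSESSID=sessionID' -o 'store#1_page#2.html' --parallel --parallel-max 100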

Try changing your script as follows:

for store in {4..80}; do
  for page in {1..200}; do
    curl 'https://website.com/shop.php?store_id='$store'&page='$page'&PHPSESSID=sessionID' -O &
  done
done

# Wait for spawned cURLs...
wait
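
As the comment on the question warns, this backgrounds every request at once: 77 stores × 200 pages is 15,400 concurrent curls. A throttled alternative is to feed the URLs to xargs; a sketch assuming an xargs with -P support (GNU or BSD) and illustrative output names:

for store in {4..80}; do
  for page in {1..200}; do
    # One filename and one URL per line; xargs consumes them in pairs
    printf '%s %s\n' "store${store}_page${page}.html" \
      "https://website.com/shop.php?store_id=${store}&page=${page}&PHPSESSID=sessionID"
  done
done | xargs -n 2 -P 10 curl -s -o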
